There are 33 owner-reported driver assist & ADAS complaints for the 2026 Tesla Model Y in NHTSA's database. These are unverified consumer reports and may not reflect confirmed defects.
I have been a user of Tesla's "Full Self Driving" technology for four years. In one of the last major updates, the company removed the driver's ability to manually set a cruise-control set point to maintain a safe speed. Instead, it introduced five "drive profiles": "Sloth," "Chill," "Standard," "Hurry," and "Mad Max." The problem is that only two of those stay anywhere near the speed limit. "Sloth" will go the speed limit, "Chill" runs about 2 mph over, "Standard" about 7 mph over, "Hurry" about 12 mph over, and "Mad Max" probably 15-17 mph over. Sloth and Chill remain in the right lane and rarely pass another vehicle, but all the others will make lane changes. And none of these profiles maintains a safe following distance: users have been complaining about the cars tailgating, sometimes as close as 3 car lengths at 70 mph.

However, sometimes conditions call for slower speeds, like work zones. I live near Denver, and I-70 has a work zone that will be in place until 2029. The car doesn't recognize the 45 mph speed limit, bouncing back and forth between 45 and 65; FSD must be disengaged completely to follow the 45 mph limit.

Which brings up the other problem: in many areas, the speed limits in the map data are wrong. In Texas, there are areas where the speed limit is 70 and the car thinks it's 55. In New Mexico, the speed limit is 70, but the car thinks it's 60. There is no speed profile that will maintain the legal limit. Tesla also has a mechanism that's supposed to let users report problems, but some of these I've been reporting for over three years. They refuse to tell us where we can update the information (Grok says "HERE," TomTom, OpenStreetMap), and they are not fixing the speeds on the maps.
Lane Departure Avoidance is always re-enabled for each drive; the car's menu states: "Lane Departure Assistance will be reset to Assist on the next drive." You have to go two menus deep and scroll to the bottom to turn it off every single drive, every single transition from Park to Drive. The feature is very intolerant: if you swerve around a pothole, a pedestrian, or a bicycle, it freaks out and corrects the steering wheel, startling the driver and creating a very unsafe situation. I have had several near misses due to this horrible feature. Tesla should design it to remember its disabled state. It is not a safe feature; it has caused many near accidents and is especially bad on country roads. The only time one would want such a feature is if the driver is asleep or heavily distracted, which no good driver should be, and Tesla has a cabin camera that could detect that. This is a safety feature that makes the vehicle significantly less safe. The default for this feature should be disabled for all Teslas. This is an ultra-high-priority item. It dramatically impacts my driving every day, and I don't always remember to turn it off until the steering wheel jerks away from me and almost causes an accident! This should not be enabled by default!
While in self-driving mode, the car veered into a driveway and hit a house.
The standard cruise control is extremely dangerous. The problem is what I hear called "ghosting": when driving on cruise, with no other vehicles or any other objects such as animals or humans present, the car will suddenly decelerate almost to a complete stop if not overridden by manual takeover. I am under the impression that this is being done intentionally by Tesla so that you will subscribe to their Full Self-Driving service. This can and WILL cause accidents in traffic.
Running Full Self-Driving (FSD) v14.2.2.5 on HW4. At Skillman St & I-635 in Dallas, TX, FSD attempted a wrong-way turn into oncoming traffic instead of taking the correct leftmost lane. In the attached image, instead of following the blue path, the Tesla took the wrong lane, highlighted in red.
While driving with the Autosteer feature at 65 mph on an expressway, I attempted to change lanes, which caused the cruise control to suddenly turn off and the car to begin braking hard with traffic behind it. Tesla's new Model Y has combined lane keeping and cruise control, so if lane keeping turns off, the car suddenly brakes hard from the regenerative braking. Lane keeping and cruise control should be two separate controls to prevent this issue.
On multi-lane roads, when I am driving next to traffic in the same direction and try to give a large truck in the next lane plenty of room, the lane keep assist activates and pushes me closer to the vehicle I am trying to avoid. When LCA first takes control, it feels like it turns toward the line it's alerting about; then it steers me toward the thing I was trying to avoid. This happens every drive and causes close calls with traffic, curbs, or large shrubs in the median. I have other, older Tesla Model Ys that allow you to keep this setting off permanently. My new Model Y resets it on every drive. The setting is several menu levels deep and distracting to disable while driving. Please have Tesla let me keep this dangerous option off permanently. Thank you.
Subject: Tesla Self-Driving / Autopilot Incorrect Maneuver at Intersection
Vehicle: Tesla (model: ***Y_)
Software: Full Self-Driving / Autopilot
Date: [XXX]
Time: [XXX]
Location: ***San Carlos, [XXX] (city, intersection or street)
Description of Incident: While the vehicle was operating with Tesla's driver-assistance system engaged, the navigation indicated the car would turn right at an intersection. As the vehicle approached the intersection and began the maneuver, it unexpectedly continued straight instead of completing the right turn. This caused the vehicle to enter the intersection in front of other vehicles that were stopped at another traffic light. I had to intervene to ensure safety. There was no clear reason for the incorrect maneuver, and the system behavior was unexpected and potentially dangerous.
Additional Information:
• Weather conditions: good weather, clear
• Traffic conditions: heavy traffic, people getting off work
• Driver intervention: Yes. The car was heading straight into other cars; if I had not made a quick maneuver and turned the steering wheel, I would have crashed into at least one or two cars.
• Dashcam footage available: No
I am submitting this report so the event can be reviewed for possible software or safety issues with the driver-assistance system. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
Yesterday at about [XXX] I was driving using FSD, making a left turn on a green arrow from [XXX] onto [XXX] . My car hesitated, then accelerated, and then abruptly braked, nearly causing me to be rear-ended by the SUV behind me, whose driver was angrily beeping at me. I had to rapidly accelerate to avoid being hit. The left turn signal was green before, during, and after my turn. I believe the sun shining brightly on the traffic light made it difficult for my Tesla to accurately recognize that the turn signal was green. I felt this was very unsafe. [XXX] [XXX] INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
There are two major safety issues that came with the vehicle at new-car delivery. The problems existed even before delivery: I found the first issue during the delivery itself and the second upon leaving the delivery center. First is the non-operational FSD (Full Self-Driving). FSD never worked from the beginning. The Tesla advisor noticed it at the delivery center but insisted it would work after 10 minutes of driving, needing calibration. However, after more than several hours and 35 miles of driving, FSD still doesn't work, and the display says FSD is non-functional. Second is the lane departure and lane approach sensor. When a neighboring car approaches from the side, there is no alert or notification. The blind spot sensor doesn't work either. This is a big safety issue, and I would have gotten into an accident had I not looked back and noticed a car approaching from behind when I tried to move into the right lane. I explained both of these major issues and left the car at the service center in Vienna, VA, expecting the repair to be done within hours as promised. However, the service technician later sent a message via the Tesla app that a part is missing from the car and needs to be ordered. It's been a full day and Tesla is still in the process of finding the issue, which means they still don't know what the issue is. I believe this is not the quality of product and testing that Tesla has registered with NHTSA. This surely represents a sub-par product and a lack of the full testing that needs to be done prior to new-car delivery. The service center employee responded via the app that the car needs to be in repair for 5 full days and may need more time if necessary. This is definitely not the quality and experience advertised by Tesla.
At approximately 5:50 PM on Feb 12, 2026, a collision occurred while the vehicle was under the control of "Full Self-Driving (Supervised)." While parked in a commercial lot, I engaged FSD to initiate a route home. A stationary semi-truck was positioned behind my vehicle. Upon engagement, the FSD system failed to detect the presence of the stationary truck. The vehicle initiated a rearward/lateral maneuver at a speed of 1 mph, directly striking the corner of the semi-truck. Safety Failure Details: The collision resulted in significant damage to the right rear quarter panel and associated sensors of my Tesla Model Y. Dashcam footage and system telemetry confirm that FSD was active (Supervised mode) during the entire duration of the maneuver. The system's occupancy network and vision suite failed to identify a large, stationary object within its immediate path of travel at a crawl speed. Request for Investigation: I am reporting a critical failure of the FSD software’s object detection and path planning capabilities. The system commanded a movement into a clear obstruction without issuing any takeover alerts or applying emergency braking, despite the low speed and high visibility of the obstacle.
On Full Self-Driving, there is no way to change the follow distance, and it very often follows way too close. This generates incidents that the car reports to Tesla Insurance, and we are penalized for something we can't always control. We also can't change our speed, but I'm more concerned about the follow distance on highways. Last incident at 7:46 PM on 1/25/2026.
On January 20, around 9:35 am, Phoenix time, I had the car in self-driving mode for a left-hand turn at the intersection of Camino Real and River Road in Tucson, Arizona. A real-time view shows that it's a tricky and dangerous left-hand turn. For the past 3 weeks, the car navigated it well, waiting until it was perfectly safe to do so. Today, however, the car moved quickly and unexpectedly into the center of River Road, narrowly escaping a head-on collision with a westbound car, and then paused, squeezed in between the westbound and eastbound lanes, when I tapped the brake and took the wheel. Everything happened so quickly. I made the left turn into the eastbound lane, but, looking back, I don't know how an accident didn't occur, as traffic was still moving rapidly in the eastbound lanes. There must have been enough distance between two cars, at just the right time, that nothing hit me.
The vehicle repeatedly [XXX] [XXX], and [XXX] displays critical safety alerts indicating failure of the parking brake and automatic vehicle hold systems. Specific error codes documented include: •DI_a246: Automatic vehicle hold disabled (Use brake pedal when stopping) •EPBL_a179: Parking brake functions degraded (Parking brake may not apply or release) •UI_a019: Parking brake functions degraded INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
The lane departure warning and lane assistance features generate numerous false warnings. The false alarms are shocking and distracting, creating alarm fatigue, and alarm fatigue is dangerous. The lane assistance feature is much too aggressive. Tesla does not allow these features to default to off; they are mandated at every start-up and have to be manually turned off every drive. Tesla told me, "they are working as intended." Other carmakers give you the choice of on or off. I have to go through multiple steps every single time I drive the car to remove the alarm fatigue and false steering interventions. Both are hazardous. I actually get an alarm that says "take the wheel" while my hands are on the wheel. I get numerous phantom alarms. Please make Tesla release a software update that allows people to choose their own driving defaults. Thanks.
While driving on a residential street at night with Full Self-Driving (Supervised) engaged, the vehicle suddenly applied Automatic Emergency Braking without any visible obstacle, vehicle, pedestrian, or hazard present. The braking was abrupt and unexpected and did not correspond to traffic, road conditions, or driver input. No collision occurred, but the sudden stop caused a whiplash effect to occupants, creating a risk of injury despite the absence of an external hazard. The driver immediately disengaged the system after the event. Tesla later reviewed vehicle data and confirmed that an Automatic Emergency Braking event occurred while FSD (Supervised) was active. Tesla identified the software version as FSD v14.2.1 and documented the incident as unexpected emergency braking, with the date, time, location, and environmental conditions recorded.
When in self-driving mode, which activates the adaptive cruise control, it is not possible to set the following distance. The following distance automatically selected by Tesla self-driving is much too close to the vehicle in front of me. Tesla has removed the ability to set the following distance. It follows at approximately 2 seconds behind the car in front, regardless of vehicle speed. At 80 mph, 2 seconds is not enough time for a driver to react. Following distance should be controllable by the driver; taking away this ability deprives drivers of driving within their own limitations.
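The arithmetic behind this complaint can be sketched as a quick calculation. Note the ~2-second gap is the reporter's own estimate, not a figure confirmed by Tesla; the speeds shown are illustrative:

```python
# Distance covered during a fixed time gap at highway speeds.
# Illustrates why a ~2-second following gap (the reporter's estimate)
# leaves little margin at 80 mph.

MPH_TO_FPS = 5280 / 3600  # 1 mph = ~1.467 ft/s

def gap_distance_ft(speed_mph: float, gap_seconds: float) -> float:
    """Feet of separation maintained by a time-based following gap."""
    return speed_mph * MPH_TO_FPS * gap_seconds

for mph in (55, 65, 80):
    d = gap_distance_ft(mph, 2.0)
    print(f"{mph} mph with a 2 s gap -> about {d:.0f} ft of separation")
```

At 80 mph a 2-second gap works out to roughly 235 ft; with typical driver perception-reaction times often cited around 1.5 seconds, most of that gap is consumed before braking even begins.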
While Full Self-Driving (Supervised) was fully engaged (blue steering wheel icon active), the vehicle approached an uncontrolled intersection, selected the wrong path, hesitated severely (camera shaking violently), provided ZERO visual or audible warnings, and completely ignored driver's emergency brake input (pedal depressed >90% approximately 1.5 seconds before impact). The vehicle continued forward by inertia and collided with a roadside curb, resulting in wheel damage. Attached dashcam video clearly shows: • FSD active throughout the event • No alerts or chimes • Violent steering oscillation/hesitation • Driver's foot slamming brake pedal with no deceleration • Impact despite timely braking This incident matches the ongoing NHTSA investigation PE25-012 regarding FSD intersection hesitation and failure to respond to driver input.
Very scary! Heading west on [XXX] , my 2026 Tesla Model Y ran 2 red lights! It stopped at the first red light, which sits back about 100 feet from [XXX] , and then just sped ahead, went through that light and the one directly on [XXX] , and made a right turn. Crazy! INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
While driving in rain at night, the vehicle’s automated driving system attempted to make a turn at an intersection near active train tracks. Instead of remaining on the roadway, the system steered the vehicle directly onto the train tracks, where the vehicle became stuck between the tracks and the paved road surface. All four tires were damaged, and the vehicle could not return to the travel lane under its own control. If a train had been approaching, this situation could have resulted in a severe or fatal collision. I had to manually reverse the vehicle a significant distance to return to the roadway. The malfunction appeared to result from the automated driving system misidentifying the roadway under rain and low-visibility conditions. This suggests a recurring risk for any vehicle using the system at this location in similar conditions. The safety issue has not yet been inspected or confirmed by the manufacturer. No warning lights or alerts were displayed prior to the incident. The affected components and system are available for inspection upon request. I also have dash-cam video of the incident documenting the event.
Showing 1–20 of 33 complaints
Complaints are unverified consumer reports submitted to NHTSA. A high complaint count may reflect vehicle popularity, not defect severity. Data sourced from NHTSA public records.
Data synced from NHTSA on May 4, 2026