There are 23 owner-reported driver-assist & ADAS complaints for the 2021 Tesla Model S in NHTSA's database. These are unverified consumer reports and may not reflect confirmed defects.
I engaged Tesla's Autopark to park my car in my usual spot. The car kept going after reaching the back of the parking space, crashing into a pole located about one foot behind the parking spot. The interval between when it should have braked and when it hit the post was so short that I had no time to apply the brakes and prevent the accident. The car lacks both radar and lidar, either of which would have detected the post; instead it relies on a single video camera. What made this parking attempt different from previous safe attempts in the same space was that the car was perfectly lined up. That meant the camera could not use parallax to let the AI notice that the post was moving against the background and must therefore be close behind rather than 40 feet away. For Tesla Autopark to function safely, it needs to "know" how far back possible obstructions are. It could do so by never backing up in perfect alignment; instead, the car should move enough from side to side to generate enough parallax for it to determine whether it is about to crash. The service manager who called to tell me that Tesla had decided the accident was my fault absolutely refused to even hear my opinion on what Tesla could do to prevent such accidents in the future. He told me he had been instructed to deliver Tesla's disclaimer and not to listen to anything I might have to say. As an engineer who holds patents in both aviation and automotive safety, I have an opinion that might have been worth Tesla's hearing. I now suggest you warn people that Autopark should not be used to complete the parking procedure unless and until Tesla addresses this problem. In the meantime, drivers should be cautioned either not to use the feature or to tap the brakes lightly to disengage it several feet before reaching the back of a parking space and then complete the maneuver manually.
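The complainant's parallax argument can be illustrated with the standard depth-from-disparity relation for a pinhole camera: distance scales as focal length times lateral baseline divided by the pixel shift (disparity) of the object between two viewpoints. This is a minimal sketch of that geometry, not Tesla's actual implementation; the function name and all numbers are hypothetical.

```python
def depth_from_parallax(focal_px: float, baseline_m: float, disparity_px: float):
    """Estimate distance to an object from its apparent pixel shift (disparity)
    between two camera positions separated laterally by `baseline_m`.
    Returns None when there is no measurable disparity."""
    if disparity_px <= 0:
        return None  # zero parallax: distance is unobservable from this cue alone
    # Pinhole-camera depth-from-disparity relation: Z = f * B / d
    return focal_px * baseline_m / disparity_px

# A small sideways offset between frames yields usable disparity:
print(depth_from_parallax(focal_px=1000, baseline_m=0.5, disparity_px=40))  # 12.5 (m)

# A perfectly straight reversal has zero lateral baseline, hence zero disparity,
# and the distance to the pole cannot be recovered -- the failure mode described:
print(depth_from_parallax(focal_px=1000, baseline_m=0.0, disparity_px=0))  # None
```

This is why the complaint's proposed fix (deliberate side-to-side motion while reversing) would help: it guarantees a nonzero baseline, and therefore nonzero disparity, for any nearby obstacle.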
Driving southeast on [XXX] between Gallup, NM and Albuquerque, NM. I was traveling at 75 MPH on an open road on a sunny day with adaptive cruise control (ACC) engaged. There was literally no traffic in front of me for as far as the eye could see, and nothing that would cause an alert or an obstruction in front of the car. The Tesla was on ACC and suddenly braked, hard. I immediately disengaged the ACC and drove manually. It was a frightening incident. If there had been traffic behind me, it would have resulted in a crash, as I would have been rear-ended. If a truck had been behind us, we would probably have sustained extreme injuries. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
The adaptive cruise control has had phantom braking issues from day 1 of ownership (and through every software release from Tesla, as I always update when a new one is available). Most of the time it is not possible to identify exactly what triggers the event. However, on July 27th, 2023, I drove from Phoenix to Sacramento via I-10, I-210, and I-5 and had 6 phantom braking events. 5 were directly correlated with passing a semi while I was in the #1 lane and the semi was in the #2 lane. I had the car in adaptive cruise control mode, but I maintained steering control. The car approached the semis normally, but as soon as it was completely alongside the truck it began braking aggressively. More than just allowing regenerative braking to slow the car, the brakes were applied. This did not happen with every passed truck, maybe 1 in 10. One of the 6 phantom events could not be correlated with any clearly identifiable cause. There is no warning when the car begins decelerating, so it is a surprise to the driver. Further, it is an extreme hazard to following vehicles, who WILL be forced to react. I have not directly reported my specific events to Tesla; they provide no reasonable way to do so online or by phone (the number they give for reporting problems hits an automated phone tree with no option for reporting issues). Clearly Tesla is well aware of its phantom braking issues. I am reporting this to you now in order to apply more pressure on Tesla to finally resolve it. In general, considering some of the other phantom braking episodes beyond this one specific trip, I have occasionally noticed the car display a yellow signal light on the driver's display, but only after phantom braking has already begun. I believe this is a case where the car believed it saw a caution sign or signal light in its cameras and was reacting to that. There were no signal lights, but the sun was either shining from that direction or reflecting off of other vehicles or buildings.
Adaptive Cruise Control. The vehicle in this mode suddenly brakes for no apparent reason, which under the right circumstances could cause a collision. The Tesla Service Center looked at it but could not reproduce the problem. This has happened many times on various dates, not just once. Sometimes it happens more than once on a trip; sometimes it does not happen at all. It occurs with no warning. Tesla has known about this problem for years but has failed to correct it.
Going north on New York State highway I-87, there is a bridge which results in Tesla Autopilot erroring EVERY TIME AND EVERY DAY, producing a "Full Self Driving Error" which causes the Tesla to brake and put on its hazard lights. The issue was reported to Tesla, but they responded that it happens to everyone under this bridge, so there is nothing they can do right now. The fault occurs on a busy highway in a 55 MPH zone and happens 100% of the time going under the bridge, every single day. The bridge is wide enough that its shadow triggers the error, and the car loses all understanding of what lane it is in. Today at the Tesla service center, the employees all reported the same issue and said there is no telling if it will be fixed. It happens only when using FSD/AP driving support; manual driving is unaffected.
There was a trash can on the edge of the road and my vehicle was in autopilot/FSD mode and for some reason it swerved over too close to the edge of the road. I intervened in time to avoid hitting the trash can head-on but it still struck the passenger side mirror and scraped across the passenger rear fender. The mirror got slammed into the passenger side door causing damage to the door and trim at the bottom of the passenger side window.
When using adaptive cruise control vehicle randomly applies brakes while driving at highway speeds, for no reason. Sometimes braking is severe, sometimes moderate, sometimes light. In the instances of severe or moderate braking, a real danger exists that following vehicle will be unable to stop and will collide. This happens relatively frequently, particularly on some highways. Sometimes multiple incidents occur on one drive.
The vehicle is typically operated in FSD Beta mode. I have had approximately a thousand driver-initiated disengagements due to unsafe or potentially unsafe operation in Beta mode. At times there have been indirect indications of unintended vehicle operation, potentially due to anomalies with system hardware and communications between modules and control systems. The refreshed S and X models feature a vehicle display behind the yoke steering wheel. When at stop signs or stop lights, or when turning, the yoke (in an upright or near-upright position) blocks safety-related displays indicating intended vehicle path, bollards, and critical safety-related driver annotations. Visibility of these safety-critical displays is further degraded when stopped for unprotected turns with the yoke moving erratically. The driver's ability to safely take over control of the vehicle is further degraded by the yoke whipping back and forth on unprotected turns while the driver and passengers are unable to see critical displays; with everyone's capability to ensure safe operation reduced, the chaos compounds. At a minimum, an over-the-air recall is required to add the safety-critical displays/annotations to the large rectangular display between the driver and front passenger, similar to the Y and 3 models. As noted above, I also have hardware-related concerns on my vehicle, which at times appear to be masked/accommodated by the firmware but are more evident on secondary systems independent of safe vehicle operation.
When in autopilot, at highway speeds, the car suddenly will brake. There is nothing that instigates this and it seems to be a safety hazard.
I have had multiple phantom forward collision warnings while driving and phantom braking while in autopilot. It happened twice today on a short 10 mile drive.
The contact owns a 2021 Tesla Model S. The contact stated while driving 70-75 MPH with the Autonomous Self-Driving feature activated, the vehicle experienced phantom braking on numerous occasions. The dealer was made aware of the failure and informed the contact that the vehicle performed as designed. The vehicle was not diagnosed nor repaired. The manufacturer was made aware of the failure. The contact related the failure to the Full Self-Driving (FSD) Beta 10.69.2.2 software update. The failure mileage was 17,102.
The contact owns a 2021 Tesla Model S. The contact stated while driving 72 MPH with the autonomous self-driving feature activated, the vehicle experienced phantom braking. The vehicle was not diagnosed nor repaired by an independent mechanic or dealer. The manufacturer was not made aware of the failure. The failure mileage was 56,292.
On March 18th and March 19th we experienced 4 episodes of unexpected emergency phantom braking while using Autopilot: once on the 18th and 3 times on the 19th. Only once was there another car in our vicinity, and that car was next to us within its lane, passing us at the time. All 4 times the car's collision warning chimed and hard emergency braking was applied for no reason. If there had been cars following us, we would have been rear-ended. To date these are the only times this has happened. Tesla says they have no current fix. There was no indication of any problems with the Autopilot/collision warning prior to this.
The contact owns a 2021 Tesla Model S. The contact stated that while driving approximately 45 MPH, the vehicle approached another vehicle that had slowed for a traffic light and did not automatically begin braking. The contact manually applied the brakes, but the vehicle crashed into the rear end of the vehicle in front. The contact stated that no warning lights illuminated nor warning chimes sounded advising of a vehicle ahead. Both air bags deployed. The crash damaged the entire front end, which crumpled back to the front tires. The police were on the scene and filed a report. The contact and the occupant of the other vehicle were not injured. The vehicle was towed to a collision center and was not repaired. The manufacturer had not been informed of the failure. The failure mileage was approximately 1,000.
While using the wildly falsely marketed $10k "Full Self Driving" add-on, across multiple software revisions and after reporting each and every bug to the company, I have experienced potentially fatal phantom braking at highway speeds and could have been killed by being rear-ended on at least 4 occasions. I have experienced 20+ erroneous forward collision errors with automatic braking or lane-assist corrections for obstacles such as hills, mailboxes, trees, or absolutely nothing at all. When driving around mildly banked left-turn corners with 45 MPH speed limits, whenever approaching a left turn lane within the median, the car will violently lurch into the turn lane, causing panic for passengers and nearly causing a collision with the concrete median. While using Autopilot, the car failed to stop, correct, or warn when a vehicle entered my lane and caused an accident with several thousand dollars in damages. The car does not properly brake for turns on highways, requesting that the driver take over for what should be an easily navigable turn. It doesn't take the apex or brake early, and it drifts in its lane, making surrounding drivers nervous. Moving the horn from its muscle-memory location to an awkwardly placed touch button has prevented timely use of the horn on multiple occasions, which could easily result in fatal accidents. Overall this car is a literal death trap, not at all even remotely ready for production use on public roads, and neither I nor any of my passengers feel safe using a feature that was overpriced, over-promised, and under-delivered. Tesla service center employees are too worried about retaliation from management to inquire about problems because of previous repercussions from corporate offices. The list of problems is shockingly egregious and should not be legally permitted on roads.
The car experienced "phantom braking" at freeway speeds (it suddenly and unexpectedly applied the brakes by itself) while AutoPilot was engaged on the freeway. There were no vehicles or objects in front of the car. It was a multi-lane freeway with multiple lanes on either side. It was extremely dangerous and could have caused a severe accident had I not taken evasive action and taken control/overridden immediately (I would otherwise have been rear-ended).
With Autopilot engaged, the vehicle will randomly apply the brakes at highway speed with no vehicle in front of it for at least 1/2 mile. The issue seems to be weather-independent and occurs on every occasion where Autopilot is engaged at highway speed. I scheduled a maintenance appointment for the issue. My last Tesla Model S (2018) did the same thing with less frequency. Severe auto braking at highway speed presents a serious danger to vehicles behind me.
Third time this has happened on my new Model S Plaid. On Autopilot, the car randomly slams on the brakes when we were driving down the interstate at 70 mph for no reason. There were no other cars around us.
I have never received an audible alert for my blind spot warning. There is no light on the outside mirrors or anywhere else inside the vehicle to catch your attention. If I am looking at the display in front of my steering wheel, I will see a car that is in my blind spot change from GRAY to RED when I activate my blinker; however, the car has never beeped or alerted me beyond passively changing the car's color to red on the display screen. If I'm changing lanes I am not looking at my screen, which is directly in front of me; I'm looking at the road and looking left or right. I have almost hit other vehicles multiple times because my car will not audibly notify me that there's a car in my blind spot, or a car speeding up behind me which I might have missed. This is a feature I paid for, and I feel it is defective in its current state. The car is available upon request for inspection. The lack of an audible BSW alert puts myself and others on the road at an increased risk of collision. I have mentioned it to the dealer, but after over two years no update has been made. I have not made a big attempt to correct this issue with the dealer since it's difficult to speak to someone at Tesla. No, the issue has not been inspected by anyone. I discussed it with a rep shortly after I purchased the vehicle, but it seems like an issue Tesla knows about if they only programmed the vehicle color to change and didn't program an audible alert. No, there have not been any alerts before this issue, as it appears to be a simple programming issue.
Autopilot cut off while driving on a curved highway exit ramp, leading to the driver's side of the car grazing a guard rail and the rims of the wheels grazing the curb. Mostly paint damage; repair estimate: $7,000.00. The car sustained minor damage, but I was told by the collision center that other similar incidents had occurred, including one in which a car had gone off the road into a ditch. I have confirmed this is a consistent software problem: Autopilot cuts off on all highway exit ramps. The problem was reported to local Tesla service, and they uploaded data from the car remotely. There were three dings as an alert, and a warning appeared on the driver's display, "Autopilot guidance complete," with "Tap accelerator to continue" beneath it in very small letters that were difficult to read. I consider this a software design flaw. The default on a curve should be to switch from Autopilot guidance to lane keeping, not shut off altogether. Also, the warning should be more visually and audibly prominent. For example, when guidance is not available on a city street, there is a distinctive alarm and obvious flashing on the dashboard display. There should be a similar warning when Autopilot is about to shut off.
Complaints are unverified consumer reports submitted to NHTSA. A high complaint count may reflect vehicle popularity, not defect severity. Data sourced from NHTSA public records.
Data synced from NHTSA on May 4, 2026