There are 50 owner-reported driver assist & ADAS complaints for the 2021 Tesla Model 3 in NHTSA's database. These are unverified consumer reports and may not reflect confirmed defects.
When using the cruise control or self-driving function, the car will, unprovoked, brake suddenly and severely. Pressing the accelerator pedal disengages the cruise control or self-driving and stops the deceleration. The error happens about 1-2 times per 20 miles. I do not use these features often due to several near-miss crashes caused by this deceleration error. Sometimes the car will report other phantom alerts (like curvature assist on a straight road).
The rear camera malfunctions intermittently and triggers other errors, like "Automatic emergency braking unavailable" or "Forward collision warning unavailable." When the camera error happens, it says the camera is unavailable. I took my car to the Tesla dealership three times. The first time, they did a continuity check, said the harness was degraded, and replaced the harness and the rear camera. A week later, it happened again. The second time, they reset the camera calibration and it worked. The day after, it happened again. The car is at the Tesla dealership for the third time (02/20/2026).
With no known or evident cause, the car stopped lane awareness, cruise control, etc. All of those just stopped working, and per Tesla that's expected, which is surprising since they are all part of the safety systems. The car's visual cues stopped identifying lanes, and as a byproduct of that, none of the dependent features work.
Component/System involved: Advanced Driver Assistance Systems — FSD (Supervised)/Autosteer (Lane Keeping Assistance) and Adaptive Cruise Control. Possible failure to warn/stop (FCW/AEB). Vehicle and data are available for inspection upon request. I preserved dashcam files and requested Tesla to preserve engineering logs and EDR. What happened & safety risk: On [XXX] at ~[XXX] PDT on [XXX] near East Palo Alto, CA, with FSD (Supervised) engaged, the system appeared to misinterpret an exit ramp/gore area as a continuing lane at a highway fork and maintained ~60 mph (posted ~70). As soon as I saw the trajectory was unsafe, I braked and began manual steering takeover, but the vehicle contacted a roadside sign near the gore/shoulder before I could complete the maneuver. I then stopped safely. No other vehicles were struck. Airbags did not deploy. This posed a serious collision risk to me, my passenger, and nearby traffic. Reproduction/confirmation: NOT REPRODUCED. I have not attempted to reproduce the event. UNKNOWN whether the issue has been reproduced by Tesla or a service center yet. Inspection to date: Police responded (report pending). My insurer has opened a claim. I opened a Tesla Service request asking to PRESERVE Autopilot/FSD engineering logs for the incident window and to coordinate EDR extraction; engineering review pending. Vehicle remains drivable. Warnings or symptoms before failure: No audible/visual forward-collision warning was perceived by me and I did not observe automatic emergency braking. No prior warning lamps/messages were noticed before the departure toward the gore. UNKNOWN whether any internal/partial interventions were recorded in logs. Evidence available: Four-camera Tesla dashcam footage saved and backed up (see [XXX] ); precise time window ~[XXX] PDT on [XXX]. I can provide files and cooperate with data retrieval/inspection upon request. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
The 2017-2020 Tesla Model 3s have a recall for the rear camera wiring harness. The recall covers cars built through September 2020. My car was built in December 2020, and my 2021 Tesla Model 3 was just repaired for the exact same issue: the coaxial cable wore out and my rear camera didn't work. Because my car wasn't included in the recall, I had to pay for the repair. This recall needs to be expanded to other model years of the Tesla Model 3. The problem isn't isolated to vehicles built before 2020.
While using Traffic-Aware Cruise Control or Autosteer, the car occasionally detects a nonexistent person in the center of the lane I’m driving in, then proceeds to slam on the brakes when no hazard exists. It seems that tire markings on the road and certain lighting conditions cause the car to think there is a person in the road. Additionally, when driving on a highway with rolling hills, the car issues a forward collision alert, hits the brakes, and disengages Autosteer. Both of these could cause a rear-end collision at highway speeds. The dealer has claimed the system is functioning normally, and although I already tried to get it serviced, the car continues to have these issues. I have video evidence of the car slamming on the brakes and reporting a forward collision.
While backing up, the rearview camera feed occasionally displays a noticeable delay, which prevents objects and pedestrians from appearing in real time. This lag creates a significant safety risk, as the visual information on the screen does not accurately reflect the vehicle's current surroundings. Additionally, the display on the left side (vehicle proximity visualization) does not match the camera view or actual situation, further contributing to confusion and potential danger when reversing. This issue persists despite software updates and appears to be a systemic flaw that compromises driver awareness and vehicle safety. Cross-traffic alert appears impacted.
I was driving east on [XXX] in Bourne, MA, using the Tesla adaptive cruise control system called "Autopilot" in very light traffic conditions. The sun was low in the sky (behind and to the right). The system automatically engaged the brakes while crossing the bridge without obvious reason, decreasing the car's speed about 10 MPH in less than a second before I overrode it by pressing the accelerator. I believe the car's vision system mistook a shadow cast onto the road by the vertical girders on the bridge for an obstacle. The sudden deceleration was a hazard to those behind me who had to brake, but there was no collision or loss of control. The manufacturer seems to acknowledge and accept that the defect exists, but has not been able to resolve the problem. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
I started receiving messages that the cameras were occluded, that there was an error, and that ALL of my safety features were disabled. No forward collision warnings, no emergency braking available, nothing available, and the display that showed all of the surrounding traffic completely disappeared from the screen; the car has literally gone blind. All of this started with the camera issues in May, but I thought it was a software issue that would be addressed in the next update, although the critical disengagements were supposed to be sent to Tesla. I took the car to the service center on 9/4/24 at 9am. They stated that they ran a system test, saw that the triple camera was showing a fault, replaced the camera, and gave the car back to me the same day. Everything worked for the rest of that day, for a service that took at least 3 hours to figure out. Today, after I got off work at 1pm, I initiated FSD, and 3 miles from my residence (I live less than 8 miles from my job) the system disengaged again, asking me to take over, and all of my critical systems were yet again disabled. My safety is compromised yet again. I immediately brought the car back, and they told me it would take an additional 30-45 minutes to "diagnose" the issue, although it's the same thing that was supposedly fixed yesterday. After that wait, the service consultant came back and told me that the car is throwing all kinds of codes but that "It's still safe to drive." When I said that it cannot be safe to drive if all of the critical systems are disabled, he suggested that I take the car so they can do a remote log pull and look into the issue further, or leave it, as the needed part has to be ordered anyway. I don't know how they can order a part if they're claiming not to know what's wrong with it. Something else is going on that they don't want to take accountability for!
While using Full Self-Driving (FSD) mode, the vehicle will randomly and without warning slam on the brakes at highway speeds and decelerate to 30-50 mph on an active highway. If the highway is multilane, the car will slow and attempt to change lanes. It appears the vehicle thinks something is in the road, even when the road is clear. This happens multiple times on my longer road trips and is very concerning. It did this yesterday with a semi-truck closely behind me. The vehicle has gone into service and I did bring up the issue, but the service team was unable to assist (screenshots attached). The software currently installed in my vehicle is 2024.3.15, the latest available from Tesla. This issue has persisted through every software update the car has received, however.
When using cruise control, auto lane assist, or Autopilot, the vehicle will brake or slow down without apparent reason. This can happen suddenly and dramatically. It can happen when there are other cars nearby, but also when there is no other traffic on the road. It is distressing enough when there are no other cars around, but it becomes a hazard when there is traffic behind me that has no warning this is happening.
Autonomous Driving Problem [XXX] Driving from Cornelia, GA to Commerce, GA on [XXX] at 60 mph in the right lane with very light traffic. Six different times, the FSD (Full Self-Driving) (Supervised) in the 2021 Tesla Model 3 started signaling left and changed lanes with no traffic or vehicles in front. The fifth time FSD (Supervised) changed lanes to the left, it nearly caused an accident: a car was approaching from the rear at 70-80 mph, and the vehicle changed lanes in front of the speeding car. The cabin driver-detection sensor also continuously gives false-positive warning messages telling me to "Pay attention to the road" when I am wearing my glasses, so I have to drive without my glasses. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
On or around March 31, 2024 Tesla delivered a software update to my vehicle that changed the way adaptive cruise control is engaged, as well as the behavior of the vehicle when the setting is activated. In the few days since the change, I twice inadvertently engaged the adaptive cruise control, once while attempting to park on a congested residential street where it was very dangerous to have the vehicle accelerate without my taking action. I panicked and was able to shut it off quickly, but I didn’t know why it had been enabled. Today I was driving and had intentionally engaged the adaptive cruise control when the car attempted to signal and change lanes. This was wholly unexpected, and I was not prepared to monitor the vehicle taking other actions besides maintaining its lane and speed in traffic. I reviewed the settings for the feature and discovered the adaptive cruise control with lane guidance, termed “Autosteer” in the Tesla vehicle, had been replaced by “Self-Driving (Supervised).” The activation of the feature was changed from a double pull on the shift lever to a single pull. This explained my twice inadvertently activating the feature, and the car propelling itself forward unexpectedly, on those two other occasions. It is extremely dangerous for the car manufacturer to suddenly and without warning change the way the vehicle is operated. I believe I was fortunate to avoid an accident on two occasions while the vehicle unexpectedly took control over driving. Had I not reacted within one second to disable the feature, I might not have avoided a collision.
Hi, Tesla installed a new software update yesterday. Today, when I began driving my car and turned on the adaptive cruise control, I found that the car was completely in Autopilot, and it was extremely difficult to take control of the car again, as I was initially unable to steer and it took a few seconds for it to discontinue the Autopilot. This is very concerning because the car was set in the preferences (see attached photo) to Autopilot ("Full Self-Driving (Supervised)" in the preferences), even though I have never used it and had it set to adaptive cruise control ("Traffic-Aware Cruise Control" in the preferences). I am concerned that Tesla switched the preference to Autopilot just after offering a free 30-day trial, which I did not accept. I understand that full self-driving has never been approved, and yet Tesla is flouting the law and forcing drivers to use it without notifying us or requesting our permission. Tesla needs to be immediately held accountable for this reckless and dangerous behavior, which put my life and likely many other drivers' lives at risk. Thank you, [XXX] [XXX] [XXX] INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
After a recent update to the "Full Self-Driving" software stack (new version 2023.44.30.25; FSD Beta v12.3), we drove from Charleston to Atlanta and back, with much of the trip on Interstate Highways [XXX] and [XXX]. A very troubling new phenomenon was noted: every time we passed a "Minimum Speed" sign, the car would drop the Speed Limit setting on the control panel to 40 MPH. In many iterations of this problem, the speed of the car dropped precipitously, creating a potentially dangerous situation. I had the software set to Auto speed limit offset, which in some circumstances maintained the speed around 70 MPH because it was keying on surrounding traffic, but in some instances the speed did not reset until we passed another "Speed Limit 70 MPH" sign. Often, I had to react by manually raising the Max Speed setting to the speed limit. This happened repeatedly (essentially every time we passed a Minimum Speed sign, unless the sign was obstructed from camera view by a semi). Please address this with Tesla and ask for an immediate bug fix and recall (software update). It is potentially very dangerous. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
Full Self-Driving and cruise control brake randomly and violently with no warning at highway speeds. This happened no fewer than 20 times on one trip on 8 March 2024. This has put me and other vehicles in extreme danger. The manufacturer has known the model has this issue, and nothing has been done. The braking occurs with no warning other than the vehicle slamming on the brakes. This needs to be fixed ASAP; it's a hazard to me as the driver and to the general public.
While traveling west on [XXX] in Dane county just before exit [XXX], I experienced unexpected braking while using TACC. My cruise speed was set at 72 MPH and I observed the set speed changed itself to (I think) 40 MPH. This caused the car to quickly begin braking to slow my speed to this new speed. I had to intervene and accelerate manually while also trying to reset the set speed. Thankfully no one was directly behind me, as this was incredibly unsafe. This has occurred at this location more than once. I'm not sure if the GPS perhaps thought I was on the exit ramp instead? TACC and autopilot seem to work normally elsewhere. It has not otherwise been inspected. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
The latest software update (version 2023.44.30.5.1 08a3a7856854) is so stringent about keeping eyes on the road that looking in the mirror causes it to beep. Also, having to touch the steering wheel constantly is dangerous, because if you move it with slightly too much force, it disengages Autopilot. I only use the Autopilot feature on the freeway, and I had never had an issue until this update, which I believe was made to make the system safer. In my view it is now much less safe, since I'm not able to check my mirrors, and ultimately I think it makes the Autopilot feature useless (I will not be using it). This is also less safe, in my view, because before, I would watch the road as the car steered, ready to take over, so it was like having redundant safety (human + machine). Now I'm back to just human safety.
After the latest update the Tesla autopilot feature is very unsafe to use. The screen constantly pings and nags the driver to pay attention drawing my attention away from the road. The system has become so sensitive that adjusting the AC down a degree causes the warning to pop up which distracts the driver and in my opinion is significantly less safe than the older version. The warnings are too sensitive and the noise/warnings flashing are significantly more distracting
When the FSD computer module (AP3 module) starts to fail, it causes all active safety systems to stop functioning, including but not limited to Lane Assistance, Automatic Emergency Braking, Adaptive Cruise Control, Lane Departure Warning, Lane Keeping Assistance, Blind Spot Warning, pre-collision systems, etc. This is because the ECU that processes the video feeds from the cameras needed for these functions begins to fail, which is not surprising. What is surprising is that there are absolutely zero customer- or service-facing alerts for this issue. Customers simply are unable to manually activate some features, the vehicle display has difficulty recognizing objects and lane markings, and the service menu (meant for vehicle service/repair) shows part of the ECU as critical in diagnostics, but does not provide an alert. I can only guess that previously, only half of the module was used at a time, with the second half reserved for redundancy in case of a failure on one part of the unit, still allowing full functionality. However, in newer versions of the vehicle's software, both halves of the module are used simultaneously to increase processing power, thus eliminating the redundancy. It is possible that, because of the past redundancy, a failure of one half of the unit was not considered a cause for alarm (though I would have hoped a failing component would have caused an alert, unless the company's hope was that one half would work long enough on its own until it no longer needed to be covered under warranty). However, given that both processors are now actively used, there should definitely be alerts and warnings. In addition to losing the convenience of several of these systems, there is also a negative safety aspect: systems such as accident avoidance, pre-collision seatbelt tensioning, and others may not work in a pending collision (not that I have seen these systems function properly in past collisions when the module was fully working).
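The redundancy tradeoff this complainant describes can be sketched abstractly. This is a hypothetical illustration of active/standby versus dual-active compute designs in general, not Tesla's actual implementation; all class and method names here are invented for the example.

```python
# Hypothetical sketch: why removing a hot spare changes what a single
# hardware failure costs, and why it should trigger a driver-facing alert.

class Node:
    """One half of a two-part compute module."""
    def __init__(self, name: str):
        self.name = name
        self.healthy = True

class ActiveStandbyCluster:
    """Presumed old scheme: one half processes, the other is a hot spare."""
    def __init__(self, active: Node, standby: Node):
        self.active, self.standby = active, standby

    def process(self) -> str:
        if self.active.healthy:
            return "full capability"
        if self.standby.healthy:
            # Failover preserves function, but redundancy is now gone,
            # so a service alert is still warranted.
            print(f"ALERT: {self.active.name} failed; running on spare")
            return "full capability (degraded redundancy)"
        return "safety features disabled"

class DualActiveCluster:
    """Presumed new scheme: both halves share the workload; no spare left."""
    def __init__(self, a: Node, b: Node):
        self.a, self.b = a, b

    def process(self) -> str:
        if self.a.healthy and self.b.healthy:
            return "full capability"
        # Any single failure now costs capability immediately.
        print("ALERT: compute degraded; active safety features impaired")
        return "safety features disabled"
```

In the active/standby design, a single-half failure is invisible to the driver unless the software surfaces the failover; in the dual-active design, the same failure directly disables features, which is consistent with the silent loss of safety systems described above.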
Showing 1–20 of 50 complaints
Complaints are unverified consumer reports submitted to NHTSA. A high complaint count may reflect vehicle popularity, not defect severity. Data sourced from NHTSA public records.
Data synced from NHTSA on May 4, 2026