Some Tesla owners say the company's "Full Self-Driving" (FSD) software can struggle with a basic but dangerous scenario: railroad crossings. Multiple drivers, including Italo Frigoli in Texas, report their Teslas failed to recognize lowered crossing arms or flashing lights. In Frigoli's case, he had to slam on the brakes to avoid a collision, and video reviewed by NBC News supports his account. When NBC accompanied Frigoli for a follow-up test, the software again failed to detect an oncoming train.
NBC spoke to six Tesla drivers who described problems with FSD at railway crossings; four of them had videos, and the network found at least seven other videos online showing similar Tesla mishaps at rail crossings since mid-2023. Non-video complaints are also widespread across Tesla forums. The National Highway Traffic Safety Administration says it's aware of the issue and is in contact with Tesla, but the company hasn't responded to requests for comment.
Tesla markets FSD as a driver-assistance feature that still requires human supervision. Experts describe the software as a "black box" AI system, meaning even its developers may not know why it fails in certain scenarios. Some suspect the system simply hasn't seen enough examples of train crossings in its training data. Meanwhile, Tesla's main US robotaxi competitor, Waymo, appears to have avoided these problems by handling crossings more cautiously and by using audio sensors in addition to cameras.
Reported incidents include Teslas stopping on tracks, failing to yield to oncoming trains, and even turning onto the rails. In one Pennsylvania case, a Tesla in FSD mode drove itself onto the tracks and was struck by a train, though no one was hurt. Read the full investigation at NBC.