Tesla Inc. fired an employee six days after he posted a YouTube video of his car hitting a pylon while he was using Full Self-Driving, or FSD, the automaker’s controversial driver-assistance system.
John Bernal, who worked on the data annotation team for Tesla’s Autopilot system, received a separation agreement from the company on February 11, just under a week after posting a video that now has more than 180,000 views. At the 3:30 mark of the video, Bernal’s Model 3 turns right too sharply and collides with a green pylon separating a road from a bike path in downtown San Jose, California.
Bernal, 26, said in a phone interview that although his manager refused to put the reason for his dismissal in writing, he was told it was due in part to misuse of FSD. Tesla said in January the beta software was running in nearly 60,000 vehicles in the United States.
The automaker, which disbanded its media relations department in 2020, did not respond to a request for comment. CNBC reported Bernal’s firing earlier Tuesday.
Tesla’s efforts to limit information sharing by FSD beta users caught the attention of the US National Highway Traffic Safety Administration four months before the company fired Bernal. The regulator expressed concern in October over reports that participants in an FSD early access program had been subject to nondisclosure agreements that discouraged them from portraying the feature in a negative light.
CEO Elon Musk joked about dropping the NDAs just before NHTSA sent its letter to Tesla, tweeting that the agreements would be “available in perforated rolls.” The agency has opened two investigations into possible defects involving Autopilot since August.
Bernal, whose AI Addict YouTube account has more than 8,300 subscribers, said another reason his manager gave for the termination was that his channel presented a conflict of interest. Along with sharing an unsigned copy of his separation agreement, he shared a photo of his Model 3’s screen showing that the FSD beta had been suspended based on his recent driving data.
Tesla initially made the FSD beta available to members of an early access program made up of employees and vocal fans of Musk and the company. Vice’s Motherboard reported in September that program members were asked to share their experiences with the software on social media “responsibly and selectively.”
“Remember, there are a lot of people out there who want Tesla to fail,” the company’s agreement reads, according to Motherboard. “Don’t let them distort your comments and media posts.”
Bernal’s AI Addict channel stood out in part because he narrated his videos with a mix of positive and negative comments about how Tesla’s FSD software handled city streets.
About 2 minutes into the just-over-9-minute video in which his Model 3 hit the pylon, he praises the system for slowing down to let another car pass and for moving from a far-right lane into a left-turn lane in time to make a traffic light. As the Model 3 completes the turn, however, it struggles to find the correct lane to turn into.
At about 2 minutes and 40 seconds, the car goes through a red light, turning right without stopping. A passenger brings up how Tesla had just disabled a setting that let FSD beta users roll slowly through intersections without coming to a complete stop when no other cars or pedestrians were present. The automaker determined a recall was necessary after meeting with NHTSA about the feature in January.
Bernal’s Model 3 hits the pylon less than a minute later.
“We hit that,” he says shortly after the crash, which scraped the Tesla’s front bumper. “This is the first time, for me, that it has actually touched an object on FSD.”
Later in the video, Bernal takes manual control of the Model 3 twice after the car tries to steer into sets of rail tracks. Just after he praises the car for patiently waiting for pedestrians to cross a street, around the 7-minute-40-second mark, the vehicle veers toward two more sets of pylons.