Tesla makes us wonder if cars should drive like humans or robots

Many drivers misunderstand the limitations of the technology already on the road today. The public is confused about the meaning of “autonomous driving”, for example, as driver assistance systems become more common and more sophisticated. In a survey conducted last year by analyst firm JD Power, only 37% of respondents chose the correct definition of self-driving cars.

Neither Tesla nor any other company sells an autonomous or self-driving vehicle, meaning a vehicle capable of driving itself in a wide range of places and circumstances without a human ready to take over.

Nevertheless, Tesla markets its driver assistance systems in the United States with names that regulators and safety experts say are misleading: Autopilot for the standard package and Full Self-Driving for the premium package.

At the same time, Tesla warns drivers in its owner’s manuals that it is their responsibility to use the features safely and that they must be prepared to take over the driving task at any moment, with their eyes on the road and hands on the wheel.

The difficulty of navigating an unpredictable environment is one of the reasons why truly autonomous cars do not yet exist.

“An autonomous vehicle should be better and more agile than the driver it replaces, not worse,” said William S. Lerner, a transportation safety expert and delegate to the International Organization for Standardization, a group that sets global industry standards.

“I wish we were already there, but we’re not, except on straight highways with typical entry and exit ramps that have been mapped,” he said.

“Caught in the Cookie Jar”

Tesla’s rolling-stop feature existed for months before it drew much notice. Chris, who chronicles the good and bad sides of Tesla’s latest features on YouTube under the name DirtyTesla, said his Tesla had been doing automatic rolling stops for more than a year before Tesla disabled the feature. He agreed to be interviewed on the condition that only his first name be used, citing privacy concerns.

Scrutiny picked up this year. Regulators at the National Highway Traffic Safety Administration questioned Tesla about the feature, and in January the automaker rolled out an “over-the-air” software update to disable it. NHTSA classified the software update as an official safety recall.

Critics were surprised not only by the choice to design software this way, but also by Tesla’s decision to test the features using its customers rather than professional test drivers.

Safety advocates said they were not aware of any U.S. jurisdiction where rolling stops are legal, and they could not identify any safety rationale for allowing them.

“They are very transparently violating the letter of the law, and that completely undermines the trust they are trying to gain from the public,” said William Widen, a law professor at the University of Miami who has written about autonomous vehicle regulation.

“I’ll be upfront about it,” Widen said, “instead of getting caught with a hand in the cookie jar.”

Safety advocates also questioned two entertainment features unrelated to self-driving that they said circumvented safety laws. One, called Passenger Play, allowed drivers to play video games while moving. Another, called Boombox, allowed drivers to pump music or other sounds out of their cars as they drove, a possible hazard to pedestrians, including the blind.

Tesla recently pushed software updates to restrict both of these features, and NHTSA has opened an investigation into Passenger Play.

Tesla, the top-selling electric vehicle maker, has not called these features mistakes or acknowledged that they may have created safety risks. Instead, Musk denied that the rolling stops could be unsafe and called federal auto safety officials “the fun police” for objecting to Boombox.

Separately, NHTSA is investigating Tesla for possible safety defects in Autopilot, its standard driver assistance system, after a series of crashes in which Tesla vehicles, with the system engaged, crashed into stationary emergency vehicles. Tesla has faced lawsuits and accusations that Autopilot is unsafe because it cannot always detect other vehicles or obstacles in the road. Tesla has generally denied the claims made in lawsuits, including in a Florida case in which it said in court documents that the driver was responsible for a pedestrian death.

NHTSA denied a request for an interview.

It’s unclear what state or local regulators will do to adjust to the reality that Tesla is trying to create.

“All vehicles operated on public roads in California must comply with the California Vehicle Code and local traffic laws,” the California Department of Motor Vehicles said in a statement.

The agency added that automated vehicle technology should be deployed in ways that “encourage innovation” and “ensure public safety” – two goals that can conflict if innovation means deliberately breaking the rules of the road. Officials declined a request for an interview.

Musk, like most proponents of self-driving technology, has focused on the number of deaths caused by human-driven vehicles today. He has said his priority is to bring about a self-driving future as quickly as possible, with the theoretical goal of reducing the 1.35 million traffic deaths that occur worldwide each year. However, there is no way to measure the safety of a truly autonomous vehicle, and even comparing Teslas to other vehicles is difficult because of factors such as vehicle age.

Industry Commitments

At least one other company has faced an allegation that it deliberately violated traffic laws, but with a different outcome than Tesla.

Last year, San Francisco city officials expressed concern that Cruise, which is majority-owned by General Motors, had programmed its vehicles to stop in traffic lanes in violation of the California vehicle code. Cruise’s driverless development vehicles are used in a robot taxi service that picks up and drops off passengers with no driver behind the wheel.

Cruise responded with something Tesla hasn’t yet offered: a pledge to abide by the law.

“Our vehicles are programmed to follow all traffic laws and regulations,” Cruise spokesman Aaron Mclear said in a statement.

Another company pursuing self-driving technology, Waymo, has programmed its cars to break traffic laws only when the laws conflict with one another, such as crossing a double yellow line to give a cyclist more space, Waymo spokeswoman Julianne McGoldrick said.

“We prioritize safety and compliance with the rules of the road over how familiar a behavior might be to other drivers. For example, we do not program the vehicle to exceed the speed limit because that is familiar to other drivers,” she said in a statement.

A third company, Mercedes, said it was prepared to be held liable for crashes that occur in situations where it has promised that its driver assistance system, Drive Pilot, will be safe and comply with traffic laws.

Mercedes did not respond to a request for information on its approach to automated vehicles and whether they should ever circumvent traffic laws.

Safety experts aren’t ready to give Tesla or anyone else a pass for breaking the law.

“At a time when pedestrian deaths are at their highest level in 40 years, we shouldn’t be relaxing the rules,” said Leah Shahum, director of the Vision Zero Network, an organization working to eliminate traffic deaths in the United States.

“We need to think about higher goals — not having a system that’s no worse than it is today. It should be significantly better,” Shahum said.
