In 2017, 37,133 Americans lost their lives in motor vehicle accidents, down from roughly 50,000 in 1980. According to the US Department of Transportation, 94 percent of all vehicle accidents are caused by human error rather than, say, mechanical malfunction. Now a wild card has entered the deck: the self-driving car.
Are self-driving cars safe compared to human drivers? That is certainly the expectation, but self-driving cars have not yet traveled enough total miles to allow a meaningful comparison with conventional cars. Concern about their safety will undoubtedly continue until more is known.
Many corporate heavyweights are now jumping into the self-driving car market – Google, Uber, General Motors, Ford, Tesla, and Volvo, among others, are all jockeying for position. Not all of these companies intend to get involved in vehicle manufacturing – Google, for example, intends only to develop the technology that operates self-driving cars, not to build the cars themselves.
On the demand side of the equation, the numbers are dizzying. The global autonomous vehicle market is already valued at well over $50 billion, and it is projected to grow to half a trillion dollars by 2030. Self-driving cars are still relatively rare, but they could become commonplace sooner than many people realize. Some observers expect them to be a common sight as early as 2021.
Self-driving cars rely largely on LIDAR, a “light detection and ranging” sensor. LIDAR sweeps the surroundings with millions of laser pulses to build a constantly updating 3D image of the car’s environment. The vehicle also uses GPS signals to locate itself within a city and help plan the most efficient route. To deal with other traffic, radar sensors measure the size and speed of anything moving near the car, including other vehicles and pedestrians. Meanwhile, the car’s cameras read street signs and traffic signals.
Self-driving cars also run software that can make real-time decisions, independent of human input, about how the vehicle will respond to the actions of other vehicles. This type of software can learn from experience – in fact, it must do so in order to reach its full potential. That is why live beta testing on public roads is considered necessary despite its potential dangers.
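Exactly how these decisions are made is proprietary, but the basic logic can be illustrated with a sketch. The short Python example below is a heavily simplified, hypothetical illustration, not any manufacturer’s actual software: it shows how two sensor readings of the kind described above (a distance from LIDAR and a closing speed from radar) might be reduced to a single braking decision. Every class name, function name, and threshold in it is invented for the illustration.

```python
# A simplified, purely illustrative sketch of how a self-driving system might
# turn sensor readings into a braking decision. Not any manufacturer's actual
# code; every name and threshold below is hypothetical.

from dataclasses import dataclass
from typing import List


@dataclass
class DetectedObject:
    distance_m: float         # distance to the object, e.g. from LIDAR
    closing_speed_mps: float  # how fast the gap is shrinking, e.g. from radar
    in_path: bool             # whether the object lies in the planned path


def time_to_collision(obj: DetectedObject) -> float:
    """Seconds until impact if neither the car nor the object changes speed."""
    if obj.closing_speed_mps <= 0:
        return float("inf")   # the gap is holding steady or growing
    return obj.distance_m / obj.closing_speed_mps


def decide(objects: List[DetectedObject], brake_threshold_s: float = 2.0) -> str:
    """Return a control decision based on the most urgent object in the car's path."""
    urgent = [time_to_collision(o) for o in objects if o.in_path]
    if urgent and min(urgent) < brake_threshold_s:
        return "EMERGENCY_BRAKE"
    return "CONTINUE"


# Example: a pedestrian 26 meters ahead while the car closes at 19 m/s (about 43 mph)
pedestrian = DetectedObject(distance_m=26.0, closing_speed_mps=19.0, in_path=True)
print(decide([pedestrian]))  # prints EMERGENCY_BRAKE (roughly 1.4 seconds to impact)
```

A production system weighs thousands of such inputs at once and, as noted above, refines its behavior through machine learning rather than fixed hand-written thresholds; the two-second cutoff here is only a stand-in for that far more complex decision process.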
As of January 2020, only six fatalities have been reported that appear to have been caused by self-driving cars – five in the US and one in China. No one doubts that there will be more. The only question is how many more. Could self-driving cars cut the accident rate in half, or double it?
The death of Elaine Herzberg in March 2018 garnered wide publicity, in part because she was the first pedestrian ever killed by a self-driving car. Ms. Herzberg was struck while pushing her bicycle across a four-lane highway. At the time of the accident, the vehicle was operating in self-drive mode with a human backup driver in the driver’s seat.
The accident was not necessarily the sole fault of the automated system, however. According to the National Transportation Safety Board’s investigation of the accident, both methamphetamine and marijuana were found in Ms. Herzberg’s bloodstream (which does not necessarily establish that she was intoxicated at the time of the accident). She entered the roadway in a dark area with no crosswalk, and she was wearing dark clothing.
Regardless, Herzberg’s presence on the road ahead should have triggered near-immediate braking. The car, traveling at 43 mph, identified Ms. Herzberg as an object six seconds before the accident, yet it did not flag the need to brake until about a second before impact. At 43 mph a vehicle covers roughly 63 feet per second, so that six-second warning represented nearly 380 feet of potential stopping distance. The investigation concluded that there was no braking before impact. Uber, which was administering the test drive, responded by suspending all testing of self-driving cars in Arizona.
Automated driving is considerably more complex than an on/off switch. Under the widely used SAE classification, there are six levels of automated driving:

Level 0 – No automation: the human driver performs every driving task.
Level 1 – Driver assistance: the vehicle can assist with steering or with speed, but not both at once.
Level 2 – Partial automation: the vehicle can control steering and speed together, but the driver must monitor the road at all times.
Level 3 – Conditional automation: the vehicle can drive itself under limited conditions, but the driver must be ready to take over when prompted.
Level 4 – High automation: the vehicle can handle all driving within a defined area or set of conditions, with no driver intervention needed there.
Level 5 – Full automation: the vehicle can drive itself anywhere, under any conditions, with no human driver at all.
At this point, the most obvious dangers of self-driving cars include:
If safety concerns can be effectively addressed, self-driving cars could offer many benefits, including:
When it comes to self-driving cars, the regulatory landscape is still evolving. Currently, the US federal government has issued no mandatory regulations, only voluntary guidelines, and state laws vary significantly from state to state.
Connecticut has enacted basic legislation that mandates certain procedures for testing self-driving cars. Testing may take place only in certain designated Connecticut cities with populations of at least 100,000, for example. A human driver must be physically present at all times in any self-driving car.
As stated above, at present nearly 95 percent of all accidents are caused by human error, and only a small percentage are caused by mechanical malfunction. If self-driving cars take over, however, a software malfunction will likely be classified as a mechanical malfunction rather than human error. As a consequence, when personal injury and wrongful death lawsuits are filed over vehicle accidents, product liability claims rather than negligence claims are likely to become more and more common.
Product liability cases, already fraught with technical complexity in many instances, are likely to become even more so. For example, a self-driving car might feature software and AI developed in Silicon Valley, sensors and cameras made by a second company, and a vehicle assembled by still another company. In some cases, all of these companies might have contributed to an accident.
One of the cutting-edge issues relating to self-driving cars is ethics, because ethics become challenging when an automated system can make decisions without human input. How will the system be programmed to respond when an ethical decision is required?
Suppose, for example, that an automated car with one passenger – the car’s owner – detects three jaywalking pedestrians too late to stop. The choices are to veer off the road into a ravine, which will certainly kill the passenger, or to hit all three pedestrians and kill them. Should an automated system be programmed to respond the way a particularly altruistic driver should behave, or the way the average driver probably would behave?
At Berkowitz Hanna, our lawyers have decades of combined experience, and we have won dozens of multi-million dollar verdicts and settlements. If you have suffered an injury that you believe was at least partially someone else’s fault, contact us immediately for a free case evaluation. You don’t need any money to retain us. Either we win your case or we charge you nothing, and our bill doesn’t come due until your money actually arrives.
Telephone one of our offices in Stamford, Bridgeport, Danbury, or Shelton, or contact us online to get the process started. We serve clients throughout the state of Connecticut.