Can Self-Driving Cars Ever Really Be Safe?

Analysts estimate that by 2030, self-driving cars and trucks (autonomous vehicles) could account for as much as 60 percent of US auto sales. That’s great! But autonomous vehicles are basically computers on wheels, and computers crash all the time. Besides that, computers get hacked every day. So you gotta ask, “Can self-driving cars ever really be safe?”

The Short Answer

No. Self-driving cars can never really be safe. They will be safer! So much safer that it’s worth a few minutes to understand why.

Humans Are Very Dangerous

First and foremost, according to the National Highway Traffic Safety Administration (NHTSA), 90 percent of all traffic accidents can be blamed on human error. Next, according to the AAA Foundation for Traffic Safety, nearly 80 percent of drivers expressed significant anger, aggression, or road rage behind the wheel at least once in the past year. Alcohol-impaired driving accounted for 29 percent of all vehicle traffic fatalities in 2015. And finally, of the roughly 35,000 annual traffic fatalities, approximately 10 percent (3,477 lives in 2015) are caused by distracted driving.

Remove human error from driving, and you will not only save a significant number of lives, you will also dramatically reduce the number of serious injuries associated with traffic accidents – more than 4.4 million people were injured in US crashes during 2015.
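
If you like back-of-envelope math, here is that arithmetic in a few lines of Python. The one loud assumption (mine, not a study's) is that removing human error removes the crashes attributed to it – so treat the results as upper bounds, not forecasts:

```python
# Back-of-envelope arithmetic using the figures cited above (US, 2015).
# Loud assumption: every crash attributed to human error is eliminated,
# so these are upper bounds, not forecasts.

fatalities_2015 = 35_000        # approximate annual US traffic fatalities
injuries_2015 = 4_400_000       # injuries in US traffic accidents, 2015
human_error_share = 0.90        # NHTSA: share of accidents blamed on human error

max_lives_saved = fatalities_2015 * human_error_share
max_injuries_avoided = injuries_2015 * human_error_share

print(f"Upper bound on lives saved per year:      {max_lives_saved:,.0f}")
print(f"Upper bound on injuries avoided per year: {max_injuries_avoided:,.0f}")
```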

Data Begins to Make a Case

In May 2016, a 40-year-old man named Joshua Brown died behind the wheel of a Tesla cruising in Autopilot mode on a Florida divided highway. He was the first person to die in a crash involving a semi-autonomous vehicle.

Rage against the machine quickly followed, along with some valid questions about whether Tesla had pushed this nascent technology too fast and too far. Everyone expected the accident to be the fault of a software glitch or a technology failure, but it was not.

The NHTSA investigation found that “a safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted.” In other words, the car didn’t cause the crash. But there was more to the story. The NHTSA’s report concluded, “The data show that the Tesla vehicles’ crash rate dropped by almost 40 percent after Autosteer installation.” While Mr. Brown’s death was both tragic and unprecedented, the investigation highlighted a simple truth: semi-autonomous vehicles crash significantly less often than vehicles piloted by humans.
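
Where does “almost 40 percent” come from? The report compared crash rates before and after Autosteer was installed. The two rates below are my recollection of the report’s figures (airbag-deployment crashes per million miles), so treat them as illustrative inputs:

```python
# Reconstructing the "almost 40 percent" drop. The before/after rates
# below are illustrative inputs (my recollection of the NHTSA report),
# measured in airbag-deployment crashes per million miles.

rate_before = 1.3   # crashes per million miles before Autosteer
rate_after = 0.8    # crashes per million miles after Autosteer

drop = (rate_before - rate_after) / rate_before
print(f"Crash-rate reduction: {drop:.0%}")   # 38%, i.e. "almost 40 percent"
```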

What Do You Mean by “Safe”?

The same NHTSA report mentioned that automakers representing 99 percent of the US new-car market had agreed to make Automatic Emergency Braking (AEB) systems standard on new cars by 2022, with the goal of preventing 28,000 crashes and 12,000 injuries by 2025. The AEB program is limited to rear-end crashes, but there are a host of other semi-autonomous features in the works, and by the numbers, all of them will make us safer.

That said, this is very new technology, and regulators will need to define what they mean by “safe.” Must our autonomous vehicles drive flawlessly, or do they just need to be better at it than we are? The RAND Corporation think tank has estimated that a fleet of 100 cars would have to drive 275 million miles without a fatality to demonstrate that autonomous vehicles match the safety record of today’s human-driven cars. For perspective, at the time of the fatal May 2016 crash, Tesla owners had logged 130 million miles in Autopilot mode.
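
That 275-million-mile figure isn’t arbitrary. It falls out of a standard zero-failure demonstration test: drive n miles with no fatalities, and you can bound the per-mile fatality rate at 95 percent confidence. A sketch, assuming the roughly 1.09 deaths per 100 million vehicle miles of 2015-era US driving (my number, not the article’s):

```python
import math

# Zero-failure demonstration test: if a fleet drives n miles with no
# fatalities, then (1 - p)**n <= 0.05 bounds the per-mile fatality
# rate p at 95% confidence, so n is approximately -ln(0.05) / p.
# Assumption (mine, not the article's): US human drivers average
# roughly 1.09 deaths per 100 million vehicle miles (2015-era figure).

human_fatality_rate = 1.09 / 100e6   # deaths per mile
confidence = 0.95

miles_needed = -math.log(1 - confidence) / human_fatality_rate
print(f"Failure-free miles needed: {miles_needed / 1e6:.0f} million")  # ~275
```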

The Transition to Fully Autonomous Vehicles

In April 2016, Ford, Google, Lyft, Uber, and Volvo Cars established the Self-Driving Coalition for Safer Streets to “work with lawmakers, regulators, and the public to realize the safety and societal benefits of self-driving vehicles.” They have their work cut out for them.

Self-Driving Cars Need to Be Trained

In January 2017, Elon Musk tweeted that a software update featuring Shadow mode was being pushed to all Teslas with HW2 Autopilot hardware. This enabled the car’s autonomous driving AI to “shadow” its human driver, comparing the decisions it would have made to the decisions the human driver actually made. Think of it as self-driving AI in training. The auto industry and several tech giants are working as fast as they can to make autonomous vehicles mainstream. To speed the process, they may need to share some data. Will they? My guess is, absolutely.
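
For the programmers in the audience, here is a minimal sketch of the shadow-mode pattern: plan in parallel, never actuate, log only the disagreements. Every name in it is hypothetical (Tesla’s actual implementation is not public):

```python
from dataclasses import dataclass

# Hypothetical sketch of the shadow-mode pattern. SensorFrame,
# plan_autonomous, and shadow_step are all invented names; the point
# is the shape: the AI plans in parallel, never touches the actuators,
# and only disagreements with the human are logged as training signal.

@dataclass
class SensorFrame:
    speed_mph: float
    human_steering_deg: float   # what the human driver actually commanded

def plan_autonomous(frame: SensorFrame) -> float:
    """Placeholder for the AI's proposed steering angle."""
    return 0.0                  # e.g., hold lane center

def shadow_step(frame: SensorFrame, tolerance_deg: float = 2.0) -> None:
    proposed = plan_autonomous(frame)
    if abs(proposed - frame.human_steering_deg) > tolerance_deg:
        # Disagreements are the interesting data: log them for offline
        # analysis and retraining. Nothing is ever sent to the actuators.
        print(f"disagreement: human={frame.human_steering_deg:.1f} deg, "
              f"ai={proposed:.1f} deg")

shadow_step(SensorFrame(speed_mph=55.0, human_steering_deg=3.5))
```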

Hacks and Crashes

In September 2016, Chinese researchers discovered “security vulnerabilities” in the Tesla Model S and remotely hacked into the car – the first time anyone had remotely hacked a Tesla. We have a thesis here at The Palmer Group: “Anything that can be hacked, will be hacked.” Is this going to be an issue? Yes, but it’s also going to be an arms race. I’m betting on the good guys, but to be fair, hacking across every digital touchpoint is a never-ending battle. We will do our best to combat the bad guys.

As for computer crashes, yes, it is possible for the computer that runs your self-driving car to crash, but it will happen so infrequently that, by the numbers, you will be significantly safer in an autonomous vehicle than if you were driving yourself.
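
One reason the computer-crash risk stays small: safety-critical systems double up on hardware. A toy illustration with made-up failure rates (and an optimistic independence assumption):

```python
# Toy illustration of why redundancy keeps computer failures rare.
# The failure rate below is a made-up round number, and assuming the
# two computers fail independently is optimistic, but the squaring is
# the reason safety-critical systems duplicate hardware.

p_single = 1e-5          # hypothetical single-computer failures per hour

# With two independent computers and instant failover, both must fail
# in the same hour for the system to go down.
p_dual = p_single ** 2

print(f"Single computer: {p_single:.0e} failures/hour")
print(f"Dual redundant:  {p_dual:.0e} failures/hour")
```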

Fear and Assessment of Risk

Some people are afraid to fly. When you point out that flying is the safest form of travel by several orders of magnitude, the response is always some version of, “But when a plane crashes, everyone dies.” Human beings are not very good at assessing risk. If you don’t have a gas pedal, a brake pedal, or a steering wheel, and your car crashes, you will feel helpless and out of control. And you may die. But, by the numbers, tens of thousands of people will not die or be injured, because semi-autonomous driving and ultimately fully autonomous driving will be much safer than pure human driving. Some will counter that this is cold comfort if you’re the one who is killed or injured, no matter how rare it is. I agree. But if you were making a policy decision for society at large, you would have to agree that saving tens of thousands of lives and preventing millions of injuries is a worthy endeavor.

(BTW: Please do not bring up the absurd “Why Self-Driving Cars Must Be Programmed to Kill” scenario, where “One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?” If you had situational awareness and time to consider all of the outcomes posited by this nonsense hypothetical, you’d have time to step on the brake. If you didn’t have time to consider all of the potential actions and outcomes, the AEB would have engaged to prevent the car from hitting what was in front of it – the people you would have killed while you were thinking about what to do.)
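
If you want the physics, here is a rough stopping-distance comparison (illustrative speeds and braking numbers, not from any study) showing why the machine’s reaction time settles the question before your moral deliberation begins:

```python
# Rough stopping-distance comparison (illustrative numbers, not from
# any study): total distance = reaction distance + braking distance.

v = 35 * 0.447           # 35 mph converted to m/s
decel = 0.8 * 9.81       # hard braking at ~0.8 g, in m/s^2

def stop_distance(reaction_s: float) -> float:
    # distance covered while reacting, plus v**2 / (2 * decel) to brake to zero
    return v * reaction_s + v**2 / (2 * decel)

print(f"AEB   (~0.1 s reaction): {stop_distance(0.1):.0f} m")   # ~17 m
print(f"Human (~1.5 s reaction): {stop_distance(1.5):.0f} m")   # ~39 m
```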

A Prediction

I’m pretty sure that before 2030, if you are under the age of 25 or over the age of 70, you will need a special permit to manually drive a car. I’m also pretty sure that you will not be allowed to manually drive on certain streets and highway lanes because you will pose too great a threat to the caravans of autonomous vehicles on those roads.

With any luck, the fear-mongers and bureaucrats will get out of the way, and we will all be much safer sooner.

About Shelly Palmer

Named one of LinkedIn’s Top 10 Voices in Technology, Shelly Palmer is CEO of The Palmer Group, a strategic advisory, technology solutions, and business development practice focused at the nexus of media and marketing, with a special emphasis on machine learning and data-driven decision-making. He is Fox 5 New York's on-air tech and digital media expert, writes a weekly column for AdAge, and is a regular commentator on CNBC and CNN. Follow @shellypalmer, visit shellypalmer.com, or subscribe to the daily email at http://ow.ly/WsHcb


Catherine Brennan

Author / Speaker / Entrepreneur

I love my self-driving car (Tesla Model 3). It requires that I keep my hands on the wheel and pay attention. If we can work with the car, self-driving cars will be a very good solution. "Humans are dangerous"

Beverly Butler

“Promise Yourself Nothing Less Than An Extraordinary Life.”™️

Maybe? But who wants to give up "control"?

We still need a human for driving, because a computerized car can be damaged or attacked by hackers and viruses.

John Cox

Technical Consultant, Digital Solutions at AHEAD

Can't wait for Full Self Driving!
