Charlottesville, Virginia


Greg Webb
Attorney • (800) 451-1288

Self-Driving Cars—Do We Know Enough About Their Safety?

The “future” is (almost) here: robotic, or driverless, cars are coming in droves, many say. Cars that park themselves and brake automatically are already commonplace. Cars that can take us from one location to another while we read a book, take a nap, or anxiously grip our seats are nearly here in force, not just wandering around experimentally. According to some reports, fully automated cars could be on the road in the next year or so. Many luxury automakers already include automated features in their vehicles. The question is, are we ready to give up control of the wheel? How many people actually want to do this, or will trust a self-driving car? (When I write these words, I fear I am like those who once said that gas-powered cars, or computers, would never catch on. . . .)

As with every new big thing, there have been challenges. The latest issue for automakers designing robot cars happens to be snow. Snow blindness, the inability to see when driving in a major snowstorm, is an obstacle for robot cars as well. Heavy snow causes sensors to stop working, as Volvo discovered during tests in a Swedish town near the Arctic Circle. Volvo’s solution will be to embed the radar sensors behind the windshield.

The challenge presented by snow brings up all the other problems that complicate the entry of driverless cars into our everyday life. There will be issues for those owning the cars as well as for people driving the old-fashioned way and encountering robot cars on the highway.

Here’s a basic description of how an automated vehicle functions: “Driverless cars ‘see’ the world around them using data from cameras, radar and lidar, which bounces laser light off objects to assess shape and location. High-speed processors crunch the data to provide 360-degree detection of lanes, traffic, pedestrians, signs, stoplights and anything else in the vehicle’s path. That enables it to decide, in real time, where to go.”
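That sense-process-decide loop can be sketched in a few lines of code. This is a purely illustrative toy, not how any real autonomous system works: every function name, data format, and threshold below is hypothetical, and real vehicles fuse raw sensor data with vastly more sophistication.

```python
# Toy sketch of the sense -> fuse -> decide loop described above.
# All names, formats, and thresholds are hypothetical and simplified.

def fuse_detections(camera, radar, lidar):
    """Merge the object lists from each sensor into one set of detections."""
    return set(camera) | set(radar) | set(lidar)

def decide(objects, safe_distance_m=30.0):
    """Crude planner: brake if any detected object is closer than the safe distance."""
    nearest = min((dist for _, dist in objects), default=float("inf"))
    return "brake" if nearest < safe_distance_m else "continue"

# Mock per-sensor detections as (label, distance-in-meters) pairs.
camera = [("pedestrian", 12.0)]
radar = [("car", 45.0)]
lidar = [("pedestrian", 12.0), ("car", 45.0)]

objects = fuse_detections(camera, radar, lidar)
print(decide(objects))  # a pedestrian 12 m ahead -> "brake"
```

Even this toy shows where the hard questions live: the `decide` step is where a programmer, long before any crash, encodes what the car will choose to do.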

Let’s look at a few of the known issues that must be addressed as driverless technology moves forward.

  • In Google’s testing, driverless cars had an accident rate twice as high as human-driven models. Researchers place the blame mostly on the humans driving the other cars. Regardless of who or what is at fault, that’s an unacceptable accident rate.

  • Think about all the ethical issues that will arise from driverless vehicles. How is the robot ‘driver’ going to decide what to do when something is about to happen? Will it choose to hit the pedestrian rather than run off the road? Who is responsible when things go wrong? Who, or what, is liable in an accident? (Likely answer: whoever sent the vehicle out, or whoever programmed it, selected the route, or designed or manufactured the software or technology that malfunctioned, assuming a malfunction or design flaw.) NOTE: if people think this will be the end of accidents or lawsuits, think again. . . .

If we look at the problems caused by drones, we see what happens when technology outpaces regulation. We become reactive rather than proactive, rushing to fix problems rather than setting standards for how new technology will fit into existing conditions. We don’t yet have laws covering even the most basic aspects of using robotic technology in vehicles. Will we be prepared when driverless cars hit the roads? One of the first anticipated uses of robot cars is for taxi service, or ridesharing, in big cities. What a potential nightmare.

The current snow-blindness challenge shifts the focus to the liabilities and limitations of driverless technology. This is the ideal time for a deeper conversation about everything that needs to be addressed now: the areas of concern, regulation, driver safety, and more.

Ryan Eustice, an associate professor of engineering at the University of Michigan, is working with Ford on snow testing for its future driverless vehicles. He describes the difficulty of training a robotic vehicle to find lane lines in a snowstorm: “For us to barrel down the road in our lane and ignore the ruts would be unnatural to the other drivers,” Eustice said. So Ford has to figure out how to read the ruts and navigate just like a person, which is “really hard.”

It seems to me that the minute we start talking about training a robot to think like a person, we’ve got problems. So many factors could go wrong. People misjudge all the time, and there’s not sufficient evidence, in my mind, that a combination of radar, lidar, and bits of metal and wire can adequately judge where other vehicles have made ruts in the snow, along with all the other factors inherent in driving in a snowstorm, in a big city, or in any number of other hazardous road conditions.

The Wall Street Journal reports that by 2035, “sales of fully autonomous vehicles will reach $39 billion, or 10% of the total market for new light vehicle sales.” So we’re not looking at a total invasion by robotic cars, but it only takes one car to create a bit of chaos on the highway.