For some people, the thought of autonomous vehicles opens up a range of partying possibilities. They imagine hanging out all night with friends, drinking whatever they’d like, and then getting into their own cars, which will drive them safely home. They won’t have to worry about getting pulled over for a DUI because the vehicle won’t swerve, run stop lights or travel erratically. The artificial intelligence that’s controlling their vehicle won’t be affected by its owner’s alcohol consumption.
While that scenario could become a reality at some point, it’s not likely that it will occur any time in the immediate future. For one thing, self-driving vehicles have a long way to go before they become feasible and/or widely accepted everywhere in the U.S. Plus, under most current scenarios, at least one human occupant has to sit behind the wheel of the vehicle and that human occupant (not the computer) is ultimately responsible for its operation. To accept that responsibility, that person will have to remain sober.
Autonomous Vehicles Today
So how close are we to having roads full of vehicles operating on their own? A headline in the October 17, 2016 issue of Business Insider proclaims that “19 companies [are] racing to put self-driving cars on the road by 2021.” They include Tesla, Google, Uber, Volvo, Toyota, BMW, Ford, Nissan, GM and more—in other words, all of the big players in the automotive and/or tech worlds.
But the artificial intelligence needed to safely guide autonomous vehicles on highways, city streets and back roads will have to be fairly sophisticated, and it’s not all the way there yet. A column by Neal Boudette in the June 4, 2016, issue of the New York Times points out that there are several situations that autonomous vehicles, unlike humans, currently can’t figure out how to handle. For example, how will a self-driving car react when human drivers don’t follow the rules and behave erratically? Will autonomous vehicles that navigate by using cameras to follow pavement markings be able to function if those markings disappear (under a layer of snow, for example)? What happens if a road is closed and the vehicle needs to follow a detour? How will the self-driving car react if it’s faced with a situation where its only options are to strike a telephone pole or hit a child?
Even the technology that these vehicles will use to navigate everyday traffic situations still needs quite a bit of work. In late February, one of Uber’s self-driving vehicles breezed through six red lights without stopping. Uber initially blamed the incident on human error, but later admitted that a problem in the vehicle’s mapping system was responsible. (The Uber car also failed to stay out of bike lanes.)
Then there are the problems with the U.S. infrastructure. Writing in the Detroit Free Press, John Gallagher points out that “Cities and states have done little to grapple with the enormous demands that autonomous vehicles will place on transportation infrastructure and on civic policy. States and municipalities barely able to fill potholes today could soon be charged with creating the world’s most sophisticated roads with embedded sensors, cameras and communication devices to help autonomous vehicles talk to one another and the environment around them.”
It seems unlikely that these government agencies will be able to transform this outdated infrastructure in a decade, much less within three or four years.
In October 2016, Autoweek put it bluntly: “Experts say a fully automated vehicle that is 100 percent safe 100 percent of the time and can operate on any street in any weather condition in the U.S. is not right around the corner. It’s a decade or more down the road.”
Even after all the technology and infrastructure problems are resolved—and they probably will be at some point down the road—there are matters of public policy that governments will have to address. For example, when does the autonomous vehicle become responsible for its own performance, letting the human traveling inside off the hook for any accidents or problems?
The National Highway Traffic Safety Administration says that it can envision autonomous vehicle technology’s potential for saving lives. According to its website, more than 30,000 people die on U.S. roads each year, and the agency can tie 94 percent of those crashes to human error. Take the human out of that equation, and the technology could save as many as 28,000 lives each year.
The U.S. Department of Transportation has developed a Federal Automated Vehicle Policy to set the framework for the safe and rapid deployment of these advanced technologies. It has published a 15-point safety assessment that manufacturers of automated vehicles should follow for the safe design, development, testing and deployment of automated vehicles. But the DOT acknowledges that states will have to set the policy and develop regulations for the deployment of highly autonomous vehicles (HAVs) within their boundaries, and it has devised a model state policy that they could adopt (or adapt).
But the National Conference of State Legislatures reports that only eleven states—Alabama, California, Florida, Louisiana, Michigan, Nevada, North Dakota, Pennsylvania, Tennessee, Utah and Virginia—as well as Washington, D.C.—have passed legislation related to autonomous vehicles. Some of that legislation has been fairly restrictive.
Until recently, for example, California regulations required a human “driver” in autonomous vehicles. Autonomous vehicles had to be equipped with brakes, a steering wheel and a driver sitting behind that wheel.
Last fall, however, California Governor Jerry Brown signed a bill allowing a completely autonomous shuttle—without a human operator or a steering wheel, brakes, or accelerator—to operate for a short distance on a public road. The vehicle will be crossing a public roadway, going from one part of the private campus on which it operates to another. The two autonomous shuttles will travel no faster than 25 mph during their six-month test.
Other states have a more lenient attitude. An article in Fortune points out that many companies developing autonomous vehicles have complained that California’s laws on autonomous vehicles are still too restrictive. Google, for example, has moved its testing to Texas, where the laws are less restrictive.
Autonomous Vehicles and DUIs
So where does all of this leave the people who want to party and get a ride home in a self-driving car? Will automated vehicles really help them avoid DUI charges?
The brief answer is no, at least not in the short term, even with the support of organizations like Mothers Against Drunk Driving (MADD), which is publicly pushing for the implementation of self-driving vehicles as a way to avoid DUI-related deaths. The alcohol industry would also like to see autonomous cars become a reality, since it envisions alcohol sales increasing if people no longer have to worry about police pulling them over for a DUI.
But lawmakers are not hurrying to absolve the human driver/passenger of any responsibility for controlling the self-driving vehicle. In fact, one state legislator in New Jersey has even gone as far as introducing a bill that would require all self-driving cars to have an ignition interlock device; the vehicles wouldn’t start unless the human occupant breathed into the device to prove that he or she was sober.
Overcoming the technological problems of autonomous vehicles probably won’t be the key factor in absolving occupants of autonomous vehicles of any DUI charges. The real change will come only when those occupants no longer bear any responsibility for accidents involving the self-driving vehicles transporting them.