Where’s My Flying — I Mean Self-Driving — Car

Photo by A. L. on Unsplash

Imagine this: You go to the bar. You have a couple more than you intended to, and your car is parked nearby.

Oops!

Right now, your only options are to drive while drunk (please don’t) or to hire a cab and then hire another one to come back and retrieve your car.

But imagine a future where you can just have your car drive you home. In fact, your car might detect the alcohol on your breath and lock out any manual controls it has so you can’t drive while drunk.
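
No automaker has said exactly how such a lockout would work, but the decision logic is simple enough to sketch. Here is a minimal, purely hypothetical Python example; the sensor reading, the 0.08 threshold, and the mode names are all invented for illustration:

```python
# Hypothetical sketch of an alcohol lockout for a self-driving car.
# The sensor interface, the 0.08 threshold, and the mode names are
# invented for illustration; no real vehicle exposes this API.

LEGAL_LIMIT_BAC = 0.08  # typical US legal limit, percent blood alcohol


def choose_driving_mode(breath_sensor_bac: float) -> str:
    """Decide which controls to enable based on a breath-alcohol reading."""
    if breath_sensor_bac >= LEGAL_LIMIT_BAC:
        # Over the limit: lock out manual controls and offer only
        # the autonomous "drive me home" mode.
        return "autonomous_only"
    return "manual_and_autonomous"


print(choose_driving_mode(0.11))  # autonomous_only
print(choose_driving_mode(0.00))  # manual_and_autonomous
```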

Or imagine losing your vision, but still being able to keep the freedom of having a car. Imagine being able to sleep while your car trundles along the highway.

Self-driving cars offer all of these possibilities, but when will we see them on the road?

The Problem With Self-Driving Technology

The basic problem with self-driving technology is this: Driving is hard.

Those of us who do it every day don’t realize how hard it is. But driving is hard. It requires a lot of split-second judgment, and creating an artificial intelligence capable of that judgment takes time.

For the most part, I tend to describe current AI with a science fiction term that occasionally pops up: Artificial Stupid.

For example, I was just listening to a friend with a Dutch name complain that every time he types his name into Microsoft Word, the automated language detector…switches to Dutch.
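
You can reproduce this kind of misfire yourself. Below is a small sketch using the open-source langdetect library for Python; Word’s actual detector is proprietary, so this is only an analogy, and the exact guesses will vary because detection is probabilistic:

```python
# Why short strings fool language detectors: a bare name carries almost
# no statistical signal, so the detector's guess is unstable.
# Requires: pip install langdetect
from langdetect import DetectorFactory, detect

DetectorFactory.seed = 0  # pin the RNG so runs are repeatable

for text in [
    "Jan van den Berg",                   # a bare Dutch-looking name
    "Jan van den Berg is my colleague.",  # same name, English context
]:
    # With only a name to go on, the detector will often guess Dutch;
    # with a full English sentence, it usually recovers.
    print(repr(text), "->", detect(text))
```

Word is presumably doing something similar under the hood: keying off a handful of characters and committing too early.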

Facebook’s AI went on a rampage earlier this year and deleted every post about COVID-19 as spam. Remember that? It happened because the company was short-handed and nobody was properly babysitting the AI.

There’s been talk about using AI to read slush, and my reaction is: please, no. Although I wouldn’t mind an AI that detects when somebody sends a file in the wrong format and sends them a nice message asking them to resend.

AI is not ready to take over human responsibilities, and driving is an area where we can take few risks. In theory, self-driving cars (especially if they can talk to each other) would eventually be safer than human drivers.

And, of course, there’s the possibility of somebody hacking one. For example, cybersecurity researchers Charlie Miller and Chris Valasek have had fun hacking a Jeep Cherokee. Fortunately, they are white hats: their goal is to find vulnerabilities and report them to the manufacturer so the software can be patched.

Preventing that is going to take serious security architecture, and even then some risk will remain. Thankfully, unless you are somebody who warrants assassination, it’s unlikely a hacker would bother taking over your car at highway speed and driving it into a ditch.

However, hackers could still cause problems. The growing popularity of ransomware leads one’s mind promptly to the specter of hackers locking a vehicle’s ignition until a ransom is paid.

How Much Progress Is Being Made?

One company determined to crack self-driving cars (and even self-driving semi-trucks, which would help ease the current truck driver shortage) is Tesla.

Tesla vehicles are equipped with Autopilot software, which aims to ramp up gradually to full self-driving. Right now, the full release version of the software includes options such as traffic-aware cruise control (which matches your speed to the surrounding traffic) and the likely highly popular Autopark. Because nobody enjoys parallel parking.
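
Tesla hasn’t published the internals of traffic-aware cruise control, but the core idea is a feedback loop: measure the gap to the car ahead and pick a speed that holds a safe time headway, never exceeding the driver’s set speed. A toy sketch, with all constants invented for illustration:

```python
# Toy sketch of traffic-aware cruise control as a proportional controller.
# All constants are invented; a real system fuses radar and cameras and
# uses far more careful control logic than this.

SET_SPEED = 30.0    # driver's chosen cruise speed, m/s (~67 mph)
HEADWAY_S = 2.0     # desired gap to the lead car, in seconds
GAIN = 0.5          # how aggressively to correct gap errors


def target_speed(own_speed: float, gap_m: float, lead_speed: float) -> float:
    """Choose a speed that keeps a 2-second gap, capped at the set speed."""
    desired_gap = own_speed * HEADWAY_S
    # Gap too small -> drop below the lead car's speed; too big -> close in.
    correction = GAIN * (gap_m - desired_gap)
    return min(SET_SPEED, lead_speed + correction)


# Lead car doing 25 m/s only 40 m ahead while we do 30 m/s: brake hard.
print(target_speed(30.0, 40.0, 25.0))  # 15.0
# Same lead car 80 m ahead: we stay capped at our set speed.
print(target_speed(30.0, 80.0, 25.0))  # 30.0
```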

Tesla has also provided its “Full Self-Driving” beta software to “a limited number of customers.” The beta test, however, has shown its limitations: testers have posted more than a dozen instances where they had to override the software. It’s arguable that self-driving software that has to be babysat is more dangerous than just driving yourself.

In 2018, Elaine Herzberg was killed when one of Uber’s self-driving test cars failed to recognize her as a jaywalking pedestrian. The safety driver was not fast enough to override; likely they were trusting the software a bit too much…in fact, they were watching a video, not the road. I ride horses, and trusting your horse either too much or too little can have consequences that range from “not pretty” to “dangerous.”

In other words, we have to get this software perfect before we can widely distribute it, because humans will trust it too much. Right now, it’s not legal in most places to operate a self-driving vehicle without a licensed driver ready to override. Which means their full potential cannot be realized.

Waymo has introduced autonomous taxis in some places, but you have to be screened to use them, and they often still have, yup, a human driver.

We have made a lot of progress. The biggest reason 2020 is here and self-driving cars aren’t, though, is the sheer amount of data needed to train these AIs.

And the fact that we can’t trust AIs to tell the difference between English and Dutch. Yet.

Written by Jennifer R. Povey

Freelance writer, freelance editor, novelist and short story writer. Jack of many trades. https://www.jenniferrpovey.com/
