Every day there’s more exciting news about self-driving automobiles. Unfortunately, much of it is bad. That’s to be expected when testing radical new technology in complicated, real-life environments. The shift from horse-drawn carriages to Model T Fords was not easy, and it was dangerous, too. Today, we don’t remember the buggy-and-car accidents, the horses stampeded by cars, and so on. Instead, we worry about today’s problems: congestion, pollution, distracted driving, and the rest.

Tomorrow will likely be much the same. For now we worry about self-driving cars not seeing things, about hackers, and about crashes. These are all real problems that must be solved. However, once these dream auto-pilot vehicles finally arrive, we may have to worry instead about people playing chicken with them, whole fleets being hacked, hoaxers deliberately causing mass crashes and messes, and traffic congestion caused by cars endlessly looping around while waiting for their drivers. As for where things stand now, perhaps these recent headlines are a signal to slow down and enjoy the drive.

Tricking Teslas to Speed

Security researchers fooled multiple Tesla cars into speeding up by 50 miles per hour. They put a 2″ piece of tape on a 35 MPH speed limit sign, making it look enough like an 85 MPH sign that the Mobileye EyeQ3 vision system on the cars decided they should speed up. If a small piece of tape can dangerously disrupt traffic, how do we expect cars to deal with freeways full of billboards, graffiti, trash, and stickers?

Ghost Images Trick Driverless Cars to Stop

Just as unexpectedly speeding up can cause accidents, so can stopping suddenly. Researchers caused both fully- and semi-autonomous vehicles to stop by projecting an image of a person onto the road in front of them. Other projection tricks included a stop sign on a tree and fake lane markings on the street. This represents a fundamental flaw in the object detectors, as opposed to a bug or edge case that can be patched with a quick bit of coding. On the other hand, are human-controlled cars much more resilient to this type of attack? A convincing image of a person in the road in front of my car would cause me to freak the heck out and swerve or slam on the brakes. Even an unconvincing one probably would!
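To see why a quick patch won’t fix this, consider what the detector actually has to work with: pixels. Here’s a minimal sketch (our own illustration, not the researchers’ code) using an off-the-shelf torchvision detector; the filename is made up, and the point is simply that a bright, person-shaped patch of projected light can clear the same confidence threshold a flesh-and-blood pedestrian does.

```python
# Sketch only: run a stock object detector on a photo containing a projected
# "phantom" person. Assumes torch/torchvision are installed and that
# "projected_person.jpg" (a hypothetical filename) is your own test image.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Load a pretrained detector and put it in inference mode.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

image = Image.open("projected_person.jpg").convert("RGB")
with torch.no_grad():
    detections = model([to_tensor(image)])[0]

# COCO label 1 is "person". The model has no notion of depth or substance,
# so a convincing projection scores just like a real pedestrian would.
for label, score in zip(detections["labels"], detections["scores"]):
    if label.item() == 1 and score.item() > 0.8:
        print(f"person detected with confidence {score.item():.2f}")
```

There’s no field in that output that says “this is only light on asphalt,” which is the researchers’ point: filtering phantoms out requires extra context (depth, physics, cross-checking sensors), not a tweak to the existing detector.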

Driver Stranded When Car Can’t Call Home

There will likely be many more stories like this as self-driving cars make their way onto the back roads. A woman rented a car from a car-sharing service called GIG Car Share in the Bay Area for a trip into the mountains. When she ventured out of cell phone signal range, the car stopped and refused to start again. Luckily her cell phone used a different mobile carrier from the car, so she was able to call for help. The first “help” offered by the company was that she should spend the night in the car and see if it worked in the morning. She was eventually able to get a tow truck to take the car back to civilization. It took two tow trucks, though, as the car still didn’t work after it was first towed into an area with a good mobile signal. By that point the problem was software: the car had to be rebooted before it could be used again. GIG Car Share does offer the option of an “RFID Tag” to make the cars usable outside of cell signal range, but you have to order the tag and wait for it to arrive, which doesn’t quite square with the sign-up-and-drive-instantly image the company promotes.

Rental Car Still Connected to Previous User

This sort of thing happens much more often than one might think – even to people here. A man rented a Ford Expedition from Enterprise and paired it with the FordPass app on his phone. FordPass lets you remotely start or unlock the vehicle. After returning the car, he didn’t think to un-pair it from his phone. Five months later, he discovered that the app was still paired with that Expedition, and that he could still start the engine, turn it off, lock and unlock the doors, and even track its location.