7 Terrifying Scenarios Self-Driving Cars Make Possible

Bertel King 04-09-2017

Google, Tesla, Uber, and traditional car manufacturers are all developing self-driving cars. The future, they believe, will be one where we no longer drive cars from one place to another. Instead, we merely tell them where to go. The implications could fundamentally change the nature of transportation.


This can sound alluring, especially to people who never fostered a love for driving or who live in areas where the traffic could suck the soul out of anyone. But as is often the case, there are unintended consequences to watch out for. Self-driving cars, like flying cars before them, could turn out to be too dangerous or too difficult to replace our current ways of moving bodies around.

Some of the risks inherent with self-driving cars range from physically dangerous to morally questionable. Here are seven potential dangers lurking in a self-driving future.

1. Self-Driving Abductions

Imagine this. You’re on your way home from work and stop at a traffic light, the same one as usual. Then you feel a gun pressed against the back of your neck. Someone was hiding in the back of your car. How they got in, you have no idea. They start barking out directions, and you drive where they say, not sure if you will live through the day.

Now imagine there’s no gun. There isn’t even someone else in the car. You stop at the traffic light, but after the light turns green, your car turns in the direction away from home. You try to correct, but nothing has any effect. The car seems to have a mind of its own. Next thing you know, the car is speeding down the interstate and you’re already miles outside of your comfort zone.

Back home, your loved one receives a phone call asking for a ransom. Or the perpetrators could have something far worse in mind.


Don’t think this could happen? Researchers have already demonstrated how it’s possible to hack internet-connected vehicles that are in use today. Modern cars are computers on wheels, with various components communicating over a small internal network called the CAN bus. And these aren’t even self-driving cars. Consider what hackers could do once even more aspects of the vehicle are controlled by software.
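To make the attack surface concrete, here is a minimal sketch of the 16-byte frame layout that messages on a Linux SocketCAN bus use. The arbitration ID and payload below are made up for illustration; real vehicles use manufacturer-specific IDs that attackers reverse engineer. Anyone who can inject frames like this onto the bus can impersonate legitimate components:

```python
import struct

def build_can_frame(arbitration_id: int, data: bytes) -> bytes:
    """Pack a classic SocketCAN frame: 4-byte CAN ID, 1-byte data
    length, 3 bytes of padding, then an 8-byte data field."""
    if len(data) > 8:
        raise ValueError("classic CAN payloads are at most 8 bytes")
    return struct.pack("<IB3x8s", arbitration_id, len(data), data)

# Hypothetical frame: ID 0x123 and payload bytes are invented here.
frame = build_can_frame(0x123, b"\x01\xff")
print(len(frame))  # a classic can_frame struct is 16 bytes
```

The unsettling part is what's missing: classic CAN has no sender authentication, so the bus trusts any node that can put a well-formed frame on the wire.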

2. Deliberate Traffic Accidents

Maybe it’s not your vehicle that gets compromised. It could be the one in oncoming traffic. One minute it’s speeding along in a straight line, then its lights go off and you lose sight of it. The next thing you know, the car has crossed the median and is staring you in the face.

Say you’re a driver a couple cars back. You witness the wreck, but you’re powerless to do anything about it. Your vehicle doesn’t respond to commands and plows into the wreckage.

Traffic comes to a halt. People are dead, no one’s moving, and commerce is stalled. There are ambulances that can’t get to the hospital, tractor trailers that can’t deliver food to the grocery store, and delivery trucks that will no longer arrive within the expected two-day window.


The person responsible isn’t anywhere in sight. Whether they wanted to target a specific person or merely wanted to cause carnage, no one knows.

3. Remote Terrorism

As recent attacks in Berlin, Nice, London, Charlottesville, and Barcelona have reminded us, vehicles can make for deadly weapons. They’re easy to acquire and hard to stop. People can be on the lookout for guns… but cars?

While effective, this type of attack has limitations. It requires a person to drive the vehicle, and that person risks bodily harm. Involving multiple cars requires multiple drivers, all of whom need to be in the area.

Self-driving cars get around these limitations. Yes, hacking a car takes more knowledge than driving one, but once someone develops that skill, the options are frightening. They could remotely access an internet-connected car on the other side of the globe and run it into pedestrians or into a building. Multiple cars could strike at once. Incidents could take place in cities all over the world simultaneously.


Attacks don’t even need to be so drastic. Delaying traffic by 15 minutes in every major U.S. city could have a dire effect on the country’s GDP, and that could be done by remotely telling vehicles to hit the brakes.
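The economics are easy to sketch. Here is a back-of-envelope estimate; every figure (commuter count, value of an hour) is a hypothetical round number chosen for illustration, not sourced data:

```python
# Back-of-envelope cost of a 15-minute nationwide delay.
# All inputs are made-up round numbers, not real statistics.
commuters = 100_000_000       # hypothetical number of daily U.S. commuters
delay_hours = 15 / 60         # a 15-minute delay
value_per_hour = 25           # hypothetical dollar value of an hour

daily_cost = commuters * delay_hours * value_per_hour
print(f"${daily_cost:,.0f} lost per day")
```

Even with deliberately conservative inputs, a small delay multiplied across an entire country adds up to hundreds of millions of dollars a day.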

We’re already seeing cybercriminals steal more money online than they could ever hope to physically haul from a bank. A future with self-driving vehicles could enable a handful of terrorists to fund an organization and take innocent lives from the comfort of a couch. With cars talking to one another, the number of networks available to exploit is virtually endless.

4. Data Collection and Theft

Google surely sees a big opportunity with self-driving cars. The advertising company collects in-depth data about us in order to deliver hyper-targeted ads. Self-driving cars could ramp the search giant’s data collection up a notch.

Using Google Maps provides Google with a good look at where you live and where you go. Android takes the tracking further, allowing Google to draw a detailed map of where you go each day. But phone data can be ambiguous. Are you on foot? Are you biking? Were you riding with a friend? A self-driving car’s logs would show every place you drive to, which is exactly the kind of specific data many advertisers would love.
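To see how little cleverness it takes to turn trip logs into personal facts, here is a toy sketch that guesses "home" from a list of trips. The log format, hours, and destination labels are all invented for this example:

```python
from collections import Counter

def infer_home(trips):
    """Guess 'home' as the most common destination of late-night trips.
    `trips` is a list of (hour_of_day, destination) pairs, a simplified
    stand-in for the kind of trip log a self-driving car would keep."""
    night = [dest for hour, dest in trips if hour >= 21 or hour <= 5]
    return Counter(night).most_common(1)[0][0] if night else None

# A hypothetical week of trips; destinations are labels, not real places.
log = [(8, "office"), (22, "elm st"), (8, "office"),
       (23, "elm st"), (12, "cafe"), (22, "elm st")]
print(infer_home(log))  # → elm st
```

The same three-line idea, applied to daytime trips, yields your workplace; applied to weekends, your habits. No machine learning required.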


The privacy implications alone are enough to scare many of us, but given the number of smartphones already tracking people’s locations, this clearly hasn’t yet become a widespread societal concern. But the risk remains of this detailed data portfolio falling into unwanted hands, especially when you consider that Google isn’t the only game in town.

While Google may have billions to throw at maintaining security, this isn’t an area car companies have much experience in. Would you trust Ford, Nissan, or Volkswagen’s servers not to get infiltrated some day?

5. Software Failure

Software has bugs. Even mission-critical applications screw up. There’s no such thing as code that is 100 percent reliable. There are simply too many variables, including how code interacts with failing physical hardware. That’s especially relevant in cars, where parts experience far more wear and tear than your average desktop.

What do self-driving cars do when software fails? Currently, they fall back to manual controls. Someone needs to be behind the wheel just in case things go wrong. Right now, that seems like a reasonable degree of safety, though it hasn’t stopped one driver from dying due to computer error already.
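That fallback can be sketched as a simple watchdog. The pattern below is a toy, not any manufacturer's actual design: if the autonomy software stops reporting in, control passes to the human, and if no one is there to take over, the car begins a controlled stop.

```python
import time

class DrivingWatchdog:
    """Toy fail-safe sketch: if the autonomy software misses its
    heartbeat deadline, hand control to the human driver, or fall
    back to a controlled stop when no driver is present."""

    def __init__(self, timeout: float = 0.5):
        self.timeout = timeout
        self.last_heartbeat = time.monotonic()
        self.mode = "AUTONOMOUS"

    def heartbeat(self):
        # Called periodically by the (healthy) driving software.
        self.last_heartbeat = time.monotonic()

    def check(self, driver_present: bool) -> str:
        if time.monotonic() - self.last_heartbeat > self.timeout:
            self.mode = "MANUAL" if driver_present else "SAFE_STOP"
        return self.mode

wd = DrivingWatchdog(timeout=0.1)
time.sleep(0.2)  # simulate the driving software hanging
print(wd.check(driver_present=False))  # → SAFE_STOP
```

Notice the hard question hiding in one line: when `driver_present` is false, the best the system can do is stop, which is itself dangerous at highway speed.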

Look to the future. Will all cars have steering wheels? We’ve seen concept models that look nothing like traditional vehicles. Instead, they’re small rooms you can rest or work in while the car whisks you off somewhere else. Will you need to be a programmer to interact with the vehicle?

Even if the car has a steering wheel, that’s no good if you no longer know how to use it. One benefit of self-driving cars is the ability to let us continue to get around after we’re too old to drive. That would mean we’re not fit to take over in the event of a failure, either. It’s feasible that, decades after self-driving cars become mainstream, young drivers will no longer need to take driving tests. A generation of drivers would then come up with zero driving experience. Their fates would be left entirely to the software.

6. Environmental Hazards

Autonomous vehicles rely on sensors to read the world around them. Lasers and cameras are two of the technologies a self-driving car uses to see. When this hardware fails, the car’s ability to drive is significantly diminished. Nothing has to break for this to happen, either. Sure, a rock could bounce off the road and break a sensor. But an ice storm can also freeze over all of the equipment. What then?

Then there’s the unpredictability of human-made environments. A few sloppily placed stickers are all it takes to confuse an autonomous vehicle and cause an accident. Researchers at the University of Washington demonstrated this in a paper titled “Robust Physical-World Attacks on Machine Learning Models.” An autonomous vehicle’s classifier consistently misread the stop sign below as a Speed Limit 45 sign:

[Image: a stop sign covered with sloppily placed stickers, which the classifier read as Speed Limit 45]

Not good.

And who’s to say whether those stickers were an accident or a prank? Maybe future models will be smart enough to overcome such shortcomings. Perhaps. But keep in mind that these vehicles don’t think for themselves. They navigate the road based on how software engineers program them. If someone understands this programming, or the assumptions behind it, then they understand how to manipulate it.
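The sticker attack exploits exactly this: classifiers can be pushed over a decision boundary by small, deliberate input changes. The toy linear "classifier" below (all weights and feature values invented for this sketch; real sign classifiers are deep networks) shows the idea behind gradient-sign attacks in miniature:

```python
# Toy illustration of why small input changes can flip a classifier.
# The model, weights, and "image" features are all invented.

weights = [0.9, -0.4, 0.7]  # hypothetical learned weights

def classify(x):
    # A score above zero means "stop sign."
    score = sum(w * xi for w, xi in zip(weights, x))
    return "stop" if score > 0 else "speed limit 45"

x = [0.5, 0.2, 0.3]      # a "clean" image as a tiny feature vector
print(classify(x))        # → stop

# Nudge each feature slightly *against* its weight, which is the core
# idea of gradient-sign attacks: a small change, a different answer.
eps = 0.4
x_adv = [xi - eps * (1 if w > 0 else -1) for w, xi in zip(weights, x)]
print(classify(x_adv))    # → speed limit 45
```

The perturbation never needs to look meaningful to a human. Stickers on a sign play the role of `x_adv` in the physical world.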

7. Dispassionate Ethical Decisions

Say you’re in a situation where you either run over a pedestrian or veer off the road, possibly killing yourself. Many of us will instinctively do the latter. We don’t want to kill anyone, and we wouldn’t be able to live with ourselves if we did. Depending on the circumstances, we could end up facing charges as well.

If you die as a result, it’s a tragedy. Family members might blame the pedestrian. Or they may know that you died trying to do the right thing in an awful situation.

In the case of a self-driving car, everything is different. The choice is no longer yours to make. If the car kills the pedestrian, you’re left feeling guilt similar to what you would have felt if you were driving. Someone died because of your vehicle. If the car veers away and you manage to survive, you may feel compelled to blame the car manufacturer (or the software developer, if the software comes from a company like Google) for your injuries. If you die, family members may blame the self-driving car for killing their loved one, not the situation.

What should the car do? That’s a difficult question to answer. Sometimes the logical choice isn’t the best choice. That alone could be one reason to leave the lid on Pandora’s box.
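Whatever the answer, someone ultimately has to write it down as code. A deliberately crude sketch, with invented options and probabilities, makes the discomfort obvious:

```python
def choose_action(options):
    """Pick the option with the lowest expected casualties. The inputs
    below are invented; the point is that *someone* has to write this
    function, and every version of it encodes a moral judgment."""
    return min(options, key=lambda o: o["p_fatal"] * o["people_at_risk"])

# Hypothetical dilemma: stay on course (endangering a pedestrian)
# or swerve (endangering the occupant). Probabilities are made up.
options = [
    {"name": "stay course", "p_fatal": 0.9, "people_at_risk": 1},
    {"name": "swerve",      "p_fatal": 0.5, "people_at_risk": 1},
]
print(choose_action(options)["name"])  # → swerve
```

Change one probability and a different person dies. That is the design problem in a nutshell: the math is trivial, and the judgment baked into it is anything but.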

Would You Feel Safe in a Self-Driving Car?

Whenever we ride with a friend, hop into a taxi, or enter a public bus, we’re trusting another human with our life. Self-driving cars require even more trust. We have to trust the software to be competent and reliable. We trust the software not to fail.

We trust that the manufacturer isn’t hoarding our location data and selling it to interested third parties. We trust those same companies to take cybersecurity seriously, because we’re trusting them to keep our information safe from hackers and our vehicles safe from remote attack.

Do you trust a self-driving car to keep you safe? Would you like to see a self-driving bus shuttle your kids to school? How about driving beside a driverless gas tanker? Or would you feel more comfortable not having to be the person in control? I’m interested in your thoughts. Please share them in the comments below!

Image Credit: Skreidzeleu





  1. earl
    January 11, 2018 at 4:18 pm

    Two points:
    1. AI will not be able to duplicate the function of the human brain, at least not in the foreseeable future.
    2. No computer connected to a WAN is immune to being compromised.

Driverless cars are a non-starter.

  2. anthony
    October 21, 2017 at 3:46 pm

I'll take the computer driving over people any day. The streets are overcrowded and many can't drive.

  3. Doc
    September 6, 2017 at 2:14 pm

    Don't forget Grandma driving without her glasses, or Grandpa falling asleep behind the wheel.
    Oh wait - self-driving cars will eliminate over-aged drivers!

  4. dragonmouth
    September 5, 2017 at 8:51 pm

    "2. Deliberate Traffic Accidents"
    Will ambulances and other emergency vehicles be autonomous or will they have drivers?

    "3. Remote Terrorism"
    You did not mention loading up a driverless car with explosives and guiding it to the target. Timothy McVeigh controlling the truck from Tulsa or New York. OTOH, there is no eternal glory of martyrdom for a remote car bombing.

    "4. Data Collection and Theft"
    I wouldn't trust ANYBODY'S servers. Google's servers haven't been hacked.....YET! At least not that we know of.

    "5. Software Failure"
It is impossible to make software 100% reliable. No way, no how! Anybody who says different does not know what they are talking about. Even a quality testing team of thousands of programmers cannot find all the bugs because they cannot test for EVERY eventuality. All I have to do is mention Microsoft and Windows. When there are tens or hundreds of millions of autonomous cars driving tens of thousands of miles under all conditions, bugs are GUARANTEED to show up. How serious the effect of those bugs can be is anybody's guess.

It's a proven fact that mutations in humans and animals can be caused by stray radiation. The same radiation can cause 'mutations' in computer/AI firmware. Granted, the probability of that happening to a single car is infinitesimal. BUT as the number of cars and the number of miles driven by those cars increases, so does the chance for a 'mutation'.

    "6. Environmental Hazards"
    Pilots have been blinded during landings by miscreants with laser pointers. Autonomous cars can be similarly blinded. Since cars do not leave the ground, the blinding can occur at any point in their trip and at speed.

    The solution to defacing of signs could be 'virtual' signs. The location and type (stop, yield, speed limit, etc.) of signs can be programmed into the car so that no visual recognition is needed. That information is already available from mapping services (Google Maps, Mapquest, etc)

    "Would You Feel Safe in a Self-Driving Car?"
    Only if it was sitting up on blocks, in a locked garage, with the motor removed. Otherwise, NEVER, under any circumstances.

  5. Aga
    September 5, 2017 at 2:47 am

    I think that this type of thing can happen in current cars. Like the article says "you get into your car, and feel a gun pressed against your neck". I'd say it probably has the same chance of happening, if not more often. Cars are now used as terrorist tools. I don't think driver-less cars will make it any easier.

    Most of these can be avoided by installation of a mechanical kill switch accessible to the driver. If there isn't one already.

    On the same train of thought - there are clear situations where driverless cars save lives by their nature. If a person passes out, has a heart attack or, as so often happens, falls asleep on a long distance road or decides to drive drunk.

    These cars could save the driver and other road users in these situations.

  6. bobpat56
    September 5, 2017 at 1:34 am

    One problem will be added congestion. When parking is too expensive or inconvenient, a self-driving car will be told to orbit the area and return at a particular time. That will leave more cars on the road and cut into parking revenue, too.

  7. Max
    September 4, 2017 at 9:03 pm

    Most of these problems could be caused by proprietary software, I think exploits like slowing down the passenger to see the new immobile launch on their route, or even changing a route with the intention of showing an advertisement!
    First of all this artificial intelligence must be OPENSOURCE. And with the compile keys to check to avoid scams.

    • Doc
      September 6, 2017 at 2:15 pm

      What in heck are "compile keys to check to avoid scams"? You mean code signing keys? Or pressing CTRL+F6 to compile your code? Would help if I knew (or YOU knew) what you were talking about...