Google, Tesla, Uber, and traditional car manufacturers are all developing self-driving cars. The future, they believe, will be one where we no longer drive cars from one place to another. Instead, we merely tell them where to go. The implications could fundamentally change the nature of transportation.

This can sound alluring, especially to people who never fostered a love for driving or who live in areas where the traffic could suck the soul out of anyone. But as is often the case, there are many unintended consequences to watch out for. Self-driving cars, like flying cars before them, could end up as a concept too dangerous or too difficult to ever replace our current methods of moving bodies around.

The risks inherent in self-driving cars range from the physically dangerous to the morally questionable. Here are seven potential dangers lurking in a self-driving future.

1. Self-Driving Abductions

Imagine this. You're on your way home from work and stop at a traffic light, the same one as usual. Then you feel a gun pressed against the back of your neck. Someone was hiding in the back of your car. How they got in, you have no idea. They start barking out directions, and you drive where they say, not sure if you will live through the day.

Now imagine there's no gun. There isn't even anyone else in the car. You stop at the traffic light, but after the light turns green, your car turns away from home. You try to correct course, but nothing you do has any effect. The car seems to have a mind of its own. Next thing you know, it's speeding down the interstate and you're already miles outside of your comfort zone.

Back home, your loved one receives a phone call asking for a ransom. Or the perpetrators could have something far worse in mind.

Don't think this could happen? Researchers have already demonstrated how to hack internet-connected vehicles on the road today. Modern cars are computers on wheels, with various parts communicating over a small internal network called the CAN bus. And these aren't even self-driving cars. Consider what hackers could do once even more aspects of the vehicle are controlled by software.
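To make the "computers on wheels" point concrete, here's a minimal sketch using the open-source python-can library against a virtual Linux test bus (vcan0). The arbitration ID and data bytes are invented for illustration, not taken from any real vehicle. Notice what a CAN frame doesn't contain: no sender identity, no authentication, no encryption. Every component implicitly trusts every frame it hears.

```python
import can

# Open a connection to a virtual CAN bus (SocketCAN, Linux only).
bus = can.Bus(interface="socketcan", channel="vcan0", receive_own_messages=True)

# A CAN frame is just an arbitration ID plus up to 8 data bytes.
# Nothing in it identifies, let alone authenticates, the sender.
msg = can.Message(arbitration_id=0x244, data=[0x00, 0x32], is_extended_id=False)
bus.send(msg)

# Every node on the bus sees every frame and trusts the ID alone,
# which is why one compromised component can speak for any other.
print(bus.recv(timeout=1.0))
bus.shutdown()
```

That design made sense when the bus was a closed wiring harness. It's far less comforting now that the same bus can sit only a few software layers away from a cellular modem.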

2. Deliberate Traffic Accidents

Maybe it's not your vehicle that gets compromised. It could be the one in oncoming traffic. One minute it's speeding along in a straight line, then its lights go off and you lose sight of it. The next thing you know, the car has crossed the median and is staring you in the face.

Say you're a driver a couple cars back. You witness the wreck, but you're powerless to do anything about it. Your vehicle doesn't respond to commands and plows into the wreckage.

Traffic comes to a halt. People are dead, no one's moving, and commerce stalls. Ambulances can't get to the hospital, tractor trailers can't deliver food to the grocery store, and delivery trucks will no longer arrive within the expected two-day window.

The person responsible isn't anywhere in sight. Whether they wanted to target a specific person or merely wanted to cause carnage, no one knows.

3. Remote Terrorism

As recent attacks in Berlin, Nice, London, Charlottesville, and Barcelona have reminded us, vehicles can make for deadly weapons. They're easy to acquire and hard to stop. People can be on the lookout for guns... but cars?

Effective as these attacks are, they have limitations. They require a person to drive the vehicle, and that person risks bodily harm. Involving multiple cars requires multiple drivers, all of whom need to be in the area.

Self-driving cars get around these limitations. Yes, hacking a car takes more knowledge than driving one, but once someone develops that skill, the options are frightening. They could remotely access an internet-connected car on the other side of the globe and run it into pedestrians or into a building. Multiple cars could strike at once. Incidents could take place in cities all over the world simultaneously.

Attacks don't even need to be so drastic. Adding 15 minutes to commutes in every major U.S. city could have a dire effect on the country's GDP, and that could be done simply by remotely telling vehicles to hit the brakes.

We're already seeing cybercriminals steal more money online than they could ever hope to physically haul from a bank. A future with self-driving vehicles could enable a handful of terrorists to fund an organization and take innocent lives from the comfort of a couch. With cars talking to one another, the number of networks available to exploit is virtually endless.

4. Data Collection and Theft

Google surely sees a big opportunity with self-driving cars. The advertising company collects in-depth data about us in order to deliver hyper-targeted ads. Self-driving cars could ramp the search giant's data collection up a notch.

Using Google Maps gives Google a good look at where you live and where you go. Android takes the tracking further, allowing Google to draw a detailed map of your daily movements. But phone data can be ambiguous. Are you on foot? Are you biking? Were you riding with a friend? A self-driving car's data would show every place you drive to, which is exactly the kind of specific data many advertisers would love.

The privacy implications alone are enough to scare many of us, though given the number of smartphones already tracking people's locations, this clearly hasn't become a widespread societal concern yet. Still, there's the risk of this detailed data portfolio falling into unwanted hands, especially when you consider that Google isn't the only game in town.

While Google may have billions to throw at maintaining security, this isn't an area where car companies have much experience. Would you trust Ford's, Nissan's, or Volkswagen's servers not to get infiltrated someday?

5. Software Failure

Software has bugs. Even mission-critical applications screw up. There's no such thing as code that is 100 percent reliable. There are simply too many variables, including how code interacts with failing physical hardware, something especially relevant in cars, where parts experience far more wear and tear than your average desktop.

What do self-driving cars do when software fails? Currently, they fall back to manual controls: someone needs to be behind the wheel just in case things go wrong. Right now, that seems like a reasonable degree of safety, though it has already failed to prevent at least one driver's death from computer error.

Now think further into the future. Will all cars have steering wheels? We've seen discussion of models that look nothing like traditional vehicles. Instead, they're small rooms that you can rest or work in while the car whisks you off somewhere else. Will you need to be a programmer to interact with the vehicle?

Even if the car has a steering wheel, that's no good if you no longer know how to use it. One benefit of self-driving cars is that they let us keep getting around after we're too old to drive, but that also means we're not fit to take over in the event of a failure. It's feasible that, decades after self-driving cars become mainstream, young people would no longer need to take driving tests. A generation would then grow up with zero driving experience, their fates left entirely to the software.

6. Environmental Hazards

Autonomous vehicles rely on sensors to read the world around them. Lasers (lidar) and cameras are two of the technologies a self-driving car uses to see. When this hardware fails, the car's ability to drive is significantly diminished. Nothing has to break for this to happen, either. Sure, a rock could bounce off the road and crack a sensor. But an ice storm can also freeze over all of the equipment. What then?

Then there's the unpredictability of human-made environments. A few sloppily placed stickers are all it takes to confuse an autonomous vehicle and cause an accident. Researchers at the University of Washington demonstrated this in a paper titled "Robust Physical-World Attacks on Machine Learning Models." An autonomous vehicle's classifier consistently misread the stop sign below as a Speed Limit 45 sign:

[Image: the sticker-covered stop sign from the study]

Not good.

And who's to say whether those stickers were an accident or a prank? Maybe future models will be smart enough to overcome such shortcomings. Perhaps. But keep in mind that these vehicles don't think for themselves. They navigate the road based on how software engineers program them. If someone understands this programming, or the assumptions behind it, then they understand how to manipulate it.
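The stickers in that study are a physical-world relative of a well-known digital attack, the fast gradient sign method (FGSM) described by Goodfellow et al. The sketch below is a minimal, generic FGSM in PyTorch, not the technique from the University of Washington paper; `model`, `image`, and `label` stand in for whatever classifier and batched input you supply.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, label, epsilon=0.03):
    """Fast Gradient Sign Method: nudge every pixel a tiny step in
    whichever direction most increases the classifier's loss."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # The perturbation is nearly invisible to humans, yet it can flip
    # the prediction, e.g. from "stop sign" to "speed limit".
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()
```

The unsettling part is how cheap this is: a single gradient computation tells an attacker exactly which way to push each pixel.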

7. Dispassionate Ethical Decisions

Say you're in a situation where you either run over a pedestrian or veer off the road, possibly killing yourself. Many of us will instinctively do the latter. We don't want to kill anyone, and we wouldn't be able to live with ourselves if we did. Depending on the circumstances, we could end up facing charges as well.

If you die as a result, it's a tragedy. Family members might blame the pedestrian. Or they may know that you died trying to do the right thing in an awful situation.

In the case of a self-driving car, everything is different. The choice is no longer yours to make. If the car kills the pedestrian, you're left feeling much the same guilt you would have felt behind the wheel: someone died because of your vehicle. If the car veers away and you manage to survive, you may feel compelled to blame the car manufacturer (or the software developer, if the software comes from Google, for example) for your injuries. And if you die, family members may believe the self-driving car killed their loved one, not the situation.

What should the car do? That's a difficult question to answer. Sometimes the logical choice isn't the best choice. That alone could be one reason to leave the lid on Pandora's box.

Would You Feel Safe in a Self-Driving Car?

Whenever we ride with a friend, hop into a taxi, or board a public bus, we're trusting another human with our lives. Self-driving cars require even more trust. We have to trust the software to be competent and reliable. We trust it not to fail.

We trust that the manufacturer isn't hoarding our location data and selling it to interested third parties. We trust those same companies to take cybersecurity seriously, because we're trusting them to keep our information safe from hackers and our vehicles safe from remote attack.

Do you trust a self-driving car to keep you safe? Would you like to see a self-driving bus shuttle your kids to school? How about driving beside a driverless gas tanker? Or would you feel more comfortable not having to be the person in control? I'm interested in your thoughts. Please share them in the comments below!

Image Credit: Skreidzeleu via Shutterstock.com