Written by Cameron DeWith
While searching for a topic to blog about, I stumbled upon self-driving vehicles during some research on hydrogen fuel cells. Because I see autonomous driving as the future and because I am interested in cars, I decided it would make a good topic for a blog post. Enjoy!

Introduction
In a study about the public’s views on autonomous vehicles, Daniel Howard and Danielle Dai found that people were “most attracted to potential safety benefits, the convenience of not having to find parking, and amenities such as multitasking while en route” (2013). These advantages of fully self-driving vehicles sound incredible, but they have yet to come to fruition. It will likely be at least another ten years before completely autonomous vehicles become available. Before discussing the barriers standing in the way of self-driving vehicles, it is important to understand how they work.
How They Work
Fully self-driving vehicles rely on sensors such as cameras and radar along with software (programmed ‘instructions’) to drive. The sensors map the vehicle’s surroundings while the software uses that information to plan routes, accelerate, brake, and steer. Many vehicles today are partially autonomous but unable to drive themselves in all situations. A scale ranging from 0 to 5 grades vehicle autonomy: Level 0 vehicles are completely human-controlled, while Level 5 vehicles are completely self-driving. In between these two extremes lie varying degrees of self-driving capability.
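To make the sense-plan-act idea and the 0-to-5 scale a bit more concrete, here is a rough Python sketch. It is only an illustration under my own assumptions: the level names follow the commonly used 0-to-5 scale, but `drive_one_step` and the `sensors`, `planner`, and `controls` objects are hypothetical placeholders, not code from any real vehicle.

```python
from enum import IntEnum


class AutonomyLevel(IntEnum):
    """The 0-to-5 autonomy scale described above."""
    NO_AUTOMATION = 0           # human does all the driving
    DRIVER_ASSISTANCE = 1       # e.g., adaptive cruise control OR lane keeping
    PARTIAL_AUTOMATION = 2      # steering and speed together, driver supervises
    CONDITIONAL_AUTOMATION = 3  # car drives itself only in limited conditions
    HIGH_AUTOMATION = 4         # no driver needed within a defined area/conditions
    FULL_AUTOMATION = 5         # self-driving everywhere, no human fallback


def drive_one_step(sensors, planner, controls):
    """One pass of a sense -> plan -> act loop.

    `sensors`, `planner`, and `controls` are hypothetical stand-ins for the
    camera/radar suite, the route-and-behavior software, and the
    accelerate/brake/steer actuators.
    """
    surroundings = sensors.build_map()   # cameras and radar map the scene
    plan = planner.decide(surroundings)  # software plans the route and maneuver
    controls.apply(plan.steering, plan.throttle, plan.brake)  # act on the plan
```

A real driving stack repeats a loop like this many times per second, but the point here is just the division of labor between sensing and software.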
Barriers
Why aren’t fully self-driving vehicles available yet? Well, there are multiple barriers that still stand in the way of a Level 5 autonomous vehicle.
- Navigating unpaved or snow-covered roads is very difficult for self-driving vehicles, which must accurately infer where they are allowed to drive without relying on lane markings.
- Self-driving vehicles must be capable of precisely recognizing surrounding hazards. This proves difficult with similar-looking objects that require drastically different responses, such as motorcycles and bicycles (a rough sketch of this ambiguity follows this list).
- Self-driving vehicles must make decisions in a wide variety of situations. They must ‘practice’ in order to learn how to navigate hazards and situations safely. Because there is effectively an infinite number of hazards and situations a vehicle could encounter on the road, perfect software for an autonomous vehicle becomes a near impossibility.
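To show why the look-alike problem in the second bullet is hard, here is a toy Python sketch. The detection scores and following-gap numbers are invented for illustration; a real perception and planning stack is vastly more complicated.

```python
# Hypothetical confidence scores for one object seen ahead.
detection = {"bicycle": 0.48, "motorcycle": 0.46, "pedestrian": 0.06}

# The two most likely labels look alike to the sensors but call for very
# different behavior: a motorcycle can accelerate to highway speed, a bicycle cannot.
FOLLOWING_GAP_SECONDS = {"bicycle": 3.0, "motorcycle": 2.0, "pedestrian": 4.0}

best_label = max(detection, key=detection.get)
margin = detection[best_label] - sorted(detection.values())[-2]

if margin < 0.10:
    # The classifier cannot confidently separate the look-alikes, so fall back
    # to the most cautious of the plausible responses.
    plausible = [label for label, score in detection.items()
                 if detection[best_label] - score < 0.10]
    gap = max(FOLLOWING_GAP_SECONDS[label] for label in plausible)
else:
    gap = FOLLOWING_GAP_SECONDS[best_label]

print(f"best guess: {best_label}, following gap: {gap} s")
```

Even in this toy version, an uncertain classification forces the planner to hedge; multiply that by every object on a busy street and the difficulty of the problem becomes clear.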
Ethical Issues
Ethical issues in the testing, implementation, and effects of self-driving vehicles might also stand in the way of autonomous vehicles becoming a reality in the near future.
First, the testing of autonomous vehicles requires driving on public roads. As Sam Abuelsamid of Navigant Research said, “when you’re testing autonomous vehicles on a public road, not only are the people riding in that car part of the experiment, but so is everybody else around you, and they didn’t consent to being part of an experiment” (2019). This calls into question how ethical it is to test self-driving vehicles on public roads when doing so could put other road users at risk.
Next, if autonomous vehicles were programmed to follow all the rules of the road, numerous accidents could potentially be avoided. However, if vehicle owners want to travel faster than the posted speed limit and manufacturers allow that capability in order to attract more customers, autonomous driving could end up no safer than human driving. This ethical issue in the implementation of autonomous vehicles highlights the potential conflict between safety and profit.
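As a purely hypothetical illustration of that implementation choice, here is a small Python sketch. The parameter names are made up, and no real manufacturer’s software is being quoted; it only shows how a single policy flag could separate “always follow the posted limit” from “let the owner add a margin.”

```python
def target_speed_kph(posted_limit_kph: float, owner_offset_kph: float,
                     allow_speeding: bool) -> float:
    """Return the speed the vehicle aims for on a road segment (illustrative only)."""
    if allow_speeding:
        # Manufacturer permits an owner-chosen margin above the posted limit.
        return posted_limit_kph + max(0.0, owner_offset_kph)
    # Strict policy: never exceed the posted limit.
    return posted_limit_kph


print(target_speed_kph(100.0, 15.0, allow_speeding=False))  # 100.0
print(target_speed_kph(100.0, 15.0, allow_speeding=True))   # 115.0
```

The whole ethical question lives in that one flag: whoever sets it is trading some of the safety benefit for customer appeal.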
Finally, self-driving vehicles could adversely affect millions of professional drivers by eliminating their jobs.
The Future
While fully autonomous vehicles are becoming more and more of a possibility, many technological barriers and ethical issues still stand in the way of a finalized product that would be entirely safe on the road. In ten years, the long-awaited self-driving car could be a reality. Until then, keep your eyes on the road!
Conclusion
I love the possibility of fully autonomous vehicles becoming a reality in the future because they could make our roads much safer to drive on. However, the technology cannot be pushed too far, too fast. Already, there have been multiple crashes involving Tesla’s Autopilot system. Many drivers have placed too much confidence in its autonomous features despite the system’s limited capabilities (Level 2 on the autonomy scale described above). Personally, I think companies should carry on developing and testing self-driving systems, but should hold off on releasing autonomous technology while there is a risk of it being misused beyond its capabilities.