Self-driving vehicles are meant to bring us closer to a zero-accident future, or at least to zero road deaths. That is the theory, at any rate, but not everyone is convinced. Self-driving vehicles will share public roads with humans, and humans are unpredictable, after all.
As you may be aware, or will be after reading this, human error is the leading cause of accidents on public roads. Usually, the driver of the vehicle is directly to blame: a moment of inattentiveness, or a poor estimate of time, speed, distance, or all three at once.
When they learn that some manufacturers are working on self-driving cars, some people are outraged, while others are overly excited. Those against the technology fear that we will eventually reach a sort of dystopia where nobody knows how to drive anymore, or nobody is allowed to drive, and robots will drive us wherever the authorities decide.
You get the point, right? People fear that these self-driving cars will be controlled by law enforcement or governments and that their freedom to drive wherever they want will disappear.
Meanwhile, some of those who are excited about self-driving vehicles are a bit too excited about the tech. An even smaller group among them gives too much credit to early vehicles that have only part of the technology required to genuinely drive themselves.
That excess of credit is a dangerous slope, as it could trigger an avalanche that affects everyone on the mountain, not just the wealthy amateur stunt person who jumped out of a helicopter at the top.
Again, in the latter case, the problem lies with humans and human nature. When you believe in something and want it to work, you will find a way to make it seem like it works. If it fails, you have a long list of excuses ready, while others line up for the great finger-pointing match.
Sadly, an SAE Level 3 autonomous driving system does not truly let a vehicle drive itself, but it does let some people believe that it does, no matter how many others say otherwise.
The situation described above is much like placebo medicine, apart from the fact that the manufacturer of the pills states on the label that they are not real medicine and do not cure or treat any illness.
Somehow, there is a group of people who claim that the label is there for legal reasons only and that “big pharma wants to keep you in the dark about the latest (placebo) medicine.” If a pill cannot make you feel worse, and some have felt better after taking it, it must be good, right? Nope, it does not work that way. Please speak to a real doctor before taking medication for whatever illness you have.
IAM RoadSmart, the self-described leading road safety charity in the UK, warns that self-driving vehicles will not guarantee safer roads. Instead, its representatives are urging government officials to prioritize driver training to ensure that roads become safer.
The organization has nothing against self-driving vehicles, which could be allowed on the country's roads as early as next year, but the human factor is what worries it most. The fear is that some drivers might become over-reliant on self-driving tech, while others will never improve the way they drive.
While both are essential issues if you ask us, the road safety charity also points out that the safety of those vehicles has not been validated through real-world testing on UK roads, and that current motorists are in no way prepared for the vehicles with self-driving features that will be allowed on the country's roads as early as 2023.
In other words, most people in the UK (and the rest of the world, if you ask us) are not prepared to handle a vehicle that can drive itself occasionally but then hands control back to the driver with just a visual and audible warning.
The organization is calling for a methodology for teaching people how to take back control of a vehicle when it stops driving itself while traveling at speed.
Yes, that may happen, and some people are not entirely ready for the switch, even though every system of this kind that we have encountered so far made resuming human control clear and smooth, taking mere seconds.