Driverless Cars Are Coming
Below, I have outlined some challenges and ethical questions that need to be addressed before we reach the goal of a car that is 100% driverless (no pedals and no steering wheel!).
Why should fleet managers be concerned with advances in driverless technology? Because these advances will drastically reduce the number of car accidents, and that should lower the cost of fleet motor insurance.
The race is on to create a vehicle that reaches Level 5 on the Society of Automotive Engineers (SAE) scale, which translates to a fully autonomous vehicle, or in plain English a vehicle that needs no human intervention. Level 0 is a car with no automation built into the three key driving elements (steering, braking and the gas pedal).
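To make the scale concrete, here is a minimal sketch in Python of the SAE J3016 levels as they are commonly summarised; the one-line descriptions are my paraphrase, not official SAE wording.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (paraphrased, not official wording)."""
    NO_AUTOMATION = 0           # human does all steering, braking and acceleration
    DRIVER_ASSISTANCE = 1       # one assist feature, e.g. adaptive cruise OR lane keeping
    PARTIAL_AUTOMATION = 2      # steering and speed together, driver must supervise at all times
    CONDITIONAL_AUTOMATION = 3  # car drives itself in limited conditions, driver must take over on request
    HIGH_AUTOMATION = 4         # no driver needed within a defined operating domain
    FULL_AUTOMATION = 5         # no driver needed anywhere: no pedals, no steering wheel

def needs_standby_driver(level: SAELevel) -> bool:
    """At Level 3 and below, a human must be ready to drive (or actively driving)."""
    return level <= SAELevel.CONDITIONAL_AUTOMATION

print(needs_standby_driver(SAELevel.PARTIAL_AUTOMATION))  # True
print(needs_standby_driver(SAELevel.FULL_AUTOMATION))     # False
```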
The Case for Smart Cars
Currently in the USA (and I am sure the figures are similar in the UK), 94% of accidents are the result of driver error. More than 3 million people have vision problems that prevent them from driving, and let's not forget senior citizens who are no longer able to drive. The journey to fully automated driving will drastically reduce road accidents and extend the right of mobility to many more people in our society.
Should cars be 100% autonomous?
This is still being debated, but in the meantime we have to answer some ethical questions. One of these is often referred to as the trolley problem. In this scenario a trolley is moving down a rail line with five people in its path. The only action that can avoid killing those five people is to switch tracks, but a child is standing on the other track. Can you see the dilemma?
Some questions that need to be addressed first
Now picture that dilemma knowing that these cars rely on deep learning, a form of artificial intelligence, to make decisions. With deep learning, neural networks are trained by human engineers and programmers to take the appropriate action when driving autonomously.
So one of the ethical questions is: what do you train the AI to do? Should we allow these kinds of life-and-death decisions to be made by artificial intelligence at all? Do we give equal weight to every human life, or should other factors, such as age, be considered? And should you, the driver, be sacrificed if that is the action the artificial intelligence deems necessary to minimise the overall harm of an accident?
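To show why "what do you train the AI to do?" is an engineering decision as well as a philosophical one, here is a deliberately over-simplified, hypothetical sketch in Python of a planner scoring candidate manoeuvres. Nothing here reflects any manufacturer's real system; the manoeuvre list and the weights are invented placeholders, and choosing those weights is exactly the ethical problem.

```python
# Hypothetical illustration only: a toy "harm score" a planner might minimise.
# The weights below are invented placeholders; deciding their values is the
# ethical question raised above, not something this sketch answers.

CANDIDATE_MANEUVERS = {
    # manoeuvre: (pedestrians at risk, occupants at risk)
    "brake_straight":      (5, 0),
    "swerve_to_verge":     (1, 0),   # risks one bystander (the "child on the other track")
    "swerve_into_barrier": (0, 1),   # risks the occupant instead
}

PEDESTRIAN_WEIGHT = 1.0   # placeholder: is every life weighted equally?
OCCUPANT_WEIGHT = 1.0     # placeholder: should the buyer's car protect them first?

def harm_score(pedestrians: int, occupants: int) -> float:
    return PEDESTRIAN_WEIGHT * pedestrians + OCCUPANT_WEIGHT * occupants

best = min(CANDIDATE_MANEUVERS, key=lambda m: harm_score(*CANDIDATE_MANEUVERS[m]))
print(best)  # 'swerve_to_verge' -- it ties with 'swerve_into_barrier' under equal weights
```

With equal weights the two swerves tie and the tie-break is arbitrary; nudge either weight and the car consistently spares one party over the other. That is the decision someone has to train into the system.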
Should drivers always be on standby?
If Level 5 is too much to contemplate, let's consider Levels 3 and 4. These cars can drive themselves under most conditions, but will alert the driver when an unavoidable accident is developing and hand back control, letting the driver take the appropriate action, so the artificial intelligence is not the one responsible for killing humans.
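As a rough illustration of what "alert the driver and hand back control" could look like in software, here is a hypothetical sketch; the escalation steps, timings and thresholds are invented placeholders, not any real manufacturer's logic.

```python
import time

# Hypothetical sketch of a Level 3 style takeover request. The grace period and the
# fallback manoeuvre are invented placeholders for illustration only.

TAKEOVER_GRACE_SECONDS = 10   # placeholder: how long the standby driver gets to respond

def request_takeover(driver_responded, sound_alert, haptic_alert, pull_over):
    """Escalate alerts; if the standby driver never responds, fall back to a safe stop."""
    sound_alert()                                   # first escalation: audible chime
    deadline = time.monotonic() + TAKEOVER_GRACE_SECONDS
    while time.monotonic() < deadline:
        if driver_responded():
            return "driver_in_control"
        haptic_alert()                              # second escalation: vibrate seat or wheel
        time.sleep(0.5)
    pull_over()                                     # nobody took over: minimal-risk manoeuvre
    return "vehicle_stopped"

# Example with a driver who never responds (blocks for the full grace period):
# request_takeover(lambda: False, lambda: print("chime"), lambda: None, lambda: print("pulling over"))
```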
Some road accidents will still happen
In the recent fatal Tesla accident the car was operating in Autopilot, Tesla's Level 2 driver-assistance system, so a standby driver was still required, and it would have notified the driver, by sound and if necessary by touch, that human intervention was needed. The accident is still being investigated to establish why the standby driver did not take action. These cars rely heavily on artificial vision, using radar, cameras and LiDAR in combination to create a 3D map of the driving environment. One theory is that one of these sensors malfunctioned, which may have been caused by something as simple as dirt on a lens.
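To see how a "dirty lens" could matter, note that overlapping sensors can be cross-checked against each other, and a missing or wildly inconsistent reading should push the car into a degraded mode. The sketch below is a hypothetical simplification; the sensor names, units and tolerance are assumptions for illustration only.

```python
# Hypothetical illustration of cross-checking overlapping sensors.
# Each sensor reports the distance (in metres) to the nearest obstacle ahead;
# None means the sensor returned no reading at all (e.g. a blocked or dirty lens).

DISAGREEMENT_TOLERANCE_M = 2.0   # placeholder threshold, not a real system's value

def cross_check(camera_m, radar_m, lidar_m):
    """Flag a possible sensor fault when readings are missing or inconsistent."""
    readings = {"camera": camera_m, "radar": radar_m, "lidar": lidar_m}
    missing = [name for name, value in readings.items() if value is None]
    if missing:
        return f"degraded: no reading from {', '.join(missing)}"
    spread = max(readings.values()) - min(readings.values())
    if spread > DISAGREEMENT_TOLERANCE_M:
        return f"degraded: sensors disagree by {spread:.1f} m"
    return "ok"

print(cross_check(34.8, 35.1, 35.0))   # ok
print(cross_check(None, 35.1, 35.0))   # degraded: no reading from camera
print(cross_check(12.0, 35.1, 35.0))   # degraded: sensors disagree by 23.1 m
```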
Fleets of the future and the cost of fleet insurance
Without doubt, cars will continue to get safer as these new technologies are applied to new models. Limits on driving hours will most likely be relaxed or removed, as driver tiredness becomes less of a factor in accidents. The big question for fleet managers is: will their fleet insurance costs be reduced? You would think so, but any decrease in premiums may be offset by the increased cost of vehicles fitted with this new technology. LiDAR is not cheap.
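One back-of-the-envelope way for a fleet manager to frame that trade-off is to compare the annual premium saving against the extra capital cost per vehicle, spread over its life. Every figure in the sketch below is an invented placeholder, there purely to show the arithmetic, not a prediction of real premiums or hardware prices.

```python
# Purely illustrative arithmetic -- every figure is an invented placeholder.

fleet_size = 50
current_premium_per_vehicle = 1200.0   # GBP per year (placeholder)
expected_premium_reduction = 0.30      # assume a 30% discount for automation (placeholder)
extra_vehicle_cost = 6000.0            # extra cost of sensors such as LiDAR (placeholder)
vehicle_life_years = 5

annual_premium_saving = fleet_size * current_premium_per_vehicle * expected_premium_reduction
annual_extra_capital_cost = fleet_size * extra_vehicle_cost / vehicle_life_years

print(f"Premium saving per year:     £{annual_premium_saving:,.0f}")      # £18,000
print(f"Extra vehicle cost per year: £{annual_extra_capital_cost:,.0f}")  # £60,000
```

With these placeholder numbers the extra hardware outweighs the premium saving, which is exactly the offset described above; plug in your own quotes and the picture may look very different.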
In my view, these technology costs will come down in the long term, as they do with any new technology, and we will all benefit from drastic reductions in road accidents, which in turn will reduce fleet insurance costs.