Just read another article about this on a car magazine website, together with the responses from readers. The usual kind of scenario: the car is driving you along the road and a child runs out in front. The only options are to hit the child or mount the pavement and hit a number of adults. This was followed by the usual discussion about the greater good and the need for programming to be consistent across manufacturers and to take account of what a human driver would do.
In all the articles/discussions I have read online, none has considered what seems to me the most likely thing a human driver would actually do. In such a situation you are not going to have time to weigh things up and make a rational, considered decision; most people will surely just panic, slam on the brakes and freeze until the car stops. Could an autonomous car be programmed to do that, or would that be morally wrong?
Autonomous cars are going to appeal greatly to those of us reaching the end of our driving careers who might be struggling to drive due to ill health, poor eyesight etc., and also to the disabled, none of whom will be able to take over control in an emergency.
I'll be interested to see what the government does about licensing to "drive" one of these things.
It's more than a pity: who's going to help me load and unload the luggage/shopping/wheeled walker/wheelchair? And who will take the longer but quicker route to the train station, avoiding the rail crossing when the barriers are down?