Who does an autonomous car run down?

  Old Deuteronomy 16:11 05 Sep 2016
Locked

Just read another article about this, on a car magazine website, together with the responses from readers. The usual kind of scenario: the car is driving you along the road and a child runs out in front. The only options are to hit the child or mount the pavement and hit a number of adults. This was followed by the usual discussion about the greater good and the need for programming to be consistent across manufacturers and take account of what a human driver would do.

In all such articles/discussions I have read online, none have considered what seems to me to be the most likely thing that a human driver would do. In such a situation you are not going to have time to weigh up the situation and make a rational, considered decision; most people will surely just panic, slam on the brakes and freeze until stopped. Could an autonomous car be programmed to do that, or would that be morally wrong?

  Forum Editor 16:29 05 Sep 2016

"Could an autonomous car be programmed to do that, or would that be morally wrong?"

Morality doesn't enter into it. An autonomous vehicle will be programmed to slow and stop if it senses an object - moving or otherwise - in its path. A calculation will take place, using speed and distance to impact. If there is time to slow without stopping the car will do so, allowing the driver to make a decision about further action. If there isn't time to allow the driver to decide, the car will stop. If it can't stop safely in the distance, it will try to take avoiding action, but it will not mount a pavement or steer into the path of oncoming vehicles in doing so.
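The slow / stop / avoid-within-lane logic described above can be sketched in a few lines. This is a purely illustrative toy, not any manufacturer's control code: the function name, thresholds, deceleration figure and reaction-time margin are all assumptions for the sake of the example.

```python
# Illustrative sketch of the decision described above: slow if there is
# ample margin, stop if there is just enough, otherwise take avoiding
# action within the lane. All numbers are assumed for illustration.

def plan_response(speed_mps: float, distance_m: float,
                  max_decel: float = 8.0,
                  reaction_time_s: float = 0.5) -> str:
    """Choose an action given current speed and distance to an obstacle."""
    # Total stopping distance = reaction distance + braking distance.
    stopping_distance = (speed_mps * reaction_time_s
                         + speed_mps ** 2 / (2 * max_decel))
    if distance_m > 2 * stopping_distance:
        return "slow"           # plenty of margin: decelerate, let driver decide
    if distance_m >= stopping_distance:
        return "stop"           # no time for driver input: brake to a halt
    return "avoid_in_lane"      # cannot stop in time: steer within the lane only

print(plan_response(13.9, 120.0))   # ~50 km/h, obstacle far ahead
print(plan_response(13.9, 25.0))    # obstacle close: brake to a stop
print(plan_response(27.8, 20.0))    # too fast, too close: avoid within lane
```

Note that nothing in the branching involves a moral judgement; it is pure kinematics plus a fixed priority order, which is the point the Forum Editor makes.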

Drivers are responsible for the vehicle at all times - saying that your car made the wrong decision in a given set of circumstances would not be a defence in court. Autonomous vehicles are not intended to make the driver entirely redundant - that wouldn't be possible at the current stage in the development of this technology.

  Bazzaman 16:34 05 Sep 2016

I read a similar article in a newspaper a couple of weeks ago. There were multiple choice answers (and, as usual, often none of them fitted for me). Clearly respondents would have had time to think it over, weigh up the pros and cons and then state their "choice" - like in the real world (I think NOT!).

So it seems to me the programmed solution does not reflect what would really happen (if driven by a person), but rather what a group of people think they might like to see in an idealised world.

  Fruit Bat /\0/\ 16:34 05 Sep 2016

Does not compute! - self destruct in 5 seconds. :0)

The whole point in self driving cars is to get rid of the "loose nut behind the wheel".

The first time a self driving car kills somebody (oh dear, it's already happened), will it be the programmer that is prosecuted?

Isaac Asimov wrote many stories in the '40s exploring the dilemmas of robots programmed with "the Three Laws".

  Old Deuteronomy 16:47 05 Sep 2016

Forum Editor,

I know you are right and agree with what you said. What got me thinking was the idea, which many people seem to have, that the car should make a moral judgement and do as, they suggest, a human driver would do, whilst completely ignoring what I believe is the fact that a human driver would not have the time to make such a judgement and would likely just panic and throw out the anchor.

  oresome 20:38 05 Sep 2016

Autonomous vehicles are not intended to make the driver entirely redundant

I note that since the recent fatality there has been a shift in terminology and hyperbole by the manufacturers.

I consider a halfway house dangerous. Inevitably, if the driver is doing very little they will lose concentration and not be as alert should an incident occur.

  OTT_B 21:46 05 Sep 2016

there has been a shift in terminology and hyperbole by the manufacturers.

Well, not exactly...there's never been any standardised language, and the hyperbole seems mostly to be coming from the sole company that's pushing the technology in the market hardest.....which happens to be the same company that has experienced some high profile incidents...just saying.....

But I agree (unofficially!) that there is a risk of loss of concentration from the driver due to on board systems doing the leg work.

That said and done, and back to the original post, the whole ethics issue has been blown out of all proportion. Cars have always been designed, built and sold with dozens of components which, if they fail, could kill an occupant or someone outside the vehicle. And any increased level of vehicle autonomy isn't going to change this. The question of whether a programmer is going to be asked to kill a child is total and utter nonsense. As FE correctly pointed out, the vehicles will be programmed to slow or safely evade if possible. If it isn't, then the driver is responsible. Think in terms of an aircraft autopilot - it doesn't matter if there's a problem, the pilot is still responsible, even if the problem is with the autopilot.....

But that's all for a vehicle that at least still has a steering wheel. Over the coming years (or much less), vehicles without one are going to happen, and when they do, you can absolutely expect that a vehicle that has a person in charge won't be allowed anywhere near them.

  Forum Editor 22:43 05 Sep 2016

Automatic systems have been used to land aircraft for some time, and they work well - in fact many experts believe they are far safer than humans. Not quite the same as a car, I agree, but the point is similar - the automated system doesn't 'think' or make judgements based on ethics.

The whole point of autonomous vehicles is to use technology that can react faster than a driver can, and react consistently - it will do the same thing time and time again and do it perfectly.

What has to happen is that the technology has to be refined - put simply, the algorithms have to be far better. We have all come to know that automatic cameras can produce excellent results almost all the time, but we understand that an accomplished photographer will beat the algorithms when it comes to creativity.

The same thing will always apply to autonomous vehicles - they will (or should) always fail safe, protecting the driver and passengers from harm whenever possible. In a word, they'll drive the car in a boring, predictable fashion whereas a skilled driver will do so with verve and imagination. He or she cannot guarantee to get from A to B unscathed, but the ultimate autonomous system can.

It will undoubtedly happen, but we're a long way from it at the moment.

  Fruit Bat /\0/\ 23:16 05 Sep 2016

It's not new, it's just the next step; they have been taking the skill out of driving and replacing it with electronic systems for years. Think auto gearbox, ABS, traction control, parking assist, and the simple ones like auto wipers, auto lights, lane warnings, blind spot warnings etc. etc.

  oresome 09:44 06 Sep 2016

The place for such systems is where the environment is much more controlled and railways would seem to me to be the obvious one.

I don't think I'd have any qualms about travelling on a driverless train.

  LastChip 10:28 06 Sep 2016

The point is, there is no grey area in programming; it's either true or false, on or off. All a program does is ask a question and answer it. Just how complex you make that series of questions determines how "clever" the program appears to be. But in essence, the program is not clever (or intelligent) at all.
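That idea, that apparent "cleverness" is only a cascade of true/false questions with no judgement anywhere, can be shown with a deliberately tiny example. The question names below are invented for illustration:

```python
# Illustrative only: a "clever"-looking hazard response is just a chain
# of yes/no questions answered in a fixed order, with no judgement made.

def respond(obstacle_ahead: bool, can_stop_in_time: bool,
            adjacent_lane_clear: bool) -> str:
    if not obstacle_ahead:
        return "continue"
    if can_stop_in_time:
        return "brake_to_stop"
    if adjacent_lane_clear:
        return "change_lane"
    return "emergency_brake"    # no moral weighing: just the last branch left

print(respond(True, False, True))   # prints "change_lane"
```

Add a thousand more such questions and the behaviour looks sophisticated, but each individual step is still a bare true-or-false test.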

oresome has a very serious and valid point. Drivers can (and will) be lulled into a false sense of security - which I believe is part of the issue with the fatal accident in the USA. It is true to say "auto land" has been available for many years and of course, fully automatic driver-less trains. But these are in ultra controlled environments. Autonomous cars will be operating amongst millions of human brains, who sometimes are far from logical. And there lies the problem.

Perhaps they should be painted in bright colours with a huge warning: no human brains on board! But then equally, you could apply that to some of the drivers today.

