I just noticed an interesting article on ABC: http://www.abc.net.au/news/2015-11-04/researchers-probe-mora…
The question: should your self-driving car kill you if that avoids killing multiple other people?
I personally don't think it's as simple as adding up the number of people. My moral compass would include a question like "who is at fault for the accident?".
An example: someone is speeding and runs a red light with a passenger in their car. I am alone in my car, obeying all the road rules. I would think my car should prioritise my own safety over that of someone breaking the law.
Another example: a couple of drunk people stumble out onto the road as I am about to drive past. The car can send me to certain death by swerving into a tree or it can hit the two drunk people. I'd still like my own life prioritised.
What car would you buy? One that protects you at all costs, or one that sacrifices you to save a greater number of other people?
"A self driving car" as you put it would sense the pedestrians and red light runners well before they become a hazard and adjust your speed and braking accordingly..I'm not sure about "swerving" to avoid objects as that would then put the vehicle out of control.