Self-driving cars more dangerous
Comments
Clifford_Pope wrote: »When autonomous cars encounter daredevil cyclists and pedestrians playing chicken just for the fun of seeing the car react, they will have to start doing exactly that.
If someone makes as if to deliberately jump in front of an autonomous car, it will pretty quickly have to decide whether to slam the brakes on or deliberately run him down.
Once people realise that autonomous cars can always be fooled into giving way, traffic will come to a standstill. Alternatively, autonomous cars will have to start making up rules and killing people.
We already have the same risks with driverless trains though.
Clifford_Pope wrote: »That's bound to happen with any AI self-learning device surely?
Unless they have a very strong moral sense embedded in them, at some point they will realise that killing people is of benefit to their own species.
If an autonomous car is so clever that it can make decisions about running down one old person versus a queue of school children, then before long it will get an appetite for running into people.
Or instead of a moral sense, perhaps it just needs to have an anal paranoia about not scratching its own paintwork?
As for running down one person - there'll be a VIP version of the software that always spares the VIP at everyone else's expense.
If the car has a choice between John Bercow dying or a bus queue of kids dying, the kids aren't going to make it.
Yes, because before long they started killing people!
The canon of the film says that the kill switch was to stop them learning empathy (and thus being able to fool the Voight-Kampff test), not because they wanted to kill people. Some of the models (e.g. Rutger Hauer's character) were combat models already.

Sam Vimes' Boots Theory of Socioeconomic Unfairness:
People are rich because they spend less money. A poor man buys $10 boots that last a season or two before he's walking in wet shoes and has to buy another pair. A rich man buys $50 boots that are made better and give him 10 years of dry feet. The poor man has spent $100 over those 10 years and still has wet feet.
The canon of the film says that the kill switch was to stop them learning empathy (and thus being able to fool the Voight-Kampff test), not because they wanted to kill people. Some of the models (e.g. Rutger Hauer's character) were combat models already.
The film was set in Los Angeles, November 2019. So we only have a few months before the Nexus 6 Replicants arrive.
Which is probably a good few years before we get self-driving cars.
How do you code moral sense? (One naive attempt is sketched after this post.)
As for running down one person - there'll be a VIP version of the software that always spares the VIP at everyone else's expense.
If the car has a choice between John Bercow dying or a bus queue of kids dying, the kids aren't going to make it.
The classic John Bercow problem would need regulation to prevent it.
Overall accident rates should be easier to control, because speeds in populated areas could be cut with no loss of journey time:
https://www.youtube.com/watch?v=iHzzSao6ypE
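No vendor ships a "moral sense" module, but the cost-function framing the trolley-problem debate assumes can at least be made concrete. Below is a deliberately naive Python sketch; every name, weight, and figure in it is invented for illustration and bears no relation to any real driving stack:

```python
# Deliberately naive "moral cost" sketch of the trolley-problem framing.
# All weights and field names are invented; no real autonomous-driving
# system is known to work this way.

def collision_cost(outcome):
    """Score a candidate manoeuvre's predicted outcome; lower is better."""
    CASUALTY_WEIGHT = 1_000_000   # hypothetical: any casualty dominates
    PAINTWORK_WEIGHT = 1          # hypothetical: property damage in GBP
    return (CASUALTY_WEIGHT * outcome["predicted_casualties"]
            + PAINTWORK_WEIGHT * outcome["predicted_damage_gbp"])

def choose_manoeuvre(options):
    """Pick the candidate manoeuvre with the lowest predicted cost."""
    return min(options, key=collision_cost)

# Toy example: braking hard (own damage only) beats swerving into a queue.
options = [
    {"name": "brake_hard", "predicted_casualties": 0, "predicted_damage_gbp": 2000},
    {"name": "swerve_left", "predicted_casualties": 3, "predicted_damage_gbp": 0},
]
print(choose_manoeuvre(options)["name"])   # -> brake_hard
```

The uncomfortable part is exactly the one raised upthread: whoever chooses the weights is making a policy decision, not an engineering one, and a "VIP version" is just a different weight table.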
How about just having a rule where if someone drives like an a-hole then they lose their license, and actually have some police officers to enforce it?
https://arstechnica.com/cars/2019/02/in-2017-the-feds-said-tesla-autopilot-cut-crashes-40-that-was-bogus/
"the activation of Autosteer actually increased crash rates by 59 percent."
Have to wait a bit longer for this one, I think.
The person who did that study removed 99% (literally) of the input data in order to get to that result. I suspect they have a hidden agenda.
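For anyone wanting to sanity-check headlines like these, the underlying arithmetic is just crashes per million vehicle miles, and the answer swings depending on which vehicles' records are included. A toy illustration (every figure below is invented, not taken from the NHTSA or QCS data):

```python
# Toy illustration with invented numbers: the same per-mile arithmetic
# can show a cut or a rise depending on which records are kept.
# Nothing here comes from the actual NHTSA or QCS datasets.

def rate(crashes, miles):
    """Crashes per million vehicle miles."""
    return crashes / miles * 1_000_000

# Whole (invented) fleet: the rate appears to fall after Autosteer...
print(rate(150, 100_000_000))   # 1.5 before
print(rate(90, 100_000_000))    # 0.9 after: looks like a ~40% cut

# ...but on the (invented) small subset with complete mileage records,
# the rate rises instead.
print(rate(10, 8_000_000))      # 1.25 before
print(rate(13, 6_500_000))      # 2.0 after: a 60% increase
```

Which set of numbers to trust then turns entirely on whether the excluded records were missing at random, and that is the heart of the dispute between the two analyses.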
Clifford_Pope wrote: »When autonomous cars encounter daredevil cyclists and pedestrians playing chicken just for the fun of seeing the car react, they will have to start doing exactly that.
It's no different from human drivers encountering daredevil cyclists and pedestrians playing chicken.
If someone makes as if to deliberately jump in front of an autonomous car, it will pretty quickly have to decide whether to slam the brakes on or deliberately run him down.
That's only going to cause a problem for a human driver, since only a human could gauge intent. Autonomous software is written to react to events, not to try to determine what might be in the mind of a pedestrian (one sketch of that event-driven style follows after this post).
Once people realise that autonomous cars can always be fooled into giving way, traffic will come to a standstill.
Why don't pedestrians do that today, e.g. jump in front of cars to stop traffic? Why doesn't that happen now with human drivers? After all, if someone jumps in front of my car, I'd try to stop, and no doubt you would do the same.
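To make the "react to events, not to intent" point concrete, here is a minimal sketch of the kind of rule an automatic emergency braking system applies: it never asks why an obstacle appeared, only whether the time-to-collision has dropped below a threshold. All names and thresholds are invented for illustration:

```python
# Minimal sketch of event-driven emergency braking: the controller never
# models intent, only measured distance and closing speed. The threshold
# is invented for illustration, not taken from any real system.

BRAKE_TTC_SECONDS = 1.5   # hypothetical time-to-collision trigger

def time_to_collision(distance_m, closing_speed_ms):
    """Seconds until impact if nothing changes; infinite if not closing."""
    if closing_speed_ms <= 0:
        return float("inf")
    return distance_m / closing_speed_ms

def control_step(distance_m, closing_speed_ms):
    """One sensor tick: full braking if impact is imminent, else carry on."""
    if time_to_collision(distance_m, closing_speed_ms) < BRAKE_TTC_SECONDS:
        return "FULL_BRAKE"
    return "CRUISE"

# A pedestrian stepping out 10 m ahead of a car closing at 13 m/s
# (about 30 mph) triggers braking whether or not they "meant it".
print(control_step(10.0, 13.0))   # -> FULL_BRAKE
```

On that logic the prankster and the genuine pedestrian get exactly the same response, which is also what a human driver would give.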
This discussion has been closed.