
Self-driving cars more dangerous


Comments

  • kabayiri Posts: 22,740 Forumite
    When autonomous cars encounter daredevil cyclists and pedestrians playing chicken just for the fun of seeing the car react, they will have to start doing exactly that.
    If someone makes as if to deliberately jump in front of an autonomous car, it will pretty quickly have to decide whether to slam the brakes on or deliberately run him down.


    Once people realise that autonomous cars can always be fooled into giving way, traffic will come to a standstill. Alternatively, autonomous cars will have to start making up rules and killing people.

    We already have the same risks with driverless trains though.
  • kabayiri wrote: »
    We already have the same risks with driverless trains though.

    Big difference between trains, which run on a track and are controlled centrally, and cars on a street, which nobody controls centrally.
  • That's bound to happen with any self-learning AI device, surely?


    Unless they have a very strong moral sense embedded in them, at some point they will realise that killing people is of benefit to their own species.


    If an autonomous car is so clever that it can make decisions about running down one old person versus a queue of school children, then before long it will get an appetite for running into people.
    Or instead of a moral sense, perhaps it just needs to have an anal paranoia about not scratching its own paintwork? :)
    How do you code moral sense?

    As for running down one person - there'll be a VIP version of the software that always spares the VIP at everyone else's expense.

    If the car has a choice between John Bercow dying or a bus queue of kids dying, the kids aren't going to make it.
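
For what it's worth, "coding moral sense" in practice means someone hard-codes the values. A toy Python sketch of what such a rule table might look like (entirely hypothetical, with made-up names and numbers, and no resemblance to any real vehicle software):

```python
# Entirely hypothetical: "moral sense" reduced to a hard-coded cost table.
# The only point is that a human has to pick these numbers.

HARM_COST = {
    "pedestrian": 100,
    "cyclist": 90,
    "occupant": 80,
    "property": 1,
}

def choose_action(options):
    """Return the action whose harms carry the lowest total cost.

    options maps an action name to the list of things it would harm,
    e.g. {"brake": ["occupant"], "swerve": ["cyclist", "property"]}.
    """
    return min(options, key=lambda a: sum(HARM_COST[h] for h in options[a]))

print(choose_action({
    "brake_hard": ["occupant"],   # total cost 80
    "swerve_left": ["cyclist"],   # total cost 90
}))  # prints "brake_hard"
```

A "VIP edition" would simply ship a different cost table, which is rather the poster's point.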
  • Nasqueron Posts: 10,796 Forumite
    Zxcv_Bnm wrote: »
    Yes, because before long they started killing people!


    The canon of the film says that the kill switch was to stop them learning empathy (and thus being able to fool the Voight-Kampff test), not because they wanted to kill people. Some of the models (e.g. Rutger Hauer's character) were combat models already.

    Sam Vimes' Boots Theory of Socioeconomic Unfairness: 

    People are rich because they spend less money. A poor man buys $10 boots that last a season or two before he's walking in wet shoes and has to buy another pair. A rich man buys $50 boots that are made better and give him 10 years of dry feet. The poor man has spent $100 over those 10 years and still has wet feet.

  • antrobus Posts: 17,386 Forumite
    edited 15 February 2019 at 5:21PM
    Nasqueron wrote: »
    The canon of the film says that the kill switch was to stop them learning empathy (and thus being able to fool the Voight-Kampff test), not because they wanted to kill people. Some of the models (e.g. Rutger Hauer's character) were combat models already.

    The film was set in Los Angeles, November 2019. So we only have a few months before the Nexus 6 Replicants arrive. :)

    Which is probably a good few years before we get self-driving cars.
  • PokerPlayer111 Posts: 343 Forumite
    edited 15 February 2019 at 5:29PM
    Zxcv_Bnm wrote: »
    How do you code moral sense?

    As for running down one person - there'll be a VIP version of the software that always spares the VIP at everyone else's expense.

    If the car has a choice between John Bercow dying or a bus queue of kids dying, the kids aren't going to make it.


    It would take regulation to stop that classic John Bercow problem.


    Overall accident rates should be more controllable, because speeds in populated places could be cut with no loss of time to the destination:
    https://www.youtube.com/watch?v=iHzzSao6ypE
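
A rough back-of-envelope check of that speed claim (my own illustrative figures, not taken from the linked video): over a short urban stretch, cutting 30 mph to 20 mph costs about half a minute, while kinetic energy, and with it braking distance, drops by more than half.

```python
# Back-of-envelope only; the 800 m stretch is an assumed figure.
MPH_TO_MS = 0.44704  # metres per second in one mph

def travel_seconds(distance_m, speed_mph):
    return distance_m / (speed_mph * MPH_TO_MS)

stretch = 800  # metres of urban road

extra = travel_seconds(stretch, 20) - travel_seconds(stretch, 30)
print(f"Extra journey time: {extra:.0f} s")           # ~30 s

# Kinetic energy (and roughly braking distance) scales with v^2.
print(f"Energy reduction: {1 - (20 / 30) ** 2:.0%}")  # 56%
```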
  • Arklight Posts: 3,182 Forumite
    How about just having a rule where if someone drives like an a-hole then they lose their license, and actually have some police officers to enforce it?
  • Arklight wrote: »
    How about just having a rule where if someone drives like an a-hole then they lose their license, and actually have some police officers to enforce it?


    I'd love that rule now, but even with it you still get human error, which self-driving could hopefully improve on.
  • AnotherJoe Posts: 19,622 Forumite
    Zxcv_Bnm wrote: »
    https://arstechnica.com/cars/2019/02/in-2017-the-feds-said-tesla-autopilot-cut-crashes-40-that-was-bogus/

    "the activation of Autosteer actually increased crash rates by 59 percent."

    Have to wait a bit longer for this one, I think.


    The person who did that study removed 99% (literally) of the input data in order to get to that result. I suspect they have a hidden agenda.
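
To see how discarding most of the data can flip a before/after comparison like this, here is a toy calculation with invented figures (not the actual NHTSA or Tesla numbers):

```python
# Invented figures, purely to show the mechanism of selection bias.

def crashes_per_million_miles(crashes, miles):
    return crashes / miles * 1_000_000

# Pretend full fleet, before vs after a feature is enabled.
full_before = crashes_per_million_miles(crashes=120, miles=100_000_000)
full_after = crashes_per_million_miles(crashes=80, miles=100_000_000)

# Pretend 1% subset with complete mileage records, which happens to skew.
sub_before = crashes_per_million_miles(crashes=1, miles=1_000_000)
sub_after = crashes_per_million_miles(crashes=2, miles=1_000_000)

print(f"Full fleet: {full_before:.2f} -> {full_after:.2f} (rate falls)")
print(f"1% subset:  {sub_before:.2f} -> {sub_after:.2f} (rate rises)")
```

Which subset you keep decides which headline you get.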
  • AnotherJoe Posts: 19,622 Forumite
    edited 16 February 2019 at 9:08AM
    When autonomous cars encounter daredevil cyclists and pedestrians playing chicken just for the fun of seeing the car react, they will have to start doing exactly that.
    It's no different from human drivers encountering daredevil cyclists and pedestrians playing chicken.

    If someone makes as if to deliberately jump in front of an autonomous car, it will pretty quickly have to decide whether to slam the brakes on or deliberately run him down.
    That's only going to cause a problem for a human driver, since only a human would try to gauge intent. Autonomous software is written to react to events, not to work out what might be in the mind of a pedestrian (see the sketch at the end of this post).

    Once people realise that autonomous cars can always be fooled into giving way, traffic will come to a standstill.

    Why don't pedestrians do that today, e.g. jump in front of cars to stop traffic? Why doesn't that happen now with human drivers? After all, if someone jumps in front of my car I'd try to stop, and no doubt you would do the same.
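
The sketch promised above: "react to events, don't guess intent" made concrete. Thresholds and names are hypothetical, not from any real driving stack; the point is that the controller only ever sees measured distance and closing speed, and brakes on time-to-collision.

```python
# Hypothetical reactive controller: no model of what the pedestrian
# intends, only measured distance and closing speed.

def time_to_collision(distance_m, closing_speed_ms):
    """Seconds until contact if nothing changes; infinite if diverging."""
    if closing_speed_ms <= 0:
        return float("inf")
    return distance_m / closing_speed_ms

def braking_command(distance_m, closing_speed_ms):
    ttc = time_to_collision(distance_m, closing_speed_ms)
    if ttc < 1.5:   # assumed emergency threshold
        return "full_brake"
    if ttc < 3.0:   # assumed caution threshold
        return "slow_down"
    return "continue"

print(braking_command(distance_m=12, closing_speed_ms=10))  # full_brake
print(braking_command(distance_m=25, closing_speed_ms=10))  # slow_down
```

A feinting pedestrian either trips the threshold (the car brakes) or doesn't; there is no "deciding to run him down".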
This discussion has been closed.