Self-driving liability
nctnico:

--- Quote from: eugene on April 23, 2022, 09:42:12 pm ---
--- Quote from: TerraHertz on April 23, 2022, 02:57:01 pm ---Did I say human drivers don't do stupid things? Nope. Just today I was a few meters away (on foot) from a near accident where a human driver did something very stupid, and the two cars involved screeched to a stop about a foot apart.

But it's a logical fallacy to argue that because human drivers screw up, we should allow 'AI' self-driving cars.
Even if somehow self-driving cars managed to have fewer accidents per mile driven, fewer injuries, etc., it would still be a very bad idea.

--- End quote ---

Don't mistake me for an AI fanboi, but I don't understand how anything that makes traveling safer can be a "very bad idea."

In any case, I'm certain that I see the entire topic differently than most of the contributors to this thread. (I find it comical that a bunch of off-duty engineers think they understand the issues better than the on-duty engineers working in the auto industry.) Only nutjobs like Elon Musk think it's a good idea to make a car that is completely driverless and sell it today. On the other hand, ALL of the major auto manufacturers are investing big money and effort on the topic in a more general way. None of GM, VAG or Toyota are going to be offering cars that drive around a tarmac and crash into anything in the way. The reason is obvious and the major topic of this thread: liability.

But they are doing all of the research that their enormous resources allow. And in the course of that research they will develop all sorts of ways to build cars that assist the driver, possibly in surprising ways. The consequence can only be that the cars will be MORE safe. No major auto manufacturer (Tesla notwithstanding) is going to introduce features that make traveling less safe. It's just not part of their business plan.

Anyway, all of this assisted driving will eventually result in auto-driving. In the meantime (i.e. today) we can have fun talking about the what ifs, but it's all essentially hypothetical because unless Google starts making cars, Tesla will be the only example we have for discussion, and they aren't even close to typical.

--- End quote ---
The latter is not quite right. Tesla drums up a lot of hype about their brand, but the reality is that their assisted driving system (currently SAE level 2) is severely lagging behind the competition. For example, Mercedes offers 'Drive Pilot', an SAE level 3 automated driving system, on their S-Class models for use in Germany. BMW is close to introducing an SAE level 3 automated driving system but seems to struggle with the regulatory framework in the EU: https://electricvehicleweb.com/bmw-ix-level-3-autonomous-driving/

Getting the legal framework in order soon is important because there will be more and more cars on the road with automated driving systems.
eugene:
Thank you. I stand corrected.

I still contend that, at least in the next decade, assisted driving will play a much larger part than fully autonomous driving. In some cases, like an imminent collision, the car might take over completely from the driver, but that's still a long way from the sci-fi notion of putting your kids in the car and telling it to take them to school, then come right back.

I guess that I am, ironically, responding emotionally to others' emotions.
Someone:

--- Quote from: jpanhalt on April 23, 2022, 12:07:19 pm ---Rates per passenger mile are deceptive, as scheduled carriers log far more passenger miles than GA for a single airplane. That is, most GA flights are over shorter distances than scheduled carrier flights and have far fewer passengers per flight.
--- End quote ---
Yes, the typically shorter distances flown by GA do somewhat bias their safety figures in the negative, as do hours of aircraft operation. But that alone still doesn't explain the enormous differences in fatality rates between the different modes (having already removed the very, very risky agricultural operations from the figures). I suggest most of it comes from the additional "restrictive" rules that the large airlines enforce, and it will be a similar situation when self-driving cars arrive.
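
To put a rough number on that bias, here's a toy calculation (all figures invented for illustration, not real accident statistics): two fleets with the same fatality count per thousand flights end up orders of magnitude apart once you divide by passenger-miles.

--- Code: ---
# Toy illustration (made-up numbers, not real FAA/NTSB data) of how the
# per-passenger-mile denominator flatters scheduled carriers relative to GA.

def rate_per_100m_pax_miles(fatalities, flights, pax_per_flight, miles_per_flight):
    """Fatalities per 100 million passenger-miles."""
    pax_miles = flights * pax_per_flight * miles_per_flight
    return fatalities / pax_miles * 100_000_000

# Same fatality count per 1000 flights, wildly different denominators:
ga = rate_per_100m_pax_miles(fatalities=1, flights=1_000, pax_per_flight=2, miles_per_flight=200)
airline = rate_per_100m_pax_miles(fatalities=1, flights=1_000, pax_per_flight=150, miles_per_flight=1_000)

print(f"GA:      {ga:.2f} per 100M passenger-miles")      # 250.00
print(f"Airline: {airline:.2f} per 100M passenger-miles") # 0.67
--- End code ---

The denominator alone opens up a ~375x gap here, before any real difference in how the flights are operated, which is exactly why the metric needs care.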

Big businesses with large exposure will be much more risk-averse than an "insured" individual who doesn't see significant loss from their (in)action. So expect self-driving cars that accept the legal liability/responsibility to be much more cautious than human drivers, who are exposed to little liability. Offloading the externalities of cars onto government-backed/mandatory insurance has been a huge win for private motoring (similar to the nuclear industry).
sleemanj:

--- Quote from: Stray Electron on April 22, 2022, 03:14:15 pm ---
   This times 100,000!  IMO if the "rider" wants to go somewhere and doesn't want to "drive" then they should take a cab or a bus!


--- End quote ---

Which may also be driverless...



TerraHertz:
On the topic of astonishing 'AI' advances:

Adding this too: https://this-person-does-not-exist.com/en