
Self driving liability


nctnico:

--- Quote from: CatalinaWOW on April 22, 2022, 03:34:39 pm ---This thread illustrates the dominance of emotion over logic in discussions of this topic.  Specifically the arguments about liability for speeding tickets when in self driving mode. 

In my mind there is only a very minor argument here.  In general, in self driving mode the car should never speed.  There are only two reasons I can think of for this not being the case.  First, if the vendor of the software set it that way.  In which case liability is obvious and indisputable.  The second case is if the speed limit has changed and the database available to the self driving software is not updated.  Liability here could be disputable, but this is relatively uncommon (and also likely to trip up a human driver).

--- End quote ---
You forget the third case: speed limits can change quickly and regularly (think of road works, which can also move). On many roads there are also dynamic speed limits that depend on congestion levels. All in all it is very easy to miss a sign telling you the speed limit has changed, and a database doesn't help. Google Maps, for example, very often shows the wrong speed limit on highways that have fixed signs.

Road works are particularly nasty because the signs aren't always clear. In France, for example, there isn't always a sign indicating the end of the road works; you just have to deduce it from where the section marked with orange cones ends. Another example is parallel roads with different speed limits. Even a human can easily get confused.

The bottom line is: a self-driving car will get speeding tickets even though it has no intention of exceeding the speed limit.

BTW: I'm not against self-driving cars at all. On the contrary: I think they will give me freedom when I'm too old to drive myself, so I don't have to depend on family members or a crappy bus service. And hopefully, before that, an opportunity to do something useful or just enjoy the landscape during a journey instead of operating a machine that, from a basic operating point of view, hasn't really evolved in over 100 years. I do find the legal implications interesting, though.

Someone:

--- Quote from: jpanhalt on April 22, 2022, 12:30:23 pm ---
--- Quote from: NiHaoMike on April 22, 2022, 04:25:32 am ---Isn't there a similar problem for airline pilots on long flights when the autopilot is flying?

--- End quote ---

I am only familiar with US FAA rules.  We have 3 categories: Scheduled airlines, Private and charter (i.e., "General Aviation"), and Military.

Large scheduled airliners are highly integrated.  I learned recently, for example, that on landing the pilot cannot control some functions, like braking and reverse thrust, if the radar altimeter and/or some other electronic systems aren't working.  There was a recent example on a transcontinental flight to Paris: the plane was in good shape, but something was wrong with the automatic landing system(s), and the pilots had to declare an emergency.  Add, of course, the 737 MAX disasters of a couple of years ago; Boeing has that liability.  My take was that I don't want to fly in any airplane where having the pilots in control is an emergency. ;)

In smaller GA aircraft, the pilot is responsible.  Sure, the manufacturers get sued too, but the FAA puts the blame on the pilot(s) regardless of the autopilot.  (Some large aircraft used for scheduled service are also flown under GA rules.)  It is not uncommon to see an accident report concluding that the pilot failed to maintain a safe altitude and/or airspeed.  As just one example, there was an incident in Southern California many years ago in which the ground radar controller vectored a small GA Beechcraft into a mountain.  The pilot was still blamed for not maintaining awareness of the terrain.
--- End quote ---
You have it back to front:
"International Fatality Rates, A Comparison of Australian Civil Aviation Fatality Rates with International Data"
https://www.atsb.gov.au/media/32897/b20060002.pdf
or TLDR/easier to digest:
https://skybrary.aero/articles/general-aviation-ga

Flying on non-scheduled air transport, particularly private flying (rather than charter), is far more dangerous: by one to two orders of magnitude. Fatalities per 100,000 flight hours is the easy metric to collect for that sector, but the gap is even wider if you use the more sensible measure of fatalities per billion passenger-km. I'm sticking with the pilots who are forced to be conservative by the commercial airline industry; it's a system that works and has an excellent track record to back it up.
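To see why the passenger-km figure widens the gap, here is a back-of-the-envelope sketch. All the numbers in it are made-up round figures for illustration only (assumed average speeds, occupancy and per-hour rates), not values from the ATSB report:

--- Code: ---
# Rough illustration of why "fatalities per billion passenger-km" widens the
# gap between scheduled airlines and private GA. Every input below is an
# ASSUMED round number for illustration, not data from the ATSB report.

def per_billion_pax_km(fatal_per_100k_hours, avg_speed_kmh, avg_pax):
    """Convert a fatality rate per 100,000 flight hours into a rate per
    billion passenger-kilometres, given an assumed speed and occupancy."""
    pax_km_per_100k_hours = 100_000 * avg_speed_kmh * avg_pax
    return fatal_per_100k_hours * 1e9 / pax_km_per_100k_hours

airline = per_billion_pax_km(fatal_per_100k_hours=0.05, avg_speed_kmh=800, avg_pax=150)
private = per_billion_pax_km(fatal_per_100k_hours=1.5,  avg_speed_kmh=200, avg_pax=2)

print(f"airline:    {airline:.4f} fatalities per billion pax-km")
print(f"private GA: {private:.1f} fatalities per billion pax-km")
print(f"ratio:      {private / airline:.0f}x")
--- End code ---

With these invented inputs, a 30x gap per flight hour grows to roughly 9000x per passenger-km, simply because an airliner turns each flight hour into vastly more passenger-kilometres than a two-seat private aircraft does.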

Someone:

--- Quote from: Stray Electron on April 22, 2022, 03:22:11 pm ---
--- Quote from: james_s on April 21, 2022, 08:52:42 pm ---Yes for me the whole point in owning a car, especially a relatively high performance car is that I get to drive it myself. I drove a Tesla Y for a bit and it was a blast. Fastest car I've ever driven.
--- End quote ---
... I completely agree that being able to DRIVE is the whole point of owning any car. I would absolutely HATE living in any big city where car ownership and/or driving was impossible.
--- End quote ---
Public roads are not provided for your recreation. You're still free to rent a race track or a private road (they exist in many countries) for all your driving pleasure.

CatalinaWOW:

--- Quote from: nctnico on April 22, 2022, 04:18:19 pm ---
--- Quote from: CatalinaWOW on April 22, 2022, 03:34:39 pm ---This thread illustrates the dominance of emotion over logic in discussions of this topic.  Specifically the arguments about liability for speeding tickets when in self driving mode. 

In my mind there is only a very minor argument here.  In general, in self driving mode the car should never speed.  There are only two reasons I can think of for this not being the case.  First, if the vendor of the software set it that way.  In which case liability is obvious and indisputable.  The second case is if the speed limit has changed and the database available to the self driving software is not updated.  Liability here could be disputable, but this is relatively uncommon (and also likely to trip up a human driver).

--- End quote ---
You forget the third case: speed limits can change quickly and regularly (think of road works, which can also move). On many roads there are also dynamic speed limits that depend on congestion levels. All in all it is very easy to miss a sign telling you the speed limit has changed, and a database doesn't help. Google Maps, for example, very often shows the wrong speed limit on highways that have fixed signs.

Road works are particularly nasty because the signs aren't always clear. In France, for example, there isn't always a sign indicating the end of the road works; you just have to deduce it from where the section marked with orange cones ends. Another example is parallel roads with different speed limits. Even a human can easily get confused.

The bottom line is: a self-driving car will get speeding tickets even though it has no intention of exceeding the speed limit.

BTW: I'm not against self-driving cars at all. On the contrary: I think they will give me freedom when I'm too old to drive myself, so I don't have to depend on family members or a crappy bus service. And hopefully, before that, an opportunity to do something useful or just enjoy the landscape during a journey instead of operating a machine that, from a basic operating point of view, hasn't really evolved in over 100 years. I do find the legal implications interesting, though.

--- End quote ---

Your case is exactly case two.  Liability in that case is debatable.  Even the facts are debatable: what if the variable-limit sign changes just after you pass it?

coppice:

--- Quote from: nctnico on April 22, 2022, 04:18:19 pm ---You forget the third case: speed limits can change quickly and regularly (think of road works, which can also move). On many roads there are also dynamic speed limits that depend on congestion levels. All in all it is very easy to miss a sign telling you the speed limit has changed, and a database doesn't help. Google Maps, for example, very often shows the wrong speed limit on highways that have fixed signs.

Road works are particularly nasty because the signs aren't always clear. In France, for example, there isn't always a sign indicating the end of the road works; you just have to deduce it from where the section marked with orange cones ends. Another example is parallel roads with different speed limits. Even a human can easily get confused.

The bottom line is: a self-driving car will get speeding tickets even though it has no intention of exceeding the speed limit.

--- End quote ---
Most new cars now read traffic signs. They aren't always that reliable, but if a car is going to drive itself, rather than just assist the driver, it had better be able to read those signs reliably. If it can't even read something as predictable as traffic signs, how is it going to detect and react properly to the less well-structured things happening around it?
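As a rough illustration of the arbitration a self-driving stack has to do here, consider the sketch below. It is purely hypothetical (the data structures, thresholds and field names are invented, not any vendor's actual implementation): when a camera-read sign and the map database disagree, the only defensible default is the fresher, more conservative value.

--- Code: ---
# Hypothetical sketch of arbitrating between a map-database speed limit and a
# camera-detected sign. Not based on any real vendor's stack; the thresholds
# and field names are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LimitObservation:
    limit_kmh: int
    age_s: float        # how long ago this value was observed/updated
    confidence: float   # 0..1, e.g. sign-classifier confidence

def effective_limit(map_db: LimitObservation,
                    camera: Optional[LimitObservation],
                    min_confidence: float = 0.9,
                    sign_valid_s: float = 600.0) -> int:
    """Pick the limit to obey. A fresh, confidently read sign overrides the
    database; if neither source clearly wins, take the lower value."""
    if camera and camera.confidence >= min_confidence and camera.age_s < sign_valid_s:
        # A recently seen, clearly read sign (e.g. temporary road works)
        # beats possibly stale map data.
        return camera.limit_kmh
    if camera:
        # Unsure about the sign: never exceed the stricter of the two sources.
        return min(map_db.limit_kmh, camera.limit_kmh)
    return map_db.limit_kmh

# Example: map says 130 km/h, but a 90 km/h road-works sign was read 30 s ago
# with high confidence -> obey 90.
print(effective_limit(LimitObservation(130, 3600.0, 1.0),
                      LimitObservation(90, 30.0, 0.95)))
--- End code ---

Even with a conservative default like that, nctnico's road-works example (no explicit end-of-works sign) still leaves the car guessing when the temporary limit stops applying, which is exactly where the liability question gets murky.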
