Full-self-driving needs external infrastructure
bdunham7:
--- Quote from: nctnico on March 06, 2023, 05:05:32 pm ---Such drivers don't exist...
--- End quote ---
I don't think that's true at all.
--- Quote ---It needs to be better than the average human driver (including drunks, retards and old people).
--- End quote ---
And this is the precise point of unresolvable disagreement, as I already pointed out. As long as videos and stories abound of self-driving cars making serious errors that I or any competent, awake driver wouldn't make--or even ones that humans imagine they wouldn't make--there isn't going to be acceptance of these systems, at least by me and most people my age. From conversations I've had, the younger crowd seems to be more willing to go along with this madness. That's fine for them, I suppose, as long as we pass laws making both the human and FSD fully and jointly liable for all damages.
As an additional point, current Tesla self-driving systems are a bit like aviation autopilot systems in one disturbing way, one that I think should be disqualifying--they only work under conditions they can handle, and when they are in over their head, they shut off and hand the controls back to the human. To me this is a pretty clear admission that the human is the superior driver. FSD claims credit for all of the safe miles driven, but it only works in situations where humans would be fairly safe as well, as long as they are sober and attentive. Under more challenging conditions, it quits. Humans have to deal with 100% of driving situations without cherry-picking. When someone develops a system that works the other way around and has a button I can press so the car takes over when I'm in over my head in low visibility or another emergency situation, then I'll be impressed. Volvo's 'City Safety' system is an example of a system that goes in that direction, as it intervenes when it detects a driver error, but it isn't FSD and certainly isn't marketed as such.
pcprogrammer:
I wonder if a computer system can be programmed or taught to anticipate. Humans can, and in lots of situations that is the reason accidents are avoided. We can judge how someone is driving and decide not to overtake because the driver is possibly drunk or aggressive and could do something stupid when being overtaken, or judge other situations and allow for safe responses.
Call it a sixth sense, possibly not capturable with whatever sensors are out there.
Driving is not a simple task of staying between the lines, keeping to the speed limit, and turning the steering wheel when needed. It is looking at the bigger picture of what is happening around the vehicle.
By the way, go back some 25 years and the self-driving systems being worked on had inter-vehicle communication to improve safety, so the self-driving cars were able to adapt to each other. The idea of external input is therefore not so strange, but it was probably abandoned because every vehicle on the road would need to be adapted to make it work. Just like ataradov wrote in the first response, the realization of such a system is far too expensive.
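To make the idea concrete, here is a minimal sketch in Python of what such a cooperative inter-vehicle message and a trivial following-gap check might look like. The field names, units and thresholds are assumptions chosen for illustration only, not taken from any real V2V standard.

--- Code: ---
# Illustrative sketch only: a toy vehicle-to-vehicle (V2V) beacon and a naive
# check of the time gap to the car ahead. Field names, units and the 2-second
# threshold are assumptions for illustration, not any real V2V message set.
from dataclasses import dataclass

@dataclass
class V2VBeacon:
    vehicle_id: str
    position_m: float   # distance along the lane in metres (1-D for simplicity)
    speed_mps: float    # speed in metres per second
    braking: bool       # whether the sender reports that it is braking

def safe_to_follow(me: V2VBeacon, ahead: V2VBeacon, min_gap_s: float = 2.0) -> bool:
    """Return False if the time gap to the broadcast position of the car ahead
    falls below min_gap_s, or if the car ahead reports braking while close."""
    gap_m = ahead.position_m - me.position_m
    if gap_m <= 0:
        return True  # the other car is behind us; nothing to do here
    time_gap_s = gap_m / max(me.speed_mps, 0.1)  # avoid division by zero
    if ahead.braking and time_gap_s < 2 * min_gap_s:
        return False
    return time_gap_s >= min_gap_s

# Example: a car 20 m ahead broadcasts that it is braking.
me = V2VBeacon("car_A", position_m=0.0, speed_mps=27.0, braking=False)
ahead = V2VBeacon("car_B", position_m=20.0, speed_mps=15.0, braking=True)
print(safe_to_follow(me, ahead))  # -> False: back off before any on-board sensor "sees" it
--- End code ---

In a real system messages like this would be broadcast several times per second and fused with on-board sensing; the point is only that the car behind can react to braking it cannot yet see, which is exactly the kind of external input being discussed.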
tom66:
--- Quote from: bdunham7 on March 06, 2023, 04:45:24 pm ---The problem is that your statistics on human drivers include drunk drivers, distracted drivers, drivers who flagrantly violate traffic laws, plainly incompetent drivers that haven't lost their licences as of yet and so forth. I don't think that an automated system that does just slightly better than that is acceptable at all. In fact I see absolutely no justification for self-driving unless it can do significantly better than an experienced, competent, wide-awake, undistracted and cautious human driver. IMO, that would only be practical at current technology levels if you at least used some additional sensing (LIDAR, etc.), massive self-updating databases and perhaps external guide systems or artifacts. The sad part about Tesla FSD is that I think it probably is possible, but they aren't doing it.
--- End quote ---
The problem is, a significant number of people on the roads *are* bad drivers. It's one thing to say we should only have good drivers on the road, but the fact is that we don't: in most countries road crashes kill thousands of people per year, and this is considered a *good* figure compared to what it used to be. And it's common for those drivers to be so-called professional drivers: truck, taxi and van drivers and the like. Something about driving every day makes people less careful and attentive.
Compared to railways, it is crazy how much we accept the death toll on the roads. In the UK the last major rail accident killed 31 people; that was 24 years ago, and it was ultimately attributed to driver error, passing a signal at danger. Since then, we've only had single-digit fatalities in the odd accident here and there. The railway network is about 10x safer per passenger-km than the roads.
So if the bad and indifferent car drivers could be automated out - those are the people who likely don't care about what or how they drive, they just use a vehicle to get from A to B - then road safety could surely be improved.
To be clear, if all cars tomorrow were SDCs and we still had 40k deaths (in the US as an example) I'd be unhappy, because that's an unacceptably large figure. But if it turned out SDCs could get that down to half, I think I'd be pretty happy with that outcome, knowing that there should be no fundamental reason incremental improvement cannot eventually get to nearly zero fatalities.
--- Quote from: james_s on March 06, 2023, 05:50:07 pm ---
--- Quote from: nctnico on March 06, 2023, 05:05:32 pm ---Such drivers don't exist...
The truth between your and Tom66's postings is somewhere in the middle: on one hand, self-driving is very hard to do due to the interaction between drivers. Driving is way more than simple collision avoidance. OTOH a self-driving system doesn't need to be perfect. It needs to be better than the average human driver (including drunks, retards and old people).
--- End quote ---
I remember being taught way back in drivers' ed that driving is primarily a social interaction rather than a mechanical task, and social interactions are something computers are not good at. Any automated driving system has to be substantially better than average drivers for it to be acceptable. If it turns out that manufacturers are liable, then the obvious outcome is that they will dedicate huge resources to legal teams to crush anyone involved in an accident with one of these self-driving cars who tries to claim the manufacturer is at fault. Even in cases where it is blatantly the fault of the self-driving car, the companies will be able to bleed people dry in the courts and coerce them into settling. Maybe it's different in other parts of the world, but here it is all but impossible for a regular individual to take on a large corporate entity. An entity with enormous resources can game the court system until the little guy runs out of money and can't afford to fight further, or in many cases the little guy knows this will happen and does whatever they can to cut their losses. It will be disastrous.
--- End quote ---
If you have an accident in the UK it is very common for the insurance company to try to identify immediately whether you are at fault or not, and to settle the case. For instance, if you as a driver run into the rear of someone, barring a few particular mitigating circumstances, you are assumed to be liable. If you merge onto a highway and collide with a vehicle on that highway, you are at fault. If you kill someone, you are extremely likely to face court unless the facts of the matter are so clear that the police cannot see a reasonable chance that you would be convicted.
With the amount of datalogging these SDCs do, I am sure that the first fatality from a mainstream operational vehicle will be examined in great detail. And the insurance company will probably pay out the costs to the victims and settle the case quickly. Most insurance companies try to avoid going through the courts as it's very expensive, so there are industry-agreed rates on costs for injuries, fatalities, etc. No doubt there will be the odd court case and class-action lawsuit, but I don't see how this would be any different to how current motoring offences are handled.
AndyC_772:
I wouldn't trust insurance companies to apportion blame in a way which is scientifically useful.
Some years ago I came off a motorcycle on a roundabout. It was covered in diesel, which - for the benefit of those unfamiliar with motorcycling - is extremely slippery and known to be one of the worst hazards out there for riders of two-wheelers. Here in the UK, depositing anything on the highway which causes someone to have an accident is a criminal offence.
I called the insurance company to report the incident.
"Whose fault was it?" they asked.
"Whichever idiot over filled their fuel tank and didn't put the cap back on properly", I replied.
"I need a name"
"What difference does it make whether or not I can name the specific, individual person that dumped oil all over the road?"
"If you can't provide a name, then we'll blame you".
This, my friends, is how to lose a long-term customer in the space of about two minutes.
TimFox:
Recent experience on railroads shows that external infrastructure for traffic control requires vigilant maintenance.