Here's an interesting read related to any automated vehicle. Worth a look over. It's from the Australian Transport Safety Bureau: the incident of Qantas Flight 72.
https://www.atsb.gov.au/media/3532398/ao2008070.pdf
Start with the "Executive Summary". You're welcome to read the rest of the 300+ pages, though.
Protip: when using long-range lights on the road, you actually see where you are driving. To think that a camera could auto-adjust that away in a recording seems bizarre, IMHO.
You haven't worked with many cameras, have you?
Dynamic range is one of the greatest challenges and things don't get much more dynamic than night driving situations.
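To put a rough number on that dynamic-range point, here's a back-of-the-envelope sketch. All the luminance figures are assumptions for illustration, not measurements, but they show why a night scene with headlights in it can span far more brightness range than a cheap dashcam sensor can capture:

```python
import math

# Assumed order-of-magnitude luminances (cd/m^2) -- illustrative only.
headlight_luminance = 10_000.0   # directly lit area / oncoming headlights
shadow_luminance = 0.1           # unlit roadside in darkness

# Dynamic range of the scene, in photographic stops (factors of 2).
scene_stops = math.log2(headlight_luminance / shadow_luminance)

# Assumed usable range of a budget dashcam sensor, also in stops.
sensor_stops = 10

print(f"Scene spans ~{scene_stops:.1f} stops; sensor captures ~{sensor_stops}")
print("Something must clip or be crushed:", scene_stops > sensor_stops)
```

With these (assumed) numbers the scene spans roughly 16–17 stops, so the camera has to either blow out the highlights or crush the shadows into black, which is exactly what bad night footage looks like.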
Speeding, however, is a criminal offense in most places, and it only makes any incident worse.
Here's an interesting read related to any automated vehicle. Worth a look over. It's from the Australian Transport Safety Bureau: the incident of Qantas Flight 72.
https://www.atsb.gov.au/media/3532398/ao2008070.pdf
Start with the "Executive Summary". You're welcome to read the rest of the 300+ pages, though.
Sorry, not going to read it - because it's totally irrelevant. You can throw words around as much as you like, but when it comes to average people driving cars, they are NOT going to be anywhere NEAR as involved as you are suggesting they should be. To do what you are saying would require a whole new level of training and accountability ... and autonomous vehicles are moving us further away from such imposts.
It's just NOT GOING TO HAPPEN.
Also, I strongly feel that your analogy with the airline industry is an insult to everybody involved. For one thing, one of THE main dangers of aviation comes from the fact that when something goes wrong, a pilot cannot simply "pull over" to the side of the road, like you can in a car. Running out of fuel is a life and death matter in the air. On the ground, it's a nuisance. I could go on and on....
1. Get enough computing power and enough software to react to potentially dangerous situations for the car, its passengers, and anybody else on the road. Make that reaction time equal to or better than human reaction time. The video shown in this thread shows the car doesn't have enough AI to "look ahead to potential accident situations, and act."
Here the autopilot detected that shit was about to happen well before a human would have expected it.
The Tesla was keeping a very sensible distance from the car ahead, so a human driver would have been able to easily stop in time. This is a nice demonstration of why tailgating is dumb, not a demonstration of where machines have an edge.
The fact is, it started braking before a human would have, so there was still quite a distance left when the crash happened.
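Braking earlier matters more than it sounds. Here's a quick sketch of the standard stopping-distance formula (reaction distance plus braking distance); the reaction times and deceleration are assumed illustrative values, not measurements from the video:

```python
def stopping_distance(speed_ms, reaction_s, decel_ms2):
    """Total stopping distance: distance covered during the reaction time
    plus braking distance at constant deceleration (v^2 / 2a)."""
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

v = 100 / 3.6   # 100 km/h converted to m/s
a = 7.0         # assumed dry-road deceleration, m/s^2

human = stopping_distance(v, 1.5, a)    # ~1.5 s is a commonly cited human reaction time
system = stopping_distance(v, 0.3, a)   # assumed quicker automated response

print(f"Human: {human:.0f} m, system: {system:.0f} m, saved: {human - system:.0f} m")
```

The braking distance itself is the same either way; the whole difference comes from the reaction time, so at 100 km/h a 1.2 s head start saves over 30 m, which is several car lengths.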
This is a nice demonstration of why tailgating is dumb, not a demonstration of where machines have an edge.
Because, unlike people, they don't act dumb.
The Tesla was keeping a very sensible distance from the car ahead, so a human driver would have been able to easily stop in time. This is a nice demonstration of why tailgating is dumb, not a demonstration of where machines have an edge.The fact is, it started breaking before human would. So there was still quiet a distance once crash happened.
The fact is, the Tesla car beeped long before the accident even happened.
PS: The English word is "brake".
A human driver could have recognised the same thing.
Protip: When using long range lights on the road, you actually see where you are driving. To think that a camera could even auto adjust that away in a recording seems bizarre imho.
You haven't worked with many cameras, have you?
Dynamic range is one of the greatest challenges and things don't get much more dynamic than night driving situations.
Most dash cameras do a pretty good job at night. This Uber one seems to have done a rather poor job.
Uber must have used a potato instead of a dashcam.
Below is footage from a decent dashcam at the same spot at night.
https://youtu.be/CRW0q8i3u6E?t=30