This is what a human sees at the same spot, at the same time of night:
Failure of the LIDAR
Says who?
More likely a failure to figure out that somebody would be stupid enough to cross the road like that.
One more thing: you really haven't made any case for why we should take a human being completely out of the loop (my main point!).
I have seen that some cars (like Audi) already have thermal imaging (TI) cameras on board.
Anybody know if this technology is already implemented in these autonomous test cars?
Volvo already has automatic braking for pedestrians. Is this switched off during these tests?
... industry doesn't abandon technology. It simply works out what went wrong and then takes steps to fix it.
Don't even get me started on how stupidly foolish it is to solely rely on GPS as a means of navigation.
But to say this proves self-driving cars are bad and should never be allowed is really disingenuous. What it proves is that Uber's technology is bad and shouldn't be driving on public roads. I hope Uber are held accountable if they are found to have made an error that would have been avoided had they followed good engineering practices. I can't imagine how the software could fail in such a predictable scenario; surely they would have tested it against thousands of test cases covering exactly this kind of situation before allowing it to drive on public streets.
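Testing "thousands of test cases of exactly this kind" usually means sweeping a scenario over its parameters. A minimal sketch of such a test matrix is below; every parameter name and value is illustrative, not anything from Uber's actual test plan:

```python
import itertools

# Hypothetical scenario sweep for a pedestrian-crossing test.
# All parameters and values here are invented for illustration.
speeds_mph = [20, 30, 40, 45]                 # vehicle speed
lighting = ["day", "dusk", "night"]           # ambient light
crossing_points = ["crosswalk", "mid-block"]  # where the pedestrian enters
pedestrian = ["walking", "walking_with_bike", "running"]

scenarios = [
    {"speed_mph": s, "lighting": l, "entry": c, "pedestrian": p}
    for s, l, c, p in itertools.product(
        speeds_mph, lighting, crossing_points, pedestrian)
]

print(len(scenarios))  # 4 * 3 * 2 * 3 = 72 combinations from just four parameters
```

Even four coarse parameters already produce 72 combinations, which is why this kind of coverage is only practical in simulation.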
> Don't even get me started on how stupidly foolish it is to solely rely on GPS as a means of navigation.

Do you have any source that can corroborate that statement? Self-driving cars do not rely solely on GPS. Well, at least the Google cars don't (it's mentioned briefly in the video in my previous post).
Simulated miles at this stage are basically worthless. Simulation is a fantastic way to get to a prototype, but real-world testing is about finding what stimuli your simulations missed. To continue building miles on the simulators is more of a publicity stunt than engineering.
What is needed now is genuinely independent testing of these systems, complementing the in-house testing. Any group that builds something is too highly motivated to gloss over its problems to be trusted on its own.
>> Don't even get me started on how stupidly foolish it is to solely rely on GPS as a means of navigation.
>
> Do you have any source that can corroborate that statement? Self-driving cars do not rely solely on GPS. Well, at least the Google cars don't (it's mentioned briefly in the video in my previous post).

From what I have seen, Google have relied solely on GPS and maps for what most of us would consider navigation, i.e. "what route do I take?". They then use radar and lidar for short-range fine-tuning and for obvious things like collision avoidance. I assume by this time they have started working temporary road-sign recognition into the mix, to deal with road works. The signing at road works tends to be pretty weak and haphazard, though.
>>> Don't even get me started on how stupidly foolish it is to solely rely on GPS as a means of navigation.
>>
>> Do you have any source that can corroborate that statement? Self-driving cars do not rely solely on GPS. Well, at least the Google cars don't (it's mentioned briefly in the video in my previous post).
>
> From what I have seen, Google have relied solely on GPS and maps for what most of us would consider navigation, i.e. "what route do I take?". They then use radar and lidar for short-range fine-tuning and for obvious things like collision avoidance.

No, they mention specifically in the video I linked that they do not rely solely on GPS. That wouldn't be possible, since GPS isn't accurate enough, and the cars have to be able to drive even when there is poor GPS reception (which happens often in cities, e.g. in "urban canyons" and tunnels) or when the sensor fails. Of course it would be stupid not to use GPS as one additional sensor, but the GPS data is used together with all the other sensor data to determine where the car is on the internal maps, and the internal maps are then used for navigation and route planning. (GPS can only give you a location; it can't help you find a route. For that you need a map, and using a map to plan your route is exactly what humans do as well, so I don't see how that could be considered stupid.)
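The "GPS is one sensor among many" point can be illustrated with a toy example: two noisy position estimates combined by inverse-variance weighting (a 1-D Kalman-style update). The numbers and the update rule below are illustrative assumptions, not how any real car's localiser works:

```python
# Toy sensor fusion: combine a coarse GPS fix with a precise
# map-matched lidar estimate of position along the road.
# All numbers are invented for illustration.

def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted combination of two noisy estimates."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# GPS: good to a few metres.
gps_pos, gps_var = 105.0, 9.0        # variance ~ (3 m)^2
# Lidar matched against a prebuilt map: good to ~10 cm.
lidar_pos, lidar_var = 100.2, 0.01   # variance ~ (0.1 m)^2

pos, var = fuse(gps_pos, gps_var, lidar_pos, lidar_var)
print(round(pos, 3), round(var, 4))  # fused estimate sits almost on the lidar fix
```

The fused estimate is dominated by the more precise sensor, but GPS still contributes, and the fused variance is lower than either input's alone. That is the sense in which GPS is useful without being relied on.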
> ... industry doesn't abandon technology. It simply works out what went wrong and then takes steps to fix it.
That is probably the rational response to this incident.
We know that air crashes typically have more than one cause.

> Simulated miles at this stage are basically worthless. Simulation is a fantastic way to get to a prototype, but real-world testing is about finding what stimuli your simulations missed. To continue building miles on the simulators is more of a publicity stunt than engineering.
Strongly disagree. Simulation is the only way the engineering of these cars can progress.
The most important part of the "simulation" will be regression testing. Every change to the car's software will trigger a run through the simulator to verify it still works in all their test situations. This is completely impractical with real cars.
You can also create situations far more extreme than in real life. If I were Google, I'd set it up like a video game where people could go at lunchtime and try to create a situation that beats the software.
(clowns falling from the sky!)
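The regression-gating idea above could be sketched roughly as follows; the scenario names, the pass criterion, and the stub simulator are all invented for illustration:

```python
# Hypothetical sketch of simulator-based regression testing: every
# software change re-runs a library of scenarios and must pass all
# of them before it ships. Everything here is illustrative.

def run_scenario(software_version, scenario):
    # Stand-in for a real simulator run; each scenario just records
    # which versions handle it, so the gating logic below is testable.
    return software_version in scenario["passes_with"]

SCENARIO_LIBRARY = [
    {"name": "pedestrian_midblock_night", "passes_with": {"v2"}},
    {"name": "cyclist_merging", "passes_with": {"v1", "v2"}},
    {"name": "stopped_truck_on_highway", "passes_with": {"v1", "v2"}},
]

def regression_gate(software_version):
    """Return the names of failed scenarios; empty means the change may ship."""
    return [s["name"] for s in SCENARIO_LIBRARY
            if not run_scenario(software_version, s)]

print(regression_gate("v1"))  # ['pedestrian_midblock_night']
print(regression_gate("v2"))  # []
```

The value of the harness is exactly this: a change that regresses any previously handled scenario is caught before it ever reaches a real car.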
Someone who advises Google's robotic-cars team says there is a rumour that the LIDAR was turned off (search for "rumour", but the rest of the analysis is very good as well):
Looking at the video again, I find it odd that the passenger looks up and looks startled, but his eyes are looking to the left of centre, with the passenger being seated on the left side of the car. The exterior shot (whatever the true video quality really is) seems to show the pedestrian only when the pedestrian is at the centre, making me think that the passenger perhaps saw the pedestrian to the left of the car some time before the car hit the pedestrian.
a) The "safety driver" would need to have observed the hazard before impact (which I think they may have).
b) They would immediately expect the AV to have responded and they would hesitate.
c) When the AV hadn't responded, they then would have had to remember that they had the ability to take control.
d) They would then have had to take the decision to do so.
e) Once they had done that, they would have had to "find" the controls - the steering wheel and brake pedal as a minimum, since their hands (certainly) and feet (possibly) were not in normal driver position.
f) They then would apply whatever action they chose.
For a normal "hands on" driver, only steps a) and f) are involved - the others are not automatic steps and they will take a finite time to be processed.
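A back-of-envelope way to see why steps b) to e) matter is to assign each step an assumed latency and compare the totals. Every number below is a guess for illustration, not measured reaction-time data:

```python
# Illustrative takeover-latency model for the steps a)-f) above.
# All step times are assumed values, not measurements.

STEP_TIMES_S = {
    "a_observe_hazard":  0.5,   # both drivers need this
    "b_expect_av_react": 0.7,   # safety driver only: wait for the AV
    "c_recall_control":  0.5,   # safety driver only
    "d_decide_to_act":   0.5,   # safety driver only
    "e_find_controls":   0.8,   # safety driver only: hands/feet out of position
    "f_apply_action":    0.5,   # both drivers need this
}

safety_driver_s = sum(STEP_TIMES_S.values())
hands_on_driver_s = (STEP_TIMES_S["a_observe_hazard"]
                     + STEP_TIMES_S["f_apply_action"])

print(round(safety_driver_s, 1), hands_on_driver_s)  # 3.5 1.0
# At 40 mph (~17.9 m/s), the extra ~2.5 s corresponds to roughly 45 m of travel.
```

Even with generous guesses, the extra steps add seconds, and at road speeds seconds translate into tens of metres, which is the whole argument against expecting a disengaged "safety driver" to save the situation.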
This is precisely the point I made in reply 26 where I took this still from the interior video.
It does seem their reaction is consistent with noticing the pedestrian well before impact.

> It does seem their reaction is consistent with noticing the pedestrian well before impact.

I am not at all surprised by their lack of evasive action, because of the points I outlined here:
> a) The "safety driver" would need to have observed the hazard before impact (which I think they may have).
> b) They would immediately expect the AV to have responded and they would hesitate.
> c) When the AV hadn't responded, they then would have had to remember that they had the ability to take control.
> d) They would then have had to take the decision to do so.
> e) Once they had done that, they would have had to "find" the controls - the steering wheel and brake pedal as a minimum, since their hands (certainly) and feet (possibly) were not in normal driver position.
> f) They then would apply whatever action they chose.
> For a normal "hands on" driver, only steps a) and f) are involved - the others are not automatic steps and they will take a finite time to be processed.
This really makes me want to find out what happened with the tech.