In this case it wasn't important though, since that cyclist wasn't anywhere near that car.
I completely disagree. The cyclist is much closer than some of the other pedestrians that the system did detect and highlight. Judging by the color of the boxes, the system appears to prioritize each pedestrian, but it seems to have completely overlooked the cyclist. The only reason he was "unimportant" was that he didn't pull in front of the automated car and get run over. I don't know whose car this was, but if someone had looked at this video sooner and realized there was a problem with the system not recognizing cyclists, then perhaps the accident in Arizona could have been prevented. (And hindsight is always 20/20, as they say!)
It totally failed to detect the cyclist at 6:17 in the video.
In this case it wasn't important
Right, and the cyclist hit by the Uber car wasn't important either, not until the very last second...
I really can't understand your answer. How is the cyclist not worthy of a blue rectangle?
This video is NOT from a self-driving car, it's from a collision avoidance system only intended to act as a backup system if the human driver fails to act in time. There is a big difference.
This system is supposed to brake at the last second if an object is directly in front of the car and the human driver does not react in time. This system will not cause any accident if it misinterprets the video. (Actually, it might cause an accident if it believed there was something in front of the car when there really was not, and suddenly braked. So in this case it's better that it detects too few things than too many.) Every time it brakes where the human driver failed to react in time, it prevents an accident that would otherwise have happened. So even if it fails to detect some things in front of the car, the net result is that this system saves many lives. Naturally, the more it detects the better, but failing to detect someone doesn't cause an accident here. If you don't use the system, the accident would have happened anyway. There isn't any downside to using it even if it's much less than 100% accurate (as long as it doesn't produce false positives). No one is 100% accurate though, certainly not humans. And in this case that bicycle was never in imminent danger of being hit by the car, so it made no difference whether the system saw it.
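To illustrate the asymmetry described above, here's a toy sketch (entirely my own illustration, not the actual system's logic, and all names and thresholds are made up): the backup system brakes only on a high-confidence detection directly in the car's path, so a missed detection merely leaves things as they would be without the system, while a false positive would mean a dangerous phantom stop.

```python
def should_brake(detections, driver_braking, confidence_threshold=0.9):
    """Toy backup-AEB decision: return True if the system should brake.

    detections: list of (confidence, in_path, time_to_collision_s) tuples.
    driver_braking: whether the human driver is already reacting.
    """
    if driver_braking:
        return False  # the human is the primary driver; stay out of the way
    for confidence, in_path, ttc in detections:
        # High threshold: a false negative just means "no change versus a
        # car without the system", but a false positive means braking hard
        # for nothing -- so the logic errs toward missing detections.
        if in_path and confidence >= confidence_threshold and ttc < 1.5:
            return True
    return False

# A low-confidence cyclist off to the side never triggers the brakes:
print(should_brake([(0.4, False, 3.0)], driver_braking=False))   # False
# A confident detection directly ahead, driver not reacting: brake.
print(should_brake([(0.95, True, 1.0)], driver_braking=False))   # True
```

The point of the high threshold is exactly the trade-off in the comment: for a backup system, a miss costs nothing relative to having no system at all, while a false alarm actively creates danger.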
The situation is very different for a self-driving car, though. If a self-driving car does not realise something is in front of it, it will very likely crash into it. It is not the backup system, it is the primary driver (and the human "safety driver" is the backup). A self-driving car not only tracks things in front of the car but everywhere around it; it's much more advanced. So my point is that such a system, using video only, while impressive, isn't good enough for a self-driving car (by itself). That's why e.g. Waymo also uses LIDAR, since it's much easier to interpret the data from such a sensor than it is to interpret 2D images.
The fact that this much simpler collision avoidance system detected the victim in the Uber accident, using only the very poor video released by the police, shows that the Uber car really should have had no problem detecting that person in time.