Not really.
The cycle-pushing pedestrian contributed the most to this accident. They caused the hazard in the first place.
Unlike what the Sheriff said, it's looking more like a preventable accident. From Uber's video I thought the incident happened in a less built-up area.
But those mobile phone videos show otherwise. Even though digital cameras overexpose at night when set to automatic, the mobile videos show many light sources, and Uber's video shouldn't have been so dark.
The issue here is not the actual accident and who's at fault.
IMO the Sheriff in Arizona was much too quick to say that it was NOT Uber's fault.
The system was not Fit for Purpose and should not have been in use on public roads in live traffic.
By this logic, human drivers are not fit for purpose and should also be banned from roads, because this sort of accident happens all the time.
I really want to see the data. Unfortunately they will do their best to make sure it never gets out (other than shitty dashcam footage), as it will only demonstrate their incompetence.
The system was not Fit for Purpose and should not have been in use on public roads in live traffic.
By this logic, human drivers are not fit for purpose and should also be banned from roads, because this sort of accident happens all the time.
Of course, Uber still needs to get their shit together because a LIDAR system should easily outperform humans in cases like this.
I really want to see the data. Unfortunately they will do their best to make sure it never gets out (other than shitty dashcam footage), as it will only demonstrate their incompetence.
This video is interesting. Anybody want to translate some German? It seems Mobileye is being tested.
Don't think I have heard anyone talk about the lidar blind spot near the car caused by relying on a single roof mounted unit.
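The size of that blind spot follows from simple geometry: anything closer than h / tan(θ) is below the lowest beam, for sensor height h and downward beam angle θ. A quick sketch with made-up but plausible numbers (these are illustrative guesses, not the actual specs of Uber's unit):

```python
import math

def lidar_ground_blind_radius(sensor_height_m, lowest_beam_deg):
    """Horizontal distance at which the lowest laser beam first hits
    the ground; everything closer is unseen by the roof unit
    (ignoring returns from the car body itself)."""
    return sensor_height_m / math.tan(math.radians(lowest_beam_deg))

# Illustrative numbers only: a roof unit about 2 m up, lowest beam
# aimed 15 degrees below horizontal (similar to common spinning
# lidars with a roughly 30-degree vertical field of view).
r = lidar_ground_blind_radius(2.0, 15.0)
print(f"ground blind radius: {r:.1f} m")  # prints roughly 7.5 m
```

So with a single roof-mounted unit there is a ring of several metres around the car where a low object gets no lidar returns at all, which is presumably why many designs add extra short-range sensors at bumper height.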
These trials are not about testing critical safety mechanisms; they are for fine-tuning a host of other things (predicting traffic flow, optimising road positioning and so on). Do you really think letting two tons of metal travel at 50 mph on public roads, weaving through traffic and pedestrians, is a good way of testing basic safety systems?
It totally failed to detect the cyclist at 6:17 in the video.
I'm sure that's just a coincidence.
In this case it wasn't important though, since that cyclist wasn't anywhere near that car.
Re: EEVblog #1066 - Uber Autonomous Car Fatality - How?
« Reply #134 on: March 23, 2018, 10:08:13 PM »
This police inquiry and its conclusions are staggering... I wonder how it is possible that people do not notice this.
The video presented cannot serve as formal proof, because it was provided by the main interested party, who could very well have manipulated it in its own interest (reduced brightness, cuts to make the impossibility of reacting and braking seem believable, etc.).
Remember that this is not a surveillance camera video.
Several pieces of evidence suggest there is a problem with this video.
1) The road was lit, yet no street lighting is apparent in the video.
2) The efficiency and range of the headlights do not correspond to those of a modern car. (They look like the headlights of a Ford Model T!)
3) There is no gradual appearance of the pedestrian with her bike; it seems that portions of the video are missing.
4) The sensitivity of the camera seems totally abnormal. Digital cameras are usually more sensitive to light than the human eye; there are even cameras capable of shooting at very low light levels.
Here, the video seems to have been made with the sensitivity of an old Super 8 camera.
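For reference, point 4 can be put in numbers with standard exposure-value arithmetic: scene brightness referenced to ISO 100 is EV = log2(N²/t) minus the ISO gain, for aperture N and shutter time t. The camera settings below are textbook illustrations, not measurements of Uber's camera:

```python
import math

def exposure_value(f_number, shutter_s, iso=100):
    """Scene exposure value referenced to ISO 100:
    EV = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(f_number**2 / shutter_s) - math.log2(iso / 100)

# An ordinary low-light exposure: f/2, 1/30 s, ISO 1600.
ev = exposure_value(2.0, 1/30, iso=1600)
print(f"scene EV (ISO 100 reference): {ev:.1f}")
```

Those unremarkable settings correctly expose a scene around EV 3, which is darker than a typical lamp-lit street (exposure guides commonly quote around EV 4 to 6), so a properly configured camera should not render a lit road as near-black.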
Under such conditions, a reconstruction was essential, and the video made during that reconstruction should have been published so the two could be compared.
Publishing only a video whose authenticity cannot be proven is biased and abnormal.
The result of the police investigation implies that the pedestrian effectively committed suicide by knowingly crossing the road in a dark place in front of a vehicle that could not see and avoid her... which makes no sense.
The reality is more likely that the pedestrian thought she was perfectly visible and that a motorist would certainly have braked to avoid her. Of course she was being imprudent, but not suicidal.
Unfortunately, there was no driver, only an automatic system that failed to detect her and did not brake.
UBER wants to pass this off as an unavoidable fatality (and has succeeded), but to me it is clear that it was not inevitable.
A pedestrian crossing a lighted road is visible from far enough away for a driver to brake, and at least to reduce speed enough not to kill them.
It totally failed to detect the cyclist at 6:17 in the video.
Same at 6:47... but this is even worse, as the cyclist rides towards the car on a collision course.
Maybe the programmer hated bikes
When I first noticed the cyclist I tried to find where he came from. As far as I can see he just teleported into the road, because I can't see him riding in from further back. It is weird when you look closely.
In this case it wasn't important though, since that cyclist wasn't anywhere near that car.
I completely disagree. The cyclist is much closer than some of the other pedestrians that the system did detect and highlight. By the color of the boxes it appears that the system prioritizes each pedestrian, but it appears to have completely overlooked the cyclist. The only reason that he was "unimportant" was that he didn't pull in front of the automated car and get run over. I don't know whose car this was, but if someone had looked at this video sooner and realized that there was a problem with the system not recognizing cyclists, then perhaps the accident in Arizona would have been prevented. (And hindsight is always 20/20, as they say!)
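For what it's worth, perception stacks usually rank tracked objects by urgency, e.g. time-to-collision, rather than raw distance, which would explain color-coded boxes. A toy illustration of that idea (the names, numbers, and logic here are invented for illustration; this is not Uber's actual code):

```python
from dataclasses import dataclass

@dataclass
class Track:
    name: str
    range_m: float       # current distance to the object
    closing_mps: float   # closing speed; <= 0 means not approaching

def time_to_collision(t):
    """Seconds until impact at the current closing speed; infinite if
    the object is holding station or moving away."""
    return t.range_m / t.closing_mps if t.closing_mps > 0 else float("inf")

# Hypothetical tracks, loosely matching a street scene:
tracks = [
    Track("pedestrian on sidewalk", 15.0, 0.0),
    Track("cyclist ahead", 25.0, 12.0),
    Track("pedestrian crossing far away", 30.0, 2.0),
]

# Most urgent first. Note that an object the classifier never detects
# gets no track at all, which is the real failure mode being discussed.
for t in sorted(tracks, key=time_to_collision):
    print(f"{t.name}: TTC = {time_to_collision(t):.1f} s")
```

Under such a scheme a nearby but stationary pedestrian legitimately ranks low, but a cyclist with no bounding box at all is invisible to every later stage, no matter how close he is.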
It totally failed to detect the cyclist at 6:17 in the video.
In this case it wasn't important
Right, and the cyclist hit by the Uber car wasn't important either, not until the very last second...
I really can't understand your answer. How is the cyclist not worthy of a blue rectangle?
IMO the Sheriff in Arizona was much too quick to say that it was NOT Uber's fault.
This system is supposed to brake at the last second if an object is directly in front of the car.
This system will not cause any accident if it misinterprets the video.
This system will not cause any accident if it misinterprets the video.
Red herring.
There's something wrong in the software if it can't see that bike.
Uber have settled with the victims family:
https://www.theguardian.com/technology/2018/mar/29/uber-settles-with-family-of-woman-killed-by-self-driving-car
So it looks like Uber's media management is largely working.
By this logic, human drivers are not fit for purpose and should also be banned from roads, because this sort of accident happens all the time.
Some human drivers are not fit to drive and should be banned from the roads. This technology is claimed to improve safety; if it fails on what should have been a relatively trivial test case, then it's not living up to its claims.
But I'm interested in the logic of what should/shouldn't be allowed. As Dave pointed out, people tend to hold machines (i.e. something that was designed) to a different standard than humans.
Still waiting to see if AZ is going to press charges.
People can relate to crossing the road in front of an oncoming car driven by a human, sometimes it is done by making eye contact and confirming the driver has seen you. You cannot do that with an AV. That mistrust will be the Achilles heel of AV acceptance.
Well isn't that the point of any tool? To do things BETTER than a human can? What else is a wrench for? Or a volt meter? Or even a common hammer? If they can't perform better than a human then they're useless.
If they can't perform better than a human then they're useless.
Define "better".
I am being quite serious about that.
One could argue using a hammer is "better" - until an improperly programmed hammer misses the nail and hits your thumb. Is it still "better" in that case?
If you're one of the 5%, I think you'd have a fair claim that it's not beneficial. And you can't compare accident rates between human drivers and autonomous vehicles. People aren't going to accept a machine with a firmware error or a sensor error making life and death decisions. People who don't understand or don't care about the underlying engineering won't cut it any slack.
For crying out loud, they (Uber) even disabled the built-in safety system that came standard on the car! That's about like taking the brakes out of a car...
Bad enough that Uber's system failed miserably but there was NO excuse for them disabling the standard built in system.
If they can't perform better than a human then they're useless.
Define "better".
I am being quite serious about that.
One could argue using a hammer is "better" - until an improperly programmed hammer misses the nail and hits your thumb. Is it still "better" in that case?
No, not something wrong, just not 100% correct. You can never get 100% accuracy here, humans can not either. How good it has to be depends on the application. If that collision avoidance system manages to prevent even 5% of all accidents that means it's saving many lives every year (i.e. even if it fails to trigger 95% of the time it's still beneficial).
For autonomous cars where the system replaces human drivers, you want the new system to perform better than humans (still doesn't have to be perfect though). As long as robot cars are safer than human operated cars it means lives will be saved.
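The "even 5%" figure is just arithmetic; the baseline below is an illustrative round number for annual US road deaths, not a measured benefit of any real system:

```python
# Back-of-the-envelope: even a small prevention rate applied to a
# large baseline yields large absolute numbers.
baseline_deaths_per_year = 37_000   # rough US order of magnitude (illustrative)
prevention_rate = 0.05              # the "even 5%" from the post

lives_saved = baseline_deaths_per_year * prevention_rate
print(f"lives saved per year: {lives_saved:.0f}")  # prints 1850
```

Of course the same arithmetic cuts the other way: a system that introduces even a small rate of new failure modes also produces large absolute numbers, which is the crux of the disagreement above.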
People can relate to crossing the road in front of an oncoming car driven by a human, sometimes it is done by making eye contact and confirming the driver has seen you. You cannot do that with an AV. That mistrust will be the Achilles heel of AV acceptance.
I agree that people think that way. It's very unfortunate. The intervening period between "self driving cars have a lower accident rate" and "people trust self driving cars" could last for many, many years.
I suspect if you asked the average person "do you prefer 500 people dying to software bugs or 1000 people dying to drink driving", they would respond "I prefer 0 people dying", and continue to support the status quo.
People can relate to crossing the road in front of an oncoming car driven by a human, sometimes it is done by making eye contact and confirming the driver has seen you. You cannot do that with an AV. That mistrust will be the Achilles heel of AV acceptance.
Very good point. Whether people will accept autonomous cars has more to do with human psychology, politics, and economics than with their real benefits or performance.
As for comparing human drivers to machine drivers, the difference is that just because one human in ten thousand drives dangerously does not mean they all do. Mass-produce a machine that drives dangerously and every single one will behave the same way, so a machine has to be held to a higher standard, because the results will always be the same unless you have the engineering wrong. Isn't that what this discussion is all about: engineering standards?
I suspect there are large segments of the population who see driving as a chore/problem, and they'll all buy self-driving cars overnight if it lets them do other stuff instead.
I don't just mean people who'll sit there texting; Imagine being able to send the kids to school automatically, etc. Plenty of people would want that.
Then there's all the people who'll see it as a good way to drink more alcohol. There won't be any shortage of those.
etc.
None of those people will give a rats ass about pedestrians making eye contact.
The real solution is to not design cities where 1hour plus commutes are necessary. Cars have taken us from living the dream of freedom and mobility to a nightmare of congestion and distance.
If you're in heavy AV traffic with all the cars locked together in some algorithmic synchronicity, you may as well be in a bus.
You cannot do that [make eye contact] with an AV.
"cannot" is a strong word... These days, most cameras recognize eye contact in order to focus more accurately.
If you're in heavy AV traffic with all the cars locked together in some algorithmic synchronicity, you may as well be in a bus.
There's a good point made in the comments in that Volvo video:
What if a gang of bad people* attacks your autonomous car?
You won't be able to drive away - the sensors will refuse to run them over!
(*) Or zombies, or whatever...
Volvo system fails too: (...)
Quote from: SilverSolder
"cannot" is a strong word... These days, most cameras recognize eye contact in order to focus more accurately.
....if you sit right in front of it. Useless for the purpose being discussed.
Quote from: SilverSolder
"cannot" is a strong word... These days, most cameras recognize eye contact in order to focus more accurately.
....if you sit right in front of it. Useless for the purpose being discussed.
Here is a video that shows where facial recognition is heading... hint, achieving eye contact is probably the least of its capabilities...
https://videos.posttv.com/washpost-production/The_Washington_Post/20171222/5a3d5cd1e4b0330b121d744b/5a4fd674e4b0cb2c842a15d3_1439412153584-wn5qra_t_1515181698375_640_360_600.mp4
Now that really is Big Brother 1984. Does anyone really want to go that far? Although I am sure it will get foisted on us all eventually.
Before getting into the details of Tesla's statement, what happened? On March 23 a Tesla Model X crashed on a South Bay freeway -- at the Highway 101 and Highway 85 connector in Mountain View -- causing the death of 38-year-old San Mateo resident, Apple software engineer Walter Huang.
Huang's family reported that he alerted Tesla's service department to a big problem with his Model X's Autopilot. As the Mercury News reported, Huang's family asserted that he "had taken the car to a dealer several times and complained that the function kept steering the car toward the highway divider into which he crashed."
Quote: Huang's family reported that he alerted Tesla's service department to a big problem with his Model X's Autopilot. As the Mercury News reported, Huang's family asserted that he "had taken the car to a dealer several times and complained that the function kept steering the car toward the highway divider into which he crashed."
"Hey, this autopilot keeps trying to murder me, let's keep using it."?
Well isn't that the point of any tool? To do things BETTER than a human can? What else is a wrench for? Or a volt meter? Or even a common hammer? If they can't perform better than a human then they're useless.
Not necessarily. It's like any form of automation, IMO. Even if it only works as well on average, it can still free up people to do other things, so it's not useless.
Agreed. Tesla's 'auto pilot' is nothing more than a fancy cruise control.
I'm wondering why Tesla's autopilot doesn't steer the car onto the hard shoulder (or the slowest lane)
PS. autopilot is a deceptive name and I hope Tesla gets taken to the cleaners for their continued use of the term.
PPS. Taking a look at the stock price, this was an 8 billion dollar accident; Tesla isn't kidding when they say the level of damage is unprecedented.
The problem is that joe public doesn't know what an "autopilot" is.
Another thing I haven't seen discussed anywhere is how easy these cars are to fool.
eg. What happens if a bunch of youtube idiots make some fake "Stop" signs and goes out to the autobahn to hold them up?
A human driver will easily ignore them but what will a robot car do? Emergency braking?
Another thing I haven't seen discussed anywhere is how easy these cars are to fool.
eg. What happens if a bunch of youtube idiots make some fake "Stop" signs and goes out to the autobahn to hold them up?
A human driver will easily ignore them but what will a robot car do? Emergency braking?
Rumor says that if you place a beautiful naked woman beside a busy street, it won't be long before there are multiple crashes. (How will the Mythbusters test that in a safe and ethical way?) The self driving cars will probably recognize her as just another person.
Another thing I haven't seen discussed anywhere is how easy these cars are to fool.
eg. What happens if a bunch of youtube idiots make some fake "Stop" signs and goes out to the autobahn to hold them up?
A human driver will easily ignore them but what will a robot car do? Emergency braking?
Seriously?
Quote: "Hey, this autopilot keeps trying to murder me, let's keep using it."?
Quote: Agreed. Tesla's 'auto pilot' is nothing more than a fancy cruise control.
If you don't understand that, then Darwin has an award for you. Especially if the driver complained a few times about it and KNEW that the autopilot had problems with this specific section of road. If the latter is really true, then the Darwin award is well deserved.
Seriously?
Yes, seriously.
I can't imagine how anybody might think that throwing stuff off bridges might be fun but they still do it.
I can imagine people messing around to see how the dumbass computers react, eg. walking around town in a stop-sign t-shirt and filming it, etc. ("Not my fault, officer, I was just wearing a t-shirt")
California Vehicle Code section 21465 states, "No person shall place, maintain, or display upon, or in view of, any highway any unofficial sign, signal, device, or marking, or any sign, signal, device or marking which purports to be or is an imitation of, or resembles, an official traffic control device or which attempts to direct the movement of traffic or which hides from view any official traffic control device."
I wonder if that would apply to a t-shirt with a picture of a stop sign on it? What if it didn't actually say "Stop" but some clever word that resembles it at a glance? The law is obviously intended to prevent people from putting up fake traffic control signs.
Haven't you seen those little plastic kid figures placed at the edge of neighborhood roads that may be holding a flag and say SLOW! on the side?
They might want to focus on identifying and avoiding people and concrete barriers first...
that's the idea, nobody needs to own a vehicle. it comes and gets yous...
that's the idea, nobody needs to own a vehicle. it comes and gets yous...
Buses and light rail are perfect for rush hour, large numbers of people going from one place to another at roughly the same time. It's a far more efficient method of transportation than personal cars for that situation.
Well the way it works here in my corner of the US is there are park & ride lots in most suburbs with direct bus routes to and from the major cities. You walk, drive or bike to the local park & ride, bus to work and then bus back to the park & ride in the evening. When I worked downtown for a while I did that and it worked pretty well, the majority of my travel was on the bus taking up a fraction of the highway capacity of all those people in individual cars.
How well will autonomous cars work in the little country lanes we have where I live? Lots of bends, some sharp, others not, but most blind due to hedges and banks.
If the car cannot handle all situations automatically then will it need to have driving controls? If so, then the person in that seat will be needing a driving license. What happens if they are a person who wouldn't be capable of getting a license? Wasn't that part of the pitch in favour of AV's in the first place? Give freedom to the blind, the elderly and the infirm.
Who will want to be telling your Uber or taxi what to do and how to do it? You may as well have a taxi driver. Where's the progress?
Who will want a remote operator in control? They have no skin in the game like a human driver does. You want someone motivated to live another day.
As soon as they start killing dogs you'll have PETA and the RSPCA and every unaffiliated animal lover outraged and campaigning for AV's to avoid anything cute and furry on the roads.
And if I could customise the firmware I'd be swerving to flatten Cane Toads. Like I used to do.
Someone will hack the firmware and have them seeing Pokemons.
And to get back to being serious I recall many years ago following a car at about 60kmh on a road that carried heavy trucks. The trucks had worn the road surface so it had two ruts where the wheels traveled and in the rain that day shallow longitudinal puddles had formed. The car in front of me in a split second was wrapped around a tree. Ever since that day I always avoid such standing water. There are a couple of places in my vicinity where in heavy downpours the road can partially flood and drivers move from the kerbside lane to avoid hitting 20-30cm deep water across part of the lane. Can an AV reliably do that?
What happens if a box or a plank or a PVC pipe falls from a vehicle ahead? If you can swerve, do you? What will an AV do? I once ran over a mattress that was poorly secured. I did it in a split second deliberately choosing to not swerve from my lane or brake hard. If the mattress was a bookcase I'd have responded differently. Not to say effectively but I wouldn't have just run straight over it by choice.
The situations that present little challenge to a human need to be programmed. But you cannot cover every imaginable situation.
There is more than just bicyclists and pedestrians and other vehicles to be avoided. And as soon as the accidents start and people and property are harmed there will need to be legal redress, financial redress and blame apportioned. None of this seems to be anywhere but in the too hard basket.
People have always underestimated what can be automated
...
* Safe from crime: You don't have to fear being robbed or, if you're a girl, being raped.
...
* Safe from crime: You don't have to fear being robbed or, if you're a girl, being raped.
Wait... What?!
How is someone safer from attack in an autonomous car? Is the car going to magically come to your defense somehow if you're being attacked?!
I would think that having another human around would make travel significantly safer than a lone person traveling in an autopod...
Park&ride schemes can help with city centre congestion, but do little to reduce driven miles, car ownership, etc. They also require considerable space for the car park at the park&ride terminals.
I assume it refers to being attacked by the driver.
They help greatly around here where the freeway/motorway is the choke point along with the roads and limited parking downtown. Many thousands of cars stay in the suburbs and those people take a bus to downtown, keeping thousands of cars off the freeways and out of the crowded downtown core.
Autonomous cars that are not privately owned don't solve the issue anyway, all they do is eliminate the taxi driver, it's a fancy way of replacing another relatively low skill job with automation.
Autonomous cars that are privately owned don't solve any problem except laziness and/or inattention. If they can take themselves somewhere without anyone on board, then I imagine we'll see hundreds of them clogging the streets, circling the block while their owners eat lunch or conduct business, once people figure out that driving a few miles is cheaper than parking for half an hour.
Unlike what the Sheriff said, it's looking more like a preventable accident. From Uber's video I thought the incident happened in a less built-up area.
But those mobile phone videos show otherwise. Even though digital cameras overexpose at night when set to automatic, the mobile videos show many light sources, and Uber's video shouldn't have been so dark.
And it probably doesn't matter either way, because it's the LIDAR that should have picked it up and it didn't, under almost ideal practical circumstances.
I just saw this this morning. I'll probably come back later and read the rest of the thread, but I noticed there were a couple of posts on the first two pages saying they could not figure out where the bicycle came from at 6:17. It can be seen in front of the van parked on the left at 6:17.
IF it is the identification of every object within one's environment, then I would challenge that task as not even being reasonable.