EEVblog Electronics Community Forum
General => General Technical Chat => Topic started by: Gyro on June 13, 2018, 02:08:52 pm
-
An interesting video from the BBC News site - What happens when a vehicle in front of you suddenly changes lanes to avoid a stationary object. Possibly debatable whether you would react in time but driving manually, I think I'd have had a reasonable chance of taking avoiding action too... hopefully!
https://www.bbc.co.uk/news/av/business-44460980/this-car-is-on-autopilot-what-happens-next
-
So according to Tesla, to use the "autopilot" correctly you have to be all but driving the vehicle yourself anyway. So what's the point in even having it?
-
It does sound as if you need to at least read the manual (release notes?) pretty thoroughly.
P.S. I don't know the level of visual input/acuity the system has. I noticed that the lead car indicated before it changed lanes; maybe it would have detected brake lights if they had been applied? I suppose it didn't need to be a stationary vehicle in the lane; it would presumably have behaved the same with a parked one at the side of the road, a road crew etc., assuming that the driver hadn't disengaged autopilot (how do you know when to expect such occurrences? :-\).
-
I don't know, when I'm driving I generally keep enough distance between me and the car in front so if they stopped suddenly then so could I - that's kind of learn to drive 101? I've been in that exact situation on the freeway and not had a problem stopping in time.
On realising that I could not stop in time I would probably just follow the car that changed lanes, because even if I collided with someone in the adjacent lane, the impact between two cars travelling at the same speed and direction would be a lot less than going into the back end of a stationary car at speed. If you were attentive to your surroundings in the first place, you would have a higher degree of confidence that the adjacent lane was clear anyway.
So I'm just going to say the Tesla was following too closely in what were otherwise ideal road conditions; perhaps it's programmed to break the road rules like everyone else does ;)
Having said that, the video shows this experiment being done exactly once, so it's a bit of a coin toss.
-
You may have a similar fate as the Tesla if you only focus on the car in front. I was taught that one should always look at least two cars ahead, for this reason. If you see an obstacle ahead of the car in front, you'll be more prepared to react than if all you look at is the rear of the car you're following.
That's something these "self-driving" cars can't do at all.
-
You may have a similar fate as the Tesla if you only focus on the car in front. I was taught that one should always look at least two cars ahead, for this reason. If you see an obstacle ahead of the car in front, you'll be more prepared to react than if all you look at is the rear of the car you're following.
This is how I drive and it has reduced the anxiety of driving significantly. It's also why I dislike being behind any vehicle that prevents me from such observation. Trucks aren't so bad, since their drivers are a bit more aware of the idiots in front of them and tend to allow for that - plus they are bigger and will tend to (ahem) "clear" the road if they make contact. It's the smaller vehicles such as vans and SUVs with tinting made from sheets of black hole material that make seeing what's in front impossible.
One big concern is that if there is a hazard in front of such a vehicle, they may change lanes to avoid it rather than braking. This will leave you to discover the hazard only when they move out of the way - and you may not have time to take action. The lane change manoeuvre they performed may not be available to you if there is a vehicle in the lane next to you.
This sort of thing is a challenge for human drivers - but even those who aren't as attentive may notice something like the screeching of tyres or some other indications ... but can we expect the same from AVs?
-
Teslas (at least ones with radar?) can look ahead past what the driver can see, and react appropriately. Here's a video of it doing pretty much exactly this in the real world.
https://www.youtube.com/watch?v=_Kti-9qsLpc
I'm not sure if the dummy cars in that BBC test have an appropriate radar signature to make this test a proper one. Or maybe they are using a model of Tesla that doesn't have radar?
Here's more random autopilot catches... including another similar one to the first link with a 4WD (why are so many 4WD drivers idiots, is it because they know they are in a big car and so won't pay the ultimate price for their awful driving?) slamming on the brakes in heavy but fast-moving express traffic to catch an exit they should have missed...
https://www.youtube.com/watch?v=--xITOqlBCM
-
Teslas (at least ones with radar?) can look ahead past what the driver can see
It might under some circumstances be able to fish a Doppler signature out of the noise, but if it misses a fire truck we can safely say it's not 100%.
-
Teslas (at least ones with radar?) can look ahead past what the driver can see, and react appropriately. here's a video of it doing pretty much exactly this in the real world.
That video shows a situation very unlike the one in the Thatcham video. The relevant vehicles are moving, so there is doppler. Many of these car radar systems struggle with stationary objects.
-
We need to get these ASAP so we are not at the mercy of the courts and Uber.
-
It might under some circumstances be able to fish a Doppler signature out of the noise, but if it misses a fire truck we can safely say it's not 100%.
It will never be 100%. Somehow people got the weird idea it would be.
-
That video shows a situation very unlike the one in the Thatcham video. The relevant vehicles are moving, so there is doppler. Many of these car radar systems struggle with stationary objects.
Doppler works with relative motion, so cars not moving when you are shouldn't cause issues. Stationary vehicles are basically terrain features anyway and those aren't invisible either.
-
In the Thatcham video, the "car" forming an obstruction is a dummy, and it's not metallic. I'd be surprised if it shows up on radar.
I'd like to see the exercise repeated with a layer of foil over the back of the dummy car, to give it a radar signature more like a real one.
-
Why does it have to have a specific radar signature? A piece of dropped furniture there can kill you all the same if you don't brake or move...
-
Sure, but the furniture can damage and/or kill occupants in the car even if a human being is driving the vehicle.
An actual driver can see the furniture and not be able to react in time.
:-//
-
In the Thatcham video, the "car" forming an obstruction is a dummy, and it's not metallic. I'd be surprised if it shows up on radar.
I'd like to see the exercise repeated with a layer of foil over the back of the dummy car, to give it a radar signature more like a real one.
That could've been a brick wall with this logic. I just think Tesla thinks it can apply agile development principles to self driving, fix bugs with a software update after someone dies, and play with our lives. I think it is closer to level 1 self driving than the "autopilot" or level 3 it claims to be.
-
Easy situation for a human to predict.
Really hard one for a Tesla. Pretty easy though if the car in front lets the car behind know what _it_ can see and what it is about to do.
Car to car and some sort of car to borg control comms is what we need to make things really safe and smart.
-
Why does it have to have a specific radar signature? A piece of dropped furniture there can kill you all the same if you don't brake or move...
It's not even furniture, it's an inflatable balloon on a very weak frame. Basically transparent to the radar. Also, how often do you see furniture on the road, with the car in front of you avoiding it at the last moment?
-
In the Thatcham video, the "car" forming an obstruction is a dummy, and it's not metallic. I'd be surprised if it shows up on radar.
I'd like to see the exercise repeated with a layer of foil over the back of the dummy car, to give it a radar signature more like a real one.
That could've been a brick wall with this logic. I just think Tesla thinks it can apply agile development principles to self driving, fix bugs with a software update after someone dies, and play with our lives. I think it is closer to level 1 self driving than the "autopilot" or level 3 it claims to be.
The type of major obstruction which is by far and away the most likely to be present on a highway, is a vehicle, not a wall.
Nevertheless the problem here isn't the Tesla, it's the journalism. The test in the Thatcham video is so obviously, so deeply flawed, that it can only have been made to illustrate a point about "self driving" cars not being perfect for a non-technical audience.
To anyone with the first clue about how autonomous (or semi-autonomous, or "assisted", or whatever) vehicles sense their surroundings, a test with a lightweight cardboard "car" is meaningless.
If anything, the Tesla did exactly what it should: crash into something it was able to detect as being insubstantial and harmless, rather than risk an emergency manoeuvre which could have resulted in getting rear-ended by another vehicle.
-
That video shows a situation very unlike the one in the Thatcham video. The relevant vehicles are moving, so there is doppler. Many of these car radar systems struggle with stationary objects.
Doppler works with relative motion, so cars not moving when you are shouldn't cause issues. Stationary vehicles are basically terrain features anyway and those aren't invisible either.
The current radars in cars have a rather broad beam (or often 2 or 3 rather broad beams), and they don't scan. They aren't scanning individual objects one by one. They just get a horrible messy return from each of the 2 or 3 sensors, which are the aggregates of everything within each beam. It is only the camera systems and LIDAR which can apply some finesse in their scanning of the environment. All you can do with the radar signal is use doppler to distinguish things. For a narrow beam you can offset the returning doppler by the boresight velocity of the emitter. With the broad beam you are stuck with using a compromise doppler offset, compromising between the boresight velocities at the centre and the edge of the beam. Nevertheless, if you don't do that doppler offsetting you can't hope to distinguish anything useful from the echoes.
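To put rough numbers on that compromise, here's a back-of-envelope sketch (the 77 GHz carrier and the speeds are my own typical-case assumptions, not figures from any particular sensor):

```python
import math

C = 3e8           # speed of light, m/s
F_CARRIER = 77e9  # assumed carrier: 77 GHz is typical for automotive radar

def doppler_shift_hz(closing_speed_mps: float) -> float:
    """Two-way Doppler shift for a target closing at the given speed."""
    return 2.0 * closing_speed_mps * F_CARRIER / C

def stationary_target_doppler(ego_speed_mps: float, angle_deg: float) -> float:
    """Apparent Doppler of a stationary object at an off-boresight angle.

    It closes at ego_speed * cos(angle), so its echo lands in a different
    Doppler bin depending on where it sits within the beam.
    """
    return doppler_shift_hz(ego_speed_mps * math.cos(math.radians(angle_deg)))

ego = 30.0  # ~108 km/h, an assumed highway speed
centre = stationary_target_doppler(ego, 0.0)   # boresight
edge = stationary_target_doppler(ego, 45.0)    # edge of a wide beam
print(f"boresight: {centre:.0f} Hz, beam edge: {edge:.0f} Hz, "
      f"spread: {centre - edge:.0f} Hz")
```

A single compromise offset can only centre that multi-kHz spread, not remove it, which is why stationary returns smear across Doppler bins in a wide-beam sensor.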
-
It will never be 100%. Somehow people got the weird idea it would be.
Which makes it inferior to a car which ensures an attentive driver ... brake assist is fine, lane drift warning is fine, obstacle warning is fine, automatic lane following is fucking stupid. Lane following trades safety for convenience, and once it kills someone other than the driver, government will very quickly decide it's not a worthy trade-off, especially if it's a cop or first responder. Tesla is playing with fire.
-
If anything, the Tesla did exactly what it should; crash into something it was able to detect as being insubstantial and harmless
If it has a layer of metallized foil it will be solid to x-band, k-band and optical. Thatcham does testing of Automatic Emergency Braking systems, it's likely the foam car had a decent simulation of a car radar cross section.
-
It will never be 100%. Somehow people got the weird idea it would be.
Which makes it inferior to a car which ensures an attentive driver ...
:palm: Nothing is 100% reliable. Neither human nor autopilot will ever be 100% reliable. As long as the autopilot is more reliable than a human, the autopilot wins. Please explain how a 99.95% reliable human is better than a 99.99% reliable autopilot?
-
I think it depends what they were trying to achieve with the video.
I don't doubt for a minute that the folks at Thatcham know their stuff, which makes it all the more surprising that they'd produce a video showing such an obviously flawed demonstration. To my mind, the only plausible explanation is that they're trying to make a point - and a valid one at that - but have sacrificed good science for theatrics. That's a shame.
-
I'm not sure if the dummy cars in that BBC test have an appropriate radar signature to make this test a proper one. or maybe they are using a model of tesla that doesn't have radar?
here's more random autopilot catches... including another similar one to the first link with a 4WD (why are so many 4WD drivers idiots, is it because they know they are in a big car and so won't pay the ultimate price for their awful driving?)
Looking at these videos I think there is a problem in Thatcham's setup: their dummy cars aren't detectable as hazardous objects. After all, the Tesla can drive into them without damage.
-
With the broad beam you are stuck with using a compromise doppler offset, compromising between the boresight velocities at the centre and the edge of the beam. Nevertheless, if you don't do that doppler offsetting you can't hope to distinguish anything useful from the echoes.
There is a rather neat trick that the Sonar guys developed to solve that.
What you do is have three unevenly spaced rx antennas, then after downconversion you do the FFT thing and for each peak above threshold resolve a bearing based on the phase shift between the channels. The third channel helps you to resolve ambiguities.
This lets you resolve relative speed and bearing (and, if you use sufficiently short pulses and a high enough IF bandwidth, also range, but that is tricky because it means high peak power).
Interferometric sonar is the term; in this case I guess "interferometric Doppler radar".
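A minimal toy sketch of the phase-to-bearing relation for one antenna pair (my own assumed numbers; a real system would work on the FFT peaks per Doppler bin, and would use the third antenna to unwrap the phase):

```python
import math

C = 3e8
F = 77e9      # assumed carrier; pick your own band
LAM = C / F   # wavelength, roughly 3.9 mm at 77 GHz

def phase_difference(bearing_deg: float, spacing_m: float) -> float:
    """Phase shift (radians) between two rx antennas for a plane wave
    arriving at the given bearing, wrapped into (-pi, pi]. The wrapping
    is exactly the ambiguity a third, unevenly spaced antenna resolves."""
    raw = 2.0 * math.pi * spacing_m * math.sin(math.radians(bearing_deg)) / LAM
    return math.atan2(math.sin(raw), math.cos(raw))

def bearing_from_phase(dphi_rad: float, spacing_m: float) -> float:
    """Invert the relation; unambiguous only for spacing <= lambda/2."""
    return math.degrees(math.asin(dphi_rad * LAM / (2.0 * math.pi * spacing_m)))

d = LAM / 2.0                     # half-wavelength spacing: no ambiguity
dphi = phase_difference(20.0, d)  # simulated target at 20 degrees
print(f"recovered bearing: {bearing_from_phase(dphi, d):.1f} deg")
```

With wider (unequal) spacings you get finer bearing resolution at the cost of phase wraps, which is why the third channel is needed.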
Regards, Dan.
-
If anything, the Tesla did exactly what it should; crash into something it was able to detect as being insubstantial and harmless
If it has a layer of metallized foil it will be solid to x-band, k-band and optical. Thatcham does testing of Automatic Emergency Braking systems, it's likely the foam car had a decent simulation of a car radar cross section.
Quite right. At the frequencies used for these radars it's trivial to make a target which has a realistic radar cross-section and is harmless in a crash. It's sad that people apparently lacking the knowledge to construct such a thing immediately assume the people at Thatcham are idiots or frauds.
-
Even if the target is just polystyrene, should the LIDAR not have some functionality here?
I agree, Thatcham has a remarkable pedigree over many decades of vehicle testing, there is no reason for them to fake the test.
It does bring home that the marketing of self-driving cars is somewhat over-hyped. From a safety perspective it's worrying that it's not self-driving at all: by our very nature, as humans we'll be lured by the distraction of other things. The recent Uber incident in Phoenix is a case in point.
-
Even if the target is just polystyrene, should the LIDAR not have some functionality here?
There is no LIDAR
I agree, Thatcham has a remarkable pedigree over many decades of vehicle testing, there is no reason for them to fake the test.
I doubt they are experts in RF.
-
If anything, the Tesla did exactly what it should; crash into something it was able to detect as being insubstantial and harmless
If it has a layer of metallized foil it will be solid to x-band, k-band and optical. Thatcham does testing of Automatic Emergency Braking systems, it's likely the foam car had a decent simulation of a car radar cross section.
Quite right. At the frequencies used for these radars it's trivial to make a target which has a realistic radar cross-section and is harmless in a crash. It's sad that people apparently lacking the knowledge to construct such a thing immediately assume the people at Thatcham are idiots or frauds.
Nobody is claiming that, but it wouldn't be the first time that new/different technology requires different ways of testing. The people at Thatcham should at least do more research into how valid their methods are when testing Tesla's technology. Edit: based on the videos showing that Tesla cars can avoid other cars in similar real-life scenarios.
-
Even if the target is just polystyrene, should the LIDAR not have some functionality here?
There is no LIDAR
If that’s the case, oh dear. The state of driving automation isn’t quite what the marketing folks would have you believe.
I agree, Thatcham has a remarkable pedigree over many decades of vehicle testing, there is no reason for them to fake the test.
I doubt they are experts in RF.
You’re so very, very desperately underestimating Thatcham’s capabilities.
-
If that’s the case, oh dear. The state of driving automation isn’t quite what the marketing folks would have you believe.
Lidar is nearly useless in snow, rain, or fog.
-
If that’s the case, oh dear. The state of driving automation isn’t quite what the marketing folks would have you believe.
Lidar is nearly useless in snow, rain, or fog.
Indeed, it’d need to slow down to accommodate such conditions, just as a human would do.
-
Indeed, it’d need to slow down to accommodate such conditions, just as a human would do.
The issue is that it starts to detect objects which actually are not there.
-
BTW Tesla claims it relies primarily on radar. https://www.tesla.com/en_EU/blog/upgrading-autopilot-seeing-world-radar?redirect=no
Also they say that radar still can detect humans but not plastic.
After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar. Photons of that wavelength travel easily through fog, dust, rain and snow, but anything metallic looks like a mirror. The radar can see people, but they appear partially translucent. Something made of wood or painted plastic, though opaque to a person, is almost as transparent as glass to radar.
-
Indeed, it’d need to slow down to accommodate such conditions, just as a human would do.
The issue is that it starts to detect objects which actually are not there.
Which is what I and many other people do in impaired visibility conditions. Makes driving under these conditions extremely nerve wracking and something to be avoided if at all possible.
Now whether the autonomous system deals with these false detections better or worse than a human driver is unknown to me, but I am sure the autonomous systems can be and are being improved, while I am equally sure that humans are essentially static. So now the question is not will robots be better drivers than people, but when will they be. Tesla isn't there yet (and doesn't claim to be). Tesla does seem foolish to allow a less than fully capable system out into the real world, as they should know that people will depend on the system for more than it can actually do.
-
Which makes it inferior to a car which ensures an attentive driver ...
:palm: Nothing is 100% reliable. Neither human nor autopilot will ever be 100% reliable. As long as the autopilot is more reliable than a human, the autopilot wins. Please explain how a 99.95% reliable human is better than a 99.99% reliable autopilot?
I did in the bit you left out. Lane following is the key feature which fucks things up.
I am not comparing an autonomous car to a human driver car here, I'm comparing a drive assist car which by design ensures inattentive drivers to a drive assist car which isn't quite so insane. The insane part is just my opinion, but if the next time it plows into a firetruck there is a first responder in between, government will agree with me and all the disingenuous bullshit about autopilot not being a misleading term will come home to roost.
As I said, Tesla is playing with fire.
-
Which makes it inferior to a car which ensures an attentive driver ... brake assist is fine, lane drift warning is fine, obstacle warning is fine, automatic lane following is fucking stupid. Lane following trades safety for convenience, and once it kills someone other than the driver, government will very quickly decide it's not a worthy trade-off, especially if it's a cop or first responder. Tesla is playing with fire.
It only makes it inferior if the number of accidents per distance driven is higher, or the casualties. You can fiddle with what you feel is most important a bit, but effectiveness is what ultimately decides whether it's a success or not. Humans are terrible at mundane tasks over a longer period of time and that's where technology can bring massive improvements.
Developing this successfully will take some time and unfortunately also some casualties. No progress comes for free. We need to stop pretending automated cars need to be perfect. Humans are terrible drivers and we quite readily accept a huge pile of casualties caused by human error each year. If automated cars improve upon that even 10% over a span of 10 years, it should pretty much be a done deal.
Unfortunately, public perception can be a huge pain and just a few casualties might hamper development for years to come. It's unfortunately a bit of a trolley problem, but shooting yourself in the foot isn't much use to anyone.
-
Are you going to dispute that if you leave all the automation in the Tesla except the lane following it won't get safer? Tesla has a conflicting interest with regard to autopilot, it will never ensure the necessary driver attentiveness for the best safety or anything close to it.
Autopilot is not a path to full autonomy. Such cars need to be developed with a system for ensuring driver attention, one so annoying that only someone getting paid would put up with it (the Uber driver was paid to pay attention to the diagnostic screen; she was actually doing her job, and she had no way of knowing the engineers were so fucking incompetent that her job was to drive dangerously).
-
Are you going to dispute that if you leave all the automation in the Tesla except the lane following it won't get safer? Tesla has a conflicting interest with regard to autopilot, it will never ensure the necessary driver attentiveness for the best safety or anything close to it.
Autopilot is not a path to full autonomy. Such cars need to be developed with a system for ensuring driver attention, one so annoying that only someone getting paid would put up with it (the Uber driver was paid to pay attention to the diagnostic screen; she was actually doing her job, and she had no way of knowing the engineers were so fucking incompetent that her job was to drive dangerously).
I don't know. We don't know. However, the problem is partly legislative. The current situation is a workaround because legislators aren't comfortable allowing fully autonomous cars, probably for fear of public outrage. So they field the next best thing: something that's basically that, yet isn't officially called an autopilot, and where the driver gets the short end of the legal stick.
It's a bit of a chicken and egg problem, or more accurately a variant of the trolley dilemma. People are going to object for fair reasons, but not doing it is probably even more costly.
-
Why does it have to have a specific radar signature? A piece of dropped furniture there can kill you all the same if you don’t break or move...
It's not even furniture, it's an inflatable balloon on a very weak frame. Basically transparent to the radar. Also, how often do you see furniture on the road, with the car in front of you avoiding it at the last moment?
I see fallen furniture (or similar 1+ meter objects) on the road about once every week. Have to take similar evasive action probably once a month. California Highway 101 between San Jose and San Francisco...
I don’t understand the insistence that the test is somehow flawed because the object on the road wasn’t a real car. Could be anything you encounter while driving - animal, human, hanging cable, stuff fallen from trucks (including furniture), you name it. Do we put radar reflectors now on everything because current self driving cars rely on radar? Still haven’t seen a proper pothole test either. Promising technology but it will be a while.
-
I see fallen furniture (or similar 1+ meter objects) on the road about once every week. Have to take similar evasive action probably once a month.
With a car in front of you evading it in the last possible moment?
-
Even if the target is just polystyrene, should the LIDAR not have some functionality here?
I believe the only commercial car with LIDAR is the new Cadillac CT6 with Super Cruise. I don't know where Cadillac put the LIDAR hardware. In the autonomous cars from Waymo, Uber and others it's in a big ugly box on the roof of the car. The CT6 looks like any other large luxury car.
-
Humans are terrible drivers and we quite readily accept a huge pile of casualties caused by human error each year.
There are terrible drivers and there are excellent drivers and everything in between. In any case I prefer to be killed by a human error than by the fake "autopilot" pile of crap of a narcissistic megalomaniac named Musk or whatever.
There's never going to be, in a very, very looong time, any "autopilot" that comes even close to a good human driver, because the technology and know-how to pull that off does not exist yet. At least not as the streets/roads are now, not without help from additional support infrastructure, in which case it won't be an "autopilot".
The best you can say of the autopilot is that it has killed dozens already. Funny thing is in Elon's words it's never, not even once, been the autopilot's fault. Eat this. And if it has not killed more people it's because there was a human driver supervising.
-
As I said, a pile of crap:
https://www.youtube.com/watch?v=fc0yYJ8-Dyo
https://www.youtube.com/watch?v=VgQwHDFohTo
There are dozens more. This has to stop.
-
As I said, a pile of crap:
Then post another 40 000 car crash fatalities caused by humans annually in USA alone.
-
Tesla has deliberately named its system autopilot: to set it apart from the systems of other automakers. At the same time, they write in small print that the system is only an assistance system and that the driver must drive himself and keep his hands on the steering wheel at all times.
All other manufacturers are so responsible here that they only ever call their systems 'assistance systems' (and I'm pretty sure in all other manufacturers there were marketing meetings with the examination whether one should / could call the own system also 'autopilot').
Tesla, on the other hand, inflates it from a marketing point of view, well aware that they are endangering lives because there are enough drivers who trust the term 'autopilot'.
In my opinion, Tesla is clearly to blame for a large part of such accidents simply because of the choice to call it 'autopilot' and because they make no effort to rename it. And this renaming would have to be taken on by a responsible company... but no, then the share price would fall by 5 points, thus they do not give a sh*t.
-
As I said, a pile of crap:
Then post another 40 000 car crash fatalities caused by humans annually in USA alone.
And now we have to add to those a few dozen more due to the autopilot...
-
And now we have to add to those a few dozen more due to the autopilot...
Dozens? Deaths that have occurred while using autopilot are still in the low single digits, and this is after a few years of usage. Then I should say that hundreds of thousands mourn each year in the US alone for deaths that happened in regular crashes. And many millions worldwide.
-
So what? How does it help more deaths courtesy of the autopilot? Do you realize these people wouldn't be dead now if it were not for the autopilot?
-
So what? How does it help more deaths courtesy of the autopilot? Do you realize these people wouldn't be dead now if it were not for the autopilot?
That you've way overblown the problem. It may be an order of magnitude safer than a car driven by a human, but you are still outraged because sometimes shit still happens.
Do you realize these people wouldn't be dead now if it were not for the autopilot?
They might be dead because of their reckless driving even without autopilot. Actually, overall, more people would likely have been killed.
-
Your logic is flawed.
-
Your logic is flawed.
Then let's ban all cars and planes altogether. They cause deaths!!!
Personally I'm fine as long as technology progress makes life safer than it was. I don't agree with the stance that it's acceptable as long as human deaths are caused by humans, but completely wrong when a rare death is caused by the imperfection of an automatic machine (humans were still responsible for negligent usage of the technology).
-
So what? How does it help more deaths courtesy of the autopilot? Do you realize these people wouldn't be dead now if it were not for the autopilot?
There's a stat going round that tells us Autopilot-equipped Tesla cars travel around 320 million miles per fatality; on average in the US, all cars (including Teslas) get 86 million miles of travel per fatality. So, yes... a small number of people have died due to autopilot, but statistically, a lot more have lived. We just don't know of them because they're not road statistics, like they would have been in another car!
YES, autopilot has killed people who would be alive today if not for autopilot (that said, the list of autopilot deaths includes that guy who was WATCHING A MOVIE while his car drove him to work; it got confused by the lack of contrast between the side of a truck trailer and the sky, and tried to drive under the trailer... after that is when they started using the radar a lot more... I'd say that guy would have died a lot sooner if he'd tried that in a regular car!)
BUT autopilot is also very good in a lot of situations, avoiding very bad crashes because it has response times far shorter than humans, and has rules that cover how to best react in situations we may never see.
Automating driving is a real world example of the trolley problem, but executed statistically rather than directly. You can make a choice to save a large number of people overall, but that choice is going to kill a smaller number of people who wouldn't have been killed in a car crash otherwise.
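For what it's worth, normalising the two rates makes the comparison concrete (a trivial sketch; the figures are the ones quoted in this post, which I haven't verified independently):

```python
# The two rates quoted above, normalised to fatalities per billion
# vehicle-miles. Figures are as quoted in the thread, not verified.
AUTOPILOT_MILES_PER_FATALITY = 320e6
US_AVERAGE_MILES_PER_FATALITY = 86e6

MILES = 1e9  # one billion vehicle-miles
autopilot_deaths = MILES / AUTOPILOT_MILES_PER_FATALITY
average_deaths = MILES / US_AVERAGE_MILES_PER_FATALITY
print(f"per billion miles: autopilot ~{autopilot_deaths:.1f} fatalities, "
      f"US average ~{average_deaths:.1f} fatalities")
```

That's roughly a factor of four, though it says nothing about whether the miles driven under each condition are comparable (highway vs city, weather, etc.).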
-
If they don't fix its behaviour with cars in front of it evading obstacles, and a human is one of the obstacles next time, Tesla is going to be as fucked as Uber was ... maybe more so given how much their brand is tied to the convenience of using an electronic device while "driving". There will be no room for excuses: the behaviour is known, the lack of attention of the drivers caused by lane following is known; it's just a question of whether the technology improves in time for them not to get fucked by chance.
You'd best not be long Tesla if they end up on the wrong side of the statistics.
PS. About radar being better with rain, that's true. The problem is how to get high directivity with a small system. You can focus light with an optical system a couple of cm on each side; with radar, not so much ... as it stands the radar doesn't really image anything. It just picks out doppler signatures over a huge FOV and makes assumptions. Potentially lethally wrong assumptions. I personally believe stereoscopic vision should be a decent alternative to LIDAR, but regardless of whether you use LIDAR or stereoscopics, interpreting the images beyond just following lines on the road is no small task. Not a task which should be experimented with using inattentive drivers, IMO.
-
Before we fall into the trap of bringing aviation into the mix as a comparison, please remember that almost all commercial passenger transport is in controlled airspace: that means that traffic is proactively and centrally coordinated. Airspace also has the distinct advantage of having a third dimension, there are no physical road constraints, and there's far less traffic density in those three dimensions. There aren't too many pedestrians either.
This is completely different to the road vehicle scenario where there is no proactive traffic coordination, centralised or otherwise, vehicles are constrained by roads into a far higher density, and any deconfliction is purely reactive.
-
There are terrible drivers and there are excellent drivers and everything in between. In any case, I'd prefer to be killed by human error than by the fake "autopilot" pile of crap of a narcissistic megalomaniac named Musk or whatever.
There's never going to be, for a very, very looong time, any "autopilot" that comes even close to a good human driver, because the technology and know-how to pull that off don't exist yet. At least not with streets/roads as they are now, not without help from additional support infrastructure, in which case it won't be an "autopilot".
The best you can say of the autopilot is that it has already killed dozens. The funny thing is that, in Elon's words, it's never, not even once, been the autopilot's fault. Eat this. And if it hasn't killed more people, it's because there was a human driver overseeing it.
You obviously have very strong opinions in regard to Tesla and advancements in road technology. Do you have any numbers to substantiate your claims, for example on how autopilot systems are impossible for "a very, very looong time"?
Again, this is a statistical matter. Obviously people are freaked out by relinquishing control to a computer, but emotions aren't, or shouldn't be, the deciding factor. Since it's not just your own safety at stake but that of others too, the decision is likely to be taken out of your hands. Spilling blood over silly emotions is ridiculous and criminal.
-
Your logic is flawed.
His argument is sound. Disliking an argument because it doesn't line up with your emotions doesn't make it flawed. Ultimately the statistics determine the way to go, and even at this early stage they are in favour of automation.
-
I think you don't understand the statistics.
-
You obviously have very strong opinions in regards to Tesla and advancements in road technology.
I wouldn't call a device that's killed dozens already an "advancement in road technology".
-
Before we fall into the trap of bringing aviation into the mix as a comparison, please remember that almost all commercial passenger transport is in controlled airspace: that means that traffic is proactively and centrally coordinated. Airspace also has the distinct advantage of having a third dimension, there are no physical road constraints, and there's far less traffic density in those three dimensions. There aren't too many pedestrians either.
This is completely different to the road vehicle scenario where there is no proactive traffic coordination, centralised or otherwise, vehicles are constrained by roads into a far higher density, and any deconfliction is purely reactive.
I disagree. Commercial aviation is a mix of complex three dimensional no fly zones, air corridors and open areas. Self regulation also is a vital part of the mix that makes things so safe. The skies are much more constrained and complex than you'd think, while the situation on roads is in many cases more clear cut by design.
-
I think you don't understand the statistics.
This is the second objection you've made in this thread without explaining how or why. I hope you understand that people here have no option but to disregard opinionated statements made without arguments or substantiation. It's no use guessing why you might object. If you formulate an argument, it might further the discussion.
-
I would not call a device that's killed dozens an "advancement in road technology".
Most people would agree that fewer deaths is a better result. I gather that people who feel the world is overpopulated might disagree, but I don't expect many would consider that an appropriate point of view.
-
Before we fall into the trap of bringing aviation into the mix as a comparison, please remember that almost all commercial passenger transport is in controlled airspace: that means that traffic is proactively and centrally coordinated. Airspace also has the distinct advantage of having a third dimension, there are no physical road constraints, and there's far less traffic density in those three dimensions. There aren't too many pedestrians either.
This is completely different to the road vehicle scenario where there is no proactive traffic coordination, centralised or otherwise, vehicles are constrained by roads into a far higher density, and any deconfliction is purely reactive.
I disagree. Commercial aviation is a mix of complex three dimensional no fly zones, air corridors and open areas. Self regulation also is a vital part of the mix that makes things so safe. The skies are much more constrained and complex than you'd think, while the situation on roads is in many cases more clear cut by design.
What is it you’re disagreeing about? As I stated, almost all commercial passenger transport is in controlled airspace. It’s like that for a reason: safety by centralised coordination. There are many aspects to flight safety, and this is how almost all commercial passenger flights avoid bumping into each other, which is the scope of what we’re discussing here, compared to the lack of centralised coordination an autonomous car is subject to.
(And FWIW, I’m a pilot, based within the London CTR, I am sure you’ll know just how complex that airspace is).
-
Howardlong, you missed the point of my bringing up planes. They were mentioned to show that this technology causes deaths as well. There is an autopilot too (and has been for a long time), but that wasn't the point. The point was to show that it's hypocritical to put one particular technology (which increases safety) under tough scrutiny while other technologies cause deaths and everyone is OK with that.
-
Indeed, and my point is that comparing commercial aviation (which has enjoyed centralised coordination for very many decades) to driving an autonomous car (lacking any centralised coordination) has very fundamental flaws.
Now if there was an effort to integrate some kind of centralised vehicle coordination longer term, that would be interesting, but it would also need to have legacy integration.
Tesla calling it an "autopilot" hardly helps!
-
This is the second objection you make in this thread without explaining how or why.
This:
There's a stat going round that Autopilot-equipped Teslas travel around 320 million miles per fatality; on average in the US, all cars (including Teslas) get 86 million miles of travel per fatality. So, yes, a small number of people have died due to Autopilot, but statistically a lot more have lived. We just don't know who they are, because they're not road statistics like they would have been in another car!
Is a (textbook example of a) non-sequitur.
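For reference, the two quoted miles-per-fatality figures can at least be put on a common footing as rates. This arithmetic says nothing about whether the two populations are actually comparable (road type, vehicle age, driver demographics), which is precisely the gap the non-sequitur objection points at:

```python
# Convert the quoted miles-per-fatality figures into fatalities per
# billion miles, so the two numbers are directly comparable as rates.
# Figures are the ones quoted in the thread; no adjustment is made
# for confounders such as highway-only Autopilot use or fleet age.

BILLION = 1e9

autopilot_mpf = 320e6   # miles per fatality, Autopilot engaged (quoted)
us_fleet_mpf = 86e6     # miles per fatality, US fleet average (quoted)

autopilot_rate = BILLION / autopilot_mpf   # ~3.1 fatalities per billion miles
fleet_rate = BILLION / us_fleet_mpf        # ~11.6 fatalities per billion miles

print(round(autopilot_rate, 2), round(fleet_rate, 2))  # 3.12 11.63
```

A roughly 3.7x difference in raw rate is what the quoted stat amounts to; whether that survives controlling for where and when Autopilot is actually engaged is the open question.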
-
What is it you’re disagreeing about? As I stated, almost all commercial passenger transport is in controlled airspace. It’s like that for a reason: safety by centralised coordination. There are many aspects to flight safety, and this is how almost all commercial passenger flights avoid bumping into each other, which is the scope of what we’re discussing here, compared to the lack of centralised coordination an autonomous car is subject to.
(And FWIW, I’m a pilot, based within the London CTR, I am sure you’ll know just how complex that airspace is).
The point is that the road also has all sorts of mechanisms to ensure separation and flow control. Just the concept of lanes keeps cars laterally separated. Intersections are either vertically separated or centrally controlled by traffic lights. Maximum speeds ensure relative speeds are limited. Stopping in high speed areas is not allowed. The flow of traffic on a more macroscopic scale is often controlled in busy areas by the authorities which can divert traffic, add lanes or close roads. Instead of the tower telling you to cross the runway, there's a traffic light. There really isn't much of a difference when you boil things down.
Conversely aircraft have collision avoidance mechanisms in place too, either in the form of electronics or obviously a pilot.
-
This:
There's a stat going round that Autopilot-equipped Teslas travel around 320 million miles per fatality; on average in the US, all cars (including Teslas) get 86 million miles of travel per fatality. So, yes, a small number of people have died due to Autopilot, but statistically a lot more have lived. We just don't know who they are, because they're not road statistics like they would have been in another car!
Is a (textbook example of a) non-sequitur.
Once more, it would help to explain why you think that is the case, because it seems to be a sound argument. We shouldn't be left to guess why you disagree with the quote. Without knowing what you're disputing, it's hard to have a well-argued discussion.
Although I suspect it's more an emotional matter than a rational one. The former obviously defies explanation. Somehow people prefer to die at the hands of a human, even if that ultimately hurts their chances of survival.
-
Once more, it would help to explain why you think that is the case, because it seems to be a sound argument. We shouldn't be left to guess why you disagree with the quote. Without knowing what you're disputing, it's hard to have a well-argued discussion.
Although I suspect it's more an emotional matter than a rational one. The former obviously defies explanation. Somehow people prefer to die at the hands of a human, even if that ultimately hurts their chances of survival.
George of the jungle is obviously very triggered by the idea of giving up his god-given right to kill himself, his family and strangers with his own natural human incompetence at long periods of solid concentration.
He's resorting to textbook discussion-derailing techniques, pushing out the bare minimum of information beyond "I disagree" - the less he claims, the less he has to back up, and maybe that's a good thing, seeing as he's already failed to back up the claims he has made.
-
The point is that the road also has all sorts of mechanisms to ensure separation and flow control. Just the concept of lanes keeps cars laterally separated. Intersections are either vertically separated or centrally controlled by traffic lights. Maximum speeds ensure relative speeds are limited. Stopping in high speed areas is not allowed. The flow of traffic on a more macroscopic scale is often controlled in busy areas by the authorities which can divert traffic, add lanes or close roads. Instead of the tower telling you to cross the runway, there's a traffic light. There really isn't much of a difference when you boil things down.
Conversely aircraft have collision avoidance mechanisms in place too, either in the form of electronics or obviously a pilot.
The primary method of separation for commercial aircraft movement is proactive centralised coordination. As I stated already, deconfliction for road vehicles doesn’t have this kind of coordination; instead it’s piecemeal and reactive. There is no communication between all road users to allow for proactive centralised coordination; we jump in our cars and go. If there were some means of universal proactive coordination between road vehicles, we wouldn’t be having this conversation.
Aviation collision avoidance systems, which do bear similarity to vehicle driver-assistance systems, are reactive and very much a secondary method of separation. They also trigger relatively rarely, because the primary proactive central coordination has already done its job.
Edit: I’ll just add that, as an aside, if there were a universal centralised coordination model for road vehicles, road utilisation could be far higher and delays far lower. Practically speaking, it will be a long time coming, and not all road users will adopt it anyway.
-
The primary method of separation for commercial aircraft movement is proactive centralised coordination. As I stated already, deconfliction for road vehicles doesn’t have this kind of coordination; instead it’s piecemeal and reactive. There is no communication between all road users to allow for proactive centralised coordination; we jump in our cars and go. If there were some means of universal proactive coordination between road vehicles, we wouldn’t be having this conversation.
Again, I object to the notion that it's piecemeal, reactive coordination. The system is much more coherent and refined than that. It's mostly the implementation that differs; road traffic is centrally coordinated. The fact that cars travel in the same direction in very close proximity to each other at very similar speeds on a highway is no accident, it's by design. One might call that passive centralised coordination, but traffic lights and other, more macroscopic flow control aren't even passive.
-
Any road traffic coordination is by nature reactive; vehicles appear at random times in an unplanned manner.
Now, yes, you can sequence traffic lights to improve road usage in a localised manner. This is hardly the same degree of control that an air traffic service mandates inside CAS. For example, you can’t randomly choose to go off piste and turn left or right without first coordinating that with the controller, and such requests are not that uncommon, particularly in bad weather. Sure, a mandatory TCAS resolution advisory demands immediate action, but that is very rare, and you’d want to be communicating any action to the controller ASAP. After all, ATC in CAS is already there to maintain separation. A TCAS RA inside CAS will generally be the result of pilot or controller error, and that’s why it’s rare.
So what I am trying to convey is that autonomous vehicles are relying on a reactive system for collision avoidance. You can talk about roads, lanes and traffic lights, but that is nowhere near the level of control that occurs in CAS. Outside CAS, yes, it’s not a million miles off, where we adhere to the Rules of the Air, and look out of the window a lot!
-
Edit: I’ll just add that, as an aside, if there were a universal centralised coordination model for road vehicles, road utilisation could be far higher and delays far lower. Practically speaking, it will be a long time coming, and not all road users will adopt it anyway.
This is being developed by various research institutes. I expect self-driving vehicles will get some kind of mesh-networking ability to coordinate with each other and with the road infrastructure (traffic lights, traffic density, etc.) in order to use roads more effectively.
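Purely as a sketch of the idea: the kind of periodic status beacon such a mesh might exchange could look something like the following. The field names and JSON encoding here are invented for illustration; real V2X stacks (e.g. the SAE J2735 Basic Safety Message used by DSRC/C-V2X) define their own binary message sets:

```python
# Toy sketch of a vehicle-status beacon for a coordination mesh.
# Field names and the JSON encoding are made up for illustration only;
# they do not reflect any real V2X standard's wire format.
import json
from dataclasses import dataclass, asdict

@dataclass
class VehicleBeacon:
    vehicle_id: str
    lat: float          # latitude, degrees
    lon: float          # longitude, degrees
    speed_mps: float    # metres per second
    heading_deg: float  # 0-360, clockwise from north
    intent: str         # e.g. "keep_lane", "change_left", "braking"

    def encode(self) -> str:
        """Serialise to JSON for broadcast to nearby vehicles."""
        return json.dumps(asdict(self))

beacon = VehicleBeacon("EV-042", 51.5072, -0.1276, 31.3, 90.0, "change_left")
msg = beacon.encode()
decoded = json.loads(msg)
print(decoded["intent"])  # change_left
```

Broadcasting intent ("I'm about to change lanes") rather than just position is what would let following vehicles anticipate the exact evasion scenario in the video, instead of reacting only once the obstacle is revealed.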