Author Topic: Thatcham research lab Tesla test - vehicle ahead takes avoiding action  (Read 9467 times)


Offline dmills

  • Super Contributor
  • ***
  • Posts: 2093
  • Country: gb
With the broad beam you are stuck with a compromise Doppler offset, splitting the difference between the boresight velocities at the centre and the edge of the beam. Nevertheless, if you don't do that Doppler offsetting you can't hope to distinguish anything useful from the echoes.
There is a rather neat trick that the sonar guys developed to solve that.

What you do is have three unevenly spaced rx antennas; then, after downconversion, you do the FFT thing and, for each peak above threshold, resolve a bearing based on the phase shift between the channels. The third channel helps you resolve ambiguities.
This lets you resolve relative speed and bearing (and, if you use sufficiently short pulses and a high enough IF bandwidth, also range, but that is tricky because it means high peak power).

Interferometric sonar is the term; in this case I guess it would be "interferometric Doppler radar".
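A minimal numerical sketch of that phase-comparison idea (my own illustration, not from any real system; the antenna spacings and the 77 GHz wavelength are assumed numbers). Each rx antenna sees a wrapped phase of 2*pi*x*sin(theta)/lambda; the wide pair gives a fine but ambiguous bearing, and the uneven third spacing lets a consistency check pick the right ambiguity branch:

```python
import numpy as np

lam = 3e8 / 77e9                      # ~3.9 mm, a 77 GHz automotive radar (assumed)
x = np.array([0.0, 0.7, 1.8]) * lam   # antenna positions, unevenly spaced

def wrapped_phases(theta):
    """Wrapped carrier phase at each antenna for a target at bearing theta (rad)."""
    return (2 * np.pi * x * np.sin(theta) / lam) % (2 * np.pi)

def resolve_bearing(meas):
    """Pick the bearing whose predicted phases best match the measured ones,
    comparing phases on the circle so the 2*pi ambiguity is handled."""
    best, best_err = 0.0, np.inf
    for theta in np.linspace(-np.pi / 2, np.pi / 2, 20001):
        resid = np.angle(np.exp(1j * (wrapped_phases(theta) - meas)))
        err = np.abs(resid).sum()
        if err < best_err:
            best, best_err = theta, err
    return best

true_theta = np.deg2rad(17.0)
est = resolve_bearing(wrapped_phases(true_theta))
print(round(np.rad2deg(est), 2))   # recovers ~17.0 degrees
```

With these spacings (0.7 and 1.8 wavelengths) no two bearings in the field of view share the same phase triple, which is the point of making the spacing uneven.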

Regards, Dan.
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8646
  • Country: gb
If anything, the Tesla did exactly what it should: crash into something it was able to detect as being insubstantial and harmless.
If it has a layer of metallized foil it will be solid to X-band, K-band and optical. Thatcham does testing of Automatic Emergency Braking systems; it's likely the foam car had a decent simulation of a car's radar cross-section.
Quite right. At the frequencies used for these radars it's trivial to make a target which has a realistic radar cross-section and is still harmless in a crash. It's sad that people who apparently lack the knowledge to construct such a thing immediately assume the people at Thatcham are idiots or frauds.
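To put numbers on "trivial" (my own back-of-envelope sketch with assumed dimensions, not Thatcham's actual target design): the peak RCS of a flat conducting plate at normal incidence is sigma = 4*pi*A^2/lambda^2, so even a small foil patch is radar-bright at automotive frequencies.

```python
import math

lam = 3e8 / 77e9          # ~3.9 mm wavelength of a 77 GHz automotive radar (assumed)

def plate_rcs(area_m2):
    """Peak radar cross-section (m^2) of a flat conducting plate at normal incidence."""
    return 4 * math.pi * area_m2 ** 2 / lam ** 2

# A single 10 cm x 10 cm patch of metallized foil on a foam target:
sigma = plate_rcs(0.1 * 0.1)
print(round(sigma, 1))                   # ~82.8 m^2 at boresight
print(round(10 * math.log10(sigma), 1))  # ~19.2 dBsm, in the range often quoted for cars
```

The peak falls off quickly away from normal incidence, but a few patches at different angles would plausibly cover the aspects an AEB test needs.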
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Even if the target is just polystyrene, should the LIDAR not have some functionality here?

I agree, Thatcham has a remarkable pedigree over many decades of vehicle testing, there is no reason for them to fake the test.

It does bring home that the marketing of self-driving cars is somewhat over-hyped. From a safety perspective it's worrying that it's not self-driving at all: by our very nature, we humans will be lured away by the distraction of other things. The recent Uber incident in Phoenix is a case in point.

 

Offline wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
Even if the target is just polystyrene, should the LIDAR not have some functionality here?
There is no LIDAR
Quote
I agree, Thatcham has a remarkable pedigree over many decades of vehicle testing, there is no reason for them to fake the test.
I doubt they are experts in RF.
« Last Edit: June 14, 2018, 08:44:51 pm by wraper »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26906
  • Country: nl
    • NCT Developments
If anything, the Tesla did exactly what it should: crash into something it was able to detect as being insubstantial and harmless.
If it has a layer of metallized foil it will be solid to X-band, K-band and optical. Thatcham does testing of Automatic Emergency Braking systems; it's likely the foam car had a decent simulation of a car's radar cross-section.
Quite right. At the frequencies used for these radars it's trivial to make a target which has a realistic radar cross-section and is still harmless in a crash. It's sad that people who apparently lack the knowledge to construct such a thing immediately assume the people at Thatcham are idiots or frauds.
Nobody is claiming that, but it wouldn't be the first time that new or different technology requires different ways of testing. The people at Thatcham should at least do more research into how valid their methods are when testing Tesla's technology. Edit: based on the videos showing that Tesla cars can avoid other cars in similar real-life scenarios.
« Last Edit: June 14, 2018, 09:43:57 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Even if the target is just polystyrene, should the LIDAR not have some functionality here?
There is no LIDAR

If that’s the case, oh dear. The state of driving automation isn’t quite what the marketing folks would have you believe.

Quote
Quote
I agree, Thatcham has a remarkable pedigree over many decades of vehicle testing, there is no reason for them to fake the test.
I doubt they are experts in RF.

You’re so very, very desperately underestimating Thatcham’s capabilities.
 

Offline wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
If that’s the case, oh dear. The state of driving automation isn’t quite what the marketing folks would have you believe.
Lidar is nearly useless in snow, rain or fog.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
If that’s the case, oh dear. The state of driving automation isn’t quite what the marketing folks would have you believe.
Lidar is nearly useless in snow, rain or fog.

Indeed, it’d need to slow down to accommodate such conditions, just as a human would do.
 

Offline wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
Indeed, it’d need to slow down to accommodate such conditions, just as a human would do.
The issue is that it starts to detect objects which aren't actually there.
 

Offline wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
BTW Tesla claims it relies primarily on radar. https://www.tesla.com/en_EU/blog/upgrading-autopilot-seeing-world-radar?redirect=no
They also say that the radar can still detect humans, but not plastic.

Quote
After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar. Photons of that wavelength travel easily through fog, dust, rain and snow, but anything metallic looks like a mirror. The radar can see people, but they appear partially translucent. Something made of wood or painted plastic, though opaque to a person, is almost as transparent as glass to radar.
 

Online CatalinaWOW

  • Super Contributor
  • ***
  • Posts: 5231
  • Country: us
Indeed, it’d need to slow down to accommodate such conditions, just as a human would do.
The issue is that it starts to detect objects which aren't actually there.

Which is what I and many other people do in impaired-visibility conditions. It makes driving under those conditions extremely nerve-wracking and something to be avoided if at all possible.

Now, whether the autonomous system deals with these false detections better or worse than a human driver is unknown to me, but I am sure the autonomous systems can be and are being improved, while I am equally sure that humans are essentially static. So the question is not whether robots will be better drivers than people, but when they will be. Tesla isn't there yet (and doesn't claim to be). Tesla does seem foolish to allow a less-than-fully-capable system out into the real world, as they should know that people will depend on the system for more than it can actually do.
 

Online Marco

  • Super Contributor
  • ***
  • Posts: 6721
  • Country: nl
Which makes it inferior to a car which ensures an attentive driver ...
:palm: Nothing is 100% reliable. Neither human nor autopilot will ever be 100% reliable. As long as the autopilot is more reliable than the human, the autopilot wins. Please explain how a 99.95% reliable human is better than a 99.99% reliable autopilot?

I did, in the bit you left out. Lane following is the key feature which fucks things up.

I am not comparing an autonomous car to a human-driven car here; I'm comparing a drive-assist car which by design ensures inattentive drivers to a drive-assist car which isn't quite so insane. The insane part is just my opinion, but if the next time it plows into a firetruck there is a first responder in between, the government will agree with me, and all the disingenuous bullshit about autopilot not being a misleading term will come home to roost.

As I said, Tesla is playing with fire.
« Last Edit: June 15, 2018, 12:16:33 am by Marco »
 

Offline Mr. Scram

  • Super Contributor
  • ***
  • Posts: 9810
  • Country: 00
  • Display aficionado
Which makes it inferior to a car which ensures an attentive driver... Brake assist is fine, lane-drift warning is fine, obstacle warning is fine; automatic lane following is fucking stupid. Lane following trades safety for convenience, and once it kills someone other than the driver, government will very quickly decide it's not a worthy trade-off, especially if it's a cop or first responder. Tesla is playing with fire.
It only makes it inferior if the number of accidents per distance driven is higher, or the number of casualties. You can fiddle a bit with what you feel is most important, but effectiveness is what ultimately decides whether it's a success or not. Humans are terrible at mundane tasks over long periods of time, and that's where technology can bring massive improvements.

Developing this successfully will take some time and, unfortunately, also some casualties. No progress comes for free. We need to stop pretending automated cars need to be perfect. Humans are terrible drivers, and we quite readily accept a huge pile of casualties caused by human error each year. If automated cars improve upon that by even 10% over a span of 10 years, it should pretty much be a done deal.

Unfortunately, public perception can be a huge pain and just a few casualties might hamper development for years to come. It's unfortunately a bit of a trolley problem, but shooting yourself in the foot isn't much use to anyone.
 

Online Marco

  • Super Contributor
  • ***
  • Posts: 6721
  • Country: nl
Are you going to dispute that the Tesla would get safer if you left all the automation in except the lane following? Tesla has a conflicting interest with regard to Autopilot; it will never ensure the driver attentiveness necessary for the best safety, or anything close to it.

Autopilot is not a path to full autonomy. Such cars need to be developed with a system for ensuring driver attention so annoying that only someone getting paid would put up with it (the Uber driver was paid to pay attention to the diagnostic screen; she was actually doing her job, and she had no way of knowing the engineers were so fucking incompetent that her job amounted to driving dangerously).

 

Offline Mr. Scram

  • Super Contributor
  • ***
  • Posts: 9810
  • Country: 00
  • Display aficionado
Are you going to dispute that the Tesla would get safer if you left all the automation in except the lane following? Tesla has a conflicting interest with regard to Autopilot; it will never ensure the driver attentiveness necessary for the best safety, or anything close to it.

Autopilot is not a path to full autonomy. Such cars need to be developed with a system for ensuring driver attention so annoying that only someone getting paid would put up with it (the Uber driver was paid to pay attention to the diagnostic screen; she was actually doing her job, and she had no way of knowing the engineers were so fucking incompetent that her job amounted to driving dangerously).
I don't know. We don't know. However, the problem is partly legislative. The current situation is a workaround because legislators aren't comfortable allowing fully autonomous cars, probably for fear of public outrage. So they field the next best thing: something that's basically that, yet isn't officially called an autopilot, and where the driver gets the short end of the legal stick.

It's a bit of a chicken and egg problem, or more accurately a variant of the trolley dilemma. People are going to object for fair reasons, but not doing it is probably even more costly.
 

Offline ivaylo

  • Frequent Contributor
  • **
  • Posts: 661
  • Country: us
Why does it have to have a specific radar signature? A piece of dropped furniture there can kill you all the same if you don't brake or move...
It's not even furniture; it's an inflatable balloon on a very weak frame, basically transparent to the radar. Also, how often do you see furniture on the road, with the car in front of you avoiding it at the last moment?
I see fallen furniture (or similar 1+ meter objects) on the road about once every week. Have to take similar evasive action probably once a month. California Highway 101 between San Jose and San Francisco...
I don't understand the insistence that the test is somehow flawed because the object on the road wasn't a real car. It could be anything you encounter while driving: animal, human, hanging cable, stuff fallen from trucks (including furniture), you name it. Do we now put radar reflectors on everything because current self-driving cars rely on radar? Still haven't seen a proper pothole test either. Promising technology, but it will be a while.
 

Offline wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
I see fallen furniture (or similar 1+ meter objects) on the road about once every week. Have to take similar evasive action probably once a month.
With a car in front of you evading it in the last possible moment?
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8646
  • Country: gb
Even if the target is just polystyrene, should the LIDAR not have some functionality here?
I believe the only commercial car with LIDAR is the new Cadillac CT6 with Super Cruise. I don't know where Cadillac put the LIDAR hardware; in the autonomous cars from Waymo, Uber and others it's in a big ugly box on the roof of the car. The CT6 looks like any other large luxury car.
 

Offline GeorgeOfTheJungle

  • Super Contributor
  • ***
  • !
  • Posts: 2699
  • Country: tr
Humans are terrible drivers and we quite readily accept a huge pile of casualties caused by human error each year.

There are terrible drivers and there are excellent drivers and everything in between. In any case, I'd prefer to be killed by human error than by the fake "autopilot" pile of crap of a narcissistic megalomaniac named Musk, or whatever.

There's not going to be, for a very, very long time, any "autopilot" that comes even close to a good human driver, because the technology and know-how to pull that off doesn't exist yet. At least not with the streets and roads as they are now, not without help from additional support infrastructure, in which case it won't really be an "autopilot".

The best you can say of the autopilot is that it has killed dozens already. The funny thing is that, in Elon's words, it's never, not even once, been the autopilot's fault. Eat this. And if it has not killed more people, it's because there was a human driver supervising.
« Last Edit: July 02, 2018, 02:07:14 pm by GeorgeOfTheJungle »
The further a society drifts from truth, the more it will hate those who speak it.
 

Offline GeorgeOfTheJungle

  • Super Contributor
  • ***
  • !
  • Posts: 2699
  • Country: tr
As I said, a pile of crap:





There are dozens more. This has to stop.
« Last Edit: June 15, 2018, 07:55:19 am by GeorgeOfTheJungle »
The further a society drifts from truth, the more it will hate those who speak it.
 

Offline wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
As I said, a pile of crap:
Then post the other 40,000 car crash fatalities caused by humans annually in the USA alone.
 
The following users thanked this post: Mr. Scram

Offline Pinkus

  • Frequent Contributor
  • **
  • Posts: 773
Tesla has deliberately named its system "Autopilot" to set it apart from the systems of other automakers. At the same time, they write in small print that the system is only an assistance system and that the driver must drive himself and keep his hands on the steering wheel at all times.
All other manufacturers are responsible enough to only ever call their systems "assistance systems" (and I'm pretty sure all the other manufacturers had marketing meetings examining whether they should or could also call their own system an "autopilot").
Tesla, on the other hand, inflates it from a marketing point of view, well aware that they are endangering lives, because there are enough drivers who trust the term "autopilot".

In my opinion, Tesla is clearly to blame for a large part of such accidents simply because of the choice to call it "Autopilot" and because they make no effort to rename it. A responsible company would take on that renaming... but no, then the share price would fall by 5 points, so they don't give a sh*t.
 
The following users thanked this post: GeorgeOfTheJungle

Offline GeorgeOfTheJungle

  • Super Contributor
  • ***
  • !
  • Posts: 2699
  • Country: tr
As I said, a pile of crap:
Then post another 40 000 car crash fatalities caused by humans annually in USA alone.

And now we have to add to those a few dozens moar due to the autopilot...
The further a society drifts from truth, the more it will hate those who speak it.
 

Offline wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
And now we have to add to those a few dozens moar due to the autopilot...
Dozens? Deaths that occurred while using Autopilot are still in the low single digits, and that's after a few years of usage. Then I should say that hundreds of thousands mourn each year in the US alone for deaths that happened in regular crashes, and many millions worldwide.
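For what it's worth, the raw totals being argued about can be normalised into the usual per-mile rate (the US figures below are rough public ballpark numbers, assumed for illustration; the thread gives no comparable Autopilot mileage figure, so only the human side is computed):

```python
# Normalise US fatality counts into fatalities per 100 million vehicle miles,
# the unit such statistics are usually quoted in. Both inputs are ballpark
# assumptions for illustration, not verified statistics.
us_deaths_per_year = 40_000
us_vehicle_miles_per_year = 3.2e12   # roughly 3.2 trillion miles driven annually

human_rate = us_deaths_per_year / (us_vehicle_miles_per_year / 1e8)
print(human_rate)   # 1.25 deaths per 100 million miles
```

Any fair human-versus-Autopilot comparison would need Autopilot's own miles-driven denominator, which neither side of this exchange has supplied.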
 

Offline GeorgeOfTheJungle

  • Super Contributor
  • ***
  • !
  • Posts: 2699
  • Country: tr
So what? How does that excuse more deaths courtesy of the autopilot? Do you realize these people wouldn't be dead now if it were not for the autopilot?
The further a society drifts from truth, the more it will hate those who speak it.
 

