So according to Tesla, to use the "autopilot" correctly you have to be all but driving the vehicle yourself anyway. So what's the point in even having it?
It does sound as if you need to at least read the manual (release notes?) pretty thoroughly.
P.S. I don't know what level of visual input/acuity the system has. I noticed that the lead car indicated before it changed lanes; maybe it would have detected brake lights if they had been applied? I suppose it didn't need to be a stationary vehicle in the lane - it would presumably have behaved the same with a car parked at the side of the road, a road crew, etc., assuming the driver hadn't disengaged autopilot (how do you know when to expect such occurrences?).
I don't know - when I'm driving I generally keep enough distance between me and the car in front that if they stopped suddenly, so could I. That's kind of learn-to-drive 101? I've been in that exact situation on the freeway and not had a problem stopping in time.
On realising that I could not stop in time, I would probably just follow the car that changed lanes, because even if I collided with someone in the adjacent lane, the impact between two cars travelling at the same speed and direction would be a lot less than going into the back end of a stationary car at speed. If you were attentive to your surroundings in the first place, you would have a higher degree of confidence that the adjacent lane was clear anyway.
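The intuition about relative speed checks out with a back-of-the-envelope sketch (the mass and speeds below are purely illustrative assumptions):

```python
# Rough sketch: the energy available to do damage in a collision scales
# with the square of the closing (relative) speed, so rear-ending a car
# moving at nearly your own speed is far gentler than hitting a
# stationary one. Mass and speeds here are illustrative assumptions.
def closing_energy_joules(mass_kg, v_self_mps, v_other_mps):
    v_rel = v_self_mps - v_other_mps  # closing speed
    return 0.5 * mass_kg * v_rel ** 2

MASS = 1500.0  # assumed car mass, kg
V = 30.0       # own speed, ~108 km/h

# Car ahead doing 28 m/s vs. a stationary car:
print(closing_energy_joules(MASS, V, 28.0))  # 3000.0 J
print(closing_energy_joules(MASS, V, 0.0))   # 675000.0 J
```

A 2 m/s closing speed gives a nudge; a 30 m/s closing speed gives over 200 times the energy.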
So I'm just going to say the Tesla was following too closely in what were otherwise ideal road conditions; perhaps it's programmed to break the road rules like everyone else does.
Having said that, the video shows this experiment being done exactly once, so it's a bit of a coin toss.
You may have a similar fate as the Tesla if you only focus on the car in front. I was taught that one should always look at least two cars ahead, for this reason. If you see an obstacle ahead of the car in front, you'll be more prepared to react than if all you look at is the rear of the car you're following.
That's something these "self-driving" cars can't do at all.
You may have a similar fate as the Tesla if you only focus on the car in front. I was taught that one should always look at least two cars ahead, for this reason. If you see an obstacle ahead of the car in front, you'll be more prepared to react than if all you look at is the rear of the car you're following.
This is how I drive, and it has reduced the anxiety of driving significantly. It's also why I dislike being behind any vehicle that prevents me from such observation. Trucks aren't so bad, since their drivers are a bit more aware of the idiots in front of them and tend to allow for that - plus they are bigger and will tend to (ahem) "clear" the road if they make contact. It's the smaller vehicles such as vans and SUVs, with tinting made from sheets of black-hole material, that make seeing what's in front impossible.
One big concern is that if there is a hazard in front of such a vehicle, they may change lanes to avoid it rather than braking. This will leave you to discover the hazard only when they move out of the way - and you may not have time to take action. The lane-change manoeuvre they performed may not be available to you if there is a vehicle in the lane next to you.
This sort of thing is a challenge for human drivers - but even those who aren't as attentive may notice something like the screeching of tyres or some other indication ... can we expect the same from AVs?
Teslas (at least ones with radar?) can look ahead past what the driver can see, and react appropriately. Here's a video of one doing pretty much exactly this in the real world.
I'm not sure the dummy cars in that BBC test have an appropriate radar signature to make this a proper test. Or maybe they are using a model of Tesla that doesn't have radar?
Here's more random autopilot catches... including another one similar to the first link, with a 4WD (why are so many 4WD drivers idiots - is it because they know they are in a big car and so won't pay the ultimate price for their awful driving?) slamming on the brakes in heavy but fast-moving express traffic to catch an exit they should have missed...
Teslas (at least ones with radar?) can look ahead past what the driver can see
It might under some circumstances be able to fish a Doppler signature out of the noise, but if it misses a fire truck we can safely say it's not 100%.
Teslas (at least ones with radar?) can look ahead past what the driver can see, and react appropriately. here's a video of it doing pretty much exactly this in the real world.
That video shows a situation very unlike the one in the Thatcham video. The relevant vehicles are moving, so there is Doppler. Many of these car radar systems struggle with stationary objects.
We need to get these ASAP so we are not at the mercy of the courts and Uber.
It might under some circumstances be able to fish a Doppler signature out of the noise, but if it misses a fire truck we can safely say it's not 100%.
It will never be 100%. Somehow people got the weird idea it would be.
That video shows a situation very unlike the one in the Thatcham video. The relevant vehicles are moving, so there is Doppler. Many of these car radar systems struggle with stationary objects.
Doppler works with relative motion, so cars not moving when you are shouldn't cause issues. Stationary vehicles are basically terrain features anyway and those aren't invisible either.
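To make the relative-motion point concrete, here's a toy calculation. It assumes a 77 GHz carrier (a common automotive radar band) and the standard monostatic approximation f_d = 2·v_rel/λ; the speeds are made up:

```python
# Toy Doppler calculation, assuming a 77 GHz automotive radar and the
# monostatic approximation f_d = 2 * v_rel / wavelength.
C = 3.0e8              # speed of light, m/s
WAVELENGTH = C / 77e9  # ~3.9 mm

def doppler_shift_hz(own_speed_mps, target_speed_mps):
    """Approximate Doppler shift of the echo from a target directly ahead."""
    v_rel = own_speed_mps - target_speed_mps  # closing speed
    return 2 * v_rel / WAVELENGTH

# A car stopped in your lane while you do 30 m/s still closes at 30 m/s,
# so it produces a strong shift; a car matching your speed produces none.
print(doppler_shift_hz(30.0, 0.0))   # ~15.4 kHz
print(doppler_shift_hz(30.0, 30.0))  # 0.0
```

So "stationary" only means zero Doppler relative to the ground, not relative to the radar.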
In the Thatcham video, the "car" forming an obstruction is a dummy, and it's not metallic. I'd be surprised if it shows up on radar.
I'd like to see the exercise repeated with a layer of foil over the back of the dummy car, to give it a radar signature more like a real one.
Why does it have to have a specific radar signature? A piece of dropped furniture there can kill you all the same if you don't brake or move...
Sure, but the furniture can damage and/or kill occupants in the car even if a human being is driving the vehicle.
An actual driver can see the furniture and still not be able to react in time.
In the Thatcham video, the "car" forming an obstruction is a dummy, and it's not metallic. I'd be surprised if it shows up on radar.
I'd like to see the exercise repeated with a layer of foil over the back of the dummy car, to give it a radar signature more like a real one.
That could've been a brick wall with this logic. I just think Tesla thinks it can apply agile development principles to self-driving, fix bugs with a software update after someone dies, and play with our lives. I think it is closer to level 1 self-driving than the "autopilot" or level 3 it claims to be.
Easy situation for a human to predict.
Really hard one for a Tesla. Pretty easy though if the car in front lets the car behind know what _it_ can see and what it is about to do.
Car to car and some sort of car to borg control comms is what we need to make things really safe and smart.
Why does it have to have a specific radar signature? A piece of dropped furniture there can kill you all the same if you don't brake or move...
It's not even furniture - it's an inflatable balloon on a very weak frame, basically transparent to the radar. Also, how often do you see furniture on the road, with the car in front of you avoiding it at the last moment?
In the Thatcham video, the "car" forming an obstruction is a dummy, and it's not metallic. I'd be surprised if it shows up on radar.
I'd like to see the exercise repeated with a layer of foil over the back of the dummy car, to give it a radar signature more like a real one.
That could've been a brick wall with this logic. I just think Tesla thinks it can apply agile development principles to self-driving, fix bugs with a software update after someone dies, and play with our lives. I think it is closer to level 1 self-driving than the "autopilot" or level 3 it claims to be.
The type of major obstruction which is far and away the most likely to be present on a highway is a vehicle, not a wall.
Nevertheless, the problem here isn't the Tesla; it's the journalism. The test in the Thatcham video is so obviously, so deeply flawed, that it can only have been made to illustrate a point about "self-driving" cars not being perfect for a non-technical audience.
To anyone with the first clue about how autonomous (or semi-autonomous, or "assisted", or whatever) vehicles sense their surroundings, a test with a lightweight cardboard "car" is meaningless.
If anything, the Tesla did exactly what it should: crash into something it was able to detect as being insubstantial and harmless, rather than risk an emergency manoeuvre which could have resulted in getting rear-ended by another vehicle.
That video shows a situation very unlike the one in the Thatcham video. The relevant vehicles are moving, so there is Doppler. Many of these car radar systems struggle with stationary objects.
Doppler works with relative motion, so cars not moving when you are shouldn't cause issues. Stationary vehicles are basically terrain features anyway and those aren't invisible either.
The current radars in cars have a rather broad beam (or often 2 or 3 rather broad beams), and they don't scan. They aren't scanning individual objects one by one. They just get a horrible messy return from each of the 2 or 3 sensors, which are the aggregates of everything within each beam. It is only the camera systems and LIDAR which can apply some finesse in their scanning of the environment. All you can do with the radar signal is use doppler to distinguish things. For a narrow beam you can offset the returning doppler by the boresight velocity of the emitter. With the broad beam you are stuck with using a compromise doppler offset, compromising between the boresight velocities at the centre and the edge of the beam. Nevertheless, if you don't do that doppler offsetting you can't hope to distinguish anything useful from the echoes.
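The boresight-compromise point can be sketched numerically. Assuming (illustratively) a 77 GHz carrier, 30 m/s own speed and a beam extending ~20 degrees off boresight: stationary clutter at angle θ returns a Doppler of 2·v·cos(θ)/λ, so subtracting the boresight (θ = 0) value only cancels it at the centre of the beam.

```python
import math

# Stationary clutter at angle theta off boresight returns
# f_d = 2 * v * cos(theta) / wavelength. Subtracting the boresight
# (theta = 0) offset cancels it only at the beam centre; off-axis
# returns keep a residual. Carrier and beam width are assumptions.
C = 3.0e8
WAVELENGTH = C / 77e9
V_OWN = 30.0  # own speed, m/s

def clutter_doppler_hz(theta_deg):
    return 2 * V_OWN * math.cos(math.radians(theta_deg)) / WAVELENGTH

boresight = clutter_doppler_hz(0.0)
for theta in (0.0, 10.0, 20.0):
    residual = clutter_doppler_hz(theta) - boresight
    print(f"{theta:4.1f} deg off boresight: residual {residual:8.1f} Hz")
```

The residual grows towards the beam edge, which is why a single compensation value is always a compromise for a broad beam.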
It will never be 100%. Somehow people got the weird idea it would be.
Which makes it inferior to a car which ensures an attentive driver ... brake assist is fine, lane-drift warning is fine, obstacle warning is fine, automatic lane following is fucking stupid. Lane following trades safety for convenience, and once it kills someone other than the driver, government will very quickly decide it's not a worthy trade-off, especially if it's a cop or first responder. Tesla is playing with fire.
If anything, the Tesla did exactly what it should: crash into something it was able to detect as being insubstantial and harmless
If it has a layer of metallised foil it will be solid to X-band, K-band and optical. Thatcham does testing of Automatic Emergency Braking systems, so it's likely the foam car had a decent simulation of a car's radar cross-section.
It will never be 100%. Somehow people got the weird idea it would be.
Which makes it inferior to a car which ensures an attentive driver ...
Nothing is 100% reliable. Neither human nor autopilot will ever be 100% reliable. As long as the autopilot is more reliable than the human, the autopilot wins. Please explain how a 99.95%-reliable human is better than a 99.99%-reliable autopilot?
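Taking those figures as purely illustrative (they're made-up percentages, not measured data), the arithmetic behind the comparison looks like this:

```python
# Toy arithmetic on the made-up reliability figures above.
human_reliability = 0.9995
autopilot_reliability = 0.9999

events = 10_000  # some arbitrary unit of driving "events"
human_failures = (1 - human_reliability) * events
autopilot_failures = (1 - autopilot_reliability) * events

# Roughly 5 failures vs 1 failure per 10,000 events.
print(round(human_failures), round(autopilot_failures))
```

A small-looking difference in the reliability figure is a fivefold difference in failure count.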
I think it depends what they were trying to achieve with the video.
I don't doubt for a minute that the folks at Thatcham know their stuff, which makes it all the more surprising that they'd produce a video showing such an obviously flawed demonstration. To my mind, the only plausible explanation is that they're trying to make a point - and a valid one at that - but have sacrificed good science for theatrics. That's a shame.
I'm not sure the dummy cars in that BBC test have an appropriate radar signature to make this a proper test. Or maybe they are using a model of Tesla that doesn't have radar?
Here's more random autopilot catches... including another one similar to the first link, with a 4WD (why are so many 4WD drivers idiots - is it because they know they are in a big car and so won't pay the ultimate price for their awful driving?)
Looking at these videos, I think there is a problem with Thatcham's setup: their dummy cars aren't detectable as hazardous objects. After all, the Tesla can drive into them without damage.