General > General Technical Chat
Tesla Full Self Driving (FSD) info - interesting stuff!
wilfred:

--- Quote from: apis on April 30, 2019, 05:39:01 pm ---Waymo disengagement rate for 2018 was 1 disengage per 11,017 miles out of 1.28 million self-driven miles in CA 2018. That corresponds to about one disengage per year for an average driver in the US.


--- End quote ---
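As a quick sanity check on the quoted figures (the average-driver mileage of ~13,500 miles/year is an assumption here, not part of the quote):

```python
# Quoted: 1.28 million self-driven miles in CA 2018, one disengagement
# per 11,017 miles. Average US annual mileage (~13,500) is an assumption.
total_miles = 1_280_000
miles_per_disengage = 11_017

disengagements = total_miles / miles_per_disengage
print(f"Disengagements in 2018: {disengagements:.0f}")  # ~116

avg_driver_miles_per_year = 13_500  # assumed typical US annual mileage
per_year = avg_driver_miles_per_year / miles_per_disengage
print(f"Disengages per average driver-year: {per_year:.1f}")  # ~1.2
```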

Anything above zero means you need a steering wheel and an attentive driver. No steering wheel or no automation. A hybrid mix is the worst of all possibilities. It means you can't trust the automation or you can't blame the driver.

The family of someone struck by an auto needs to seek a just outcome, and lawyers at 20 paces with a large corporation is going to be an issue. You can't treat it the same as a plane crash due to a firmware error: you can't take the cars off the road, and you can't spend millions on a crash investigation.

The only solution I can see is to have each vehicle carry a designated driver who will accept the legal responsibility for the firmware choices. Perhaps they insert a card to load ethical parameters to customise the firmware's programming. Which obviously means you can't set the car to circle the block whilst you pick up your dry cleaning. Nor can you use it to freight children to school or Saturday morning sports. Nor can you get dropped off at the door and have the car go and find a parking spot.

If autos are the solution then I don't think the problem is well understood. Buses and trains will solve congestion better. Clustering people closer to work and community facilities will work better. Children will be better off walking to school close to home or socialising with friends on a bus or train.
apis:

--- Quote from: wilfred on May 01, 2019, 12:45:09 am ---
--- Quote from: apis on April 30, 2019, 05:39:01 pm ---Waymo disengagement rate for 2018 was 1 disengage per 11,017 miles out of 1.28 million self-driven miles in CA 2018. That corresponds to about one disengage per year for an average driver in the US.

--- End quote ---
Anything above zero means you need a steering wheel and an attentive driver. No steering wheel or no automation. A hybrid mix is the worst of all possibilities. It means you can't trust the automation or you can't blame the driver.

--- End quote ---
When it comes to anything safety-critical it would have to be zero (it will never be exactly zero, but it can get close, and most importantly it can be better than a human driver), no argument there.

For the most part safety means don't crash into something or drive off the road, which you can do deterministically quite well using good sensors (lidar), good maps and traditional control theory. That's what Waymo is doing. Tesla has taken a very different approach: they use only cheap sensors (cameras) and then use machine learning algorithms to convert the video feed into a 3D representation of the world. Such machine-learning-based computer vision systems have made really amazing progress in recent decades, but as we have seen, the Tesla system sometimes makes fatal mistakes. The old-school, but more expensive, Waymo approach is much more robust. Because Waymo uses lidar, they don't have to convert a 2D representation into 3D; they get a very accurate 3D representation directly from the lidar sensor (and on top of that they add sensor information from cameras and radar).
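The point about lidar giving 3D directly can be illustrated with a toy conversion: each return already carries its own distance, so recovering a 3D point is plain trigonometry rather than learned depth estimation (a minimal sketch, not any vendor's pipeline):

```python
import math

def lidar_return_to_xyz(range_m, azimuth_rad, elevation_rad):
    """Convert one lidar return (range plus beam angles) to a 3D point.

    Unlike a camera pixel, a lidar return includes a measured distance,
    so this is pure trigonometry, no machine learning involved.
    """
    horizontal = range_m * math.cos(elevation_rad)
    x = horizontal * math.cos(azimuth_rad)   # forward
    y = horizontal * math.sin(azimuth_rad)   # left
    z = range_m * math.sin(elevation_rad)    # up
    return (x, y, z)

# A return 20 m straight ahead at zero elevation lands at (20, 0, 0).
print(lidar_return_to_xyz(20.0, 0.0, 0.0))
```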

Tracking objects and steering the car to avoid them (short term) is a solved problem as long as you have an accurate 3D representation of the world. The most difficult parts are things like trying to predict other drivers' behaviour, pedestrians' intentions and things like that.
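As a toy illustration of short-term prediction, the simplest possible baseline is constant-velocity extrapolation (purely illustrative; real systems use far richer behaviour models):

```python
def predict_position(pos, vel, dt):
    """Constant-velocity extrapolation: the simplest motion prior a
    planner can apply to other road users over a short horizon."""
    return tuple(p + v * dt for p, v in zip(pos, vel))

# A pedestrian at (2, 5) m moving at (0, -1.5) m/s, 2 s ahead:
print(predict_position((2.0, 5.0), (0.0, -1.5), 2.0))  # (2.0, 2.0)
```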

However, there are many corner cases that aren't safety-critical but can still be hard for a car to deal with, e.g. because getting out of the situation requires technically breaking the law (driving somewhere you're not normally allowed to drive when the normal road is blocked). Would it be legal to program the car to break the law? But it's not so critical if the car can't find a route because the road is blocked; it can just drive somewhere it can stop safely and ask the passenger or a tele-operator for help. Maybe there is an obstacle on the road it's not sure how to deal with (there was an incident with a mattress on the road: the self-driving cars stopped for it while human drivers just drove over it). The cars will be programmed to be overly cautious, and it might be annoying if they stop for something they don't have to stop for, but it won't be a major problem as long as it doesn't happen too often. I imagine these kinds of minor problems could be resolved in 5-15 minutes, and if it only happens once a year it won't be so bad. And one would expect the number of disengagements to keep getting lower as the software and sensors improve.
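The "stop safely and ask for help" fallback described above could be sketched as a toy decision policy (all names and thresholds here are hypothetical, not any vendor's API):

```python
def handle_blocked_route(obstacle_confidence, can_reroute_legally):
    """Toy fallback policy for the blocked-road scenario.

    obstacle_confidence: how sure perception is that the obstacle is real.
    Thresholds and action names are illustrative only.
    """
    if obstacle_confidence < 0.5:
        return "proceed_cautiously"  # likely a false positive
    if can_reroute_legally:
        return "reroute"
    # No legal way around: stop somewhere safe and escalate to a human
    # (passenger or tele-operator), as described above.
    return "pull_over_and_request_remote_assistance"

print(handle_blocked_route(0.9, can_reroute_legally=False))
```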


--- Quote from: wilfred on May 01, 2019, 12:45:09 am ---The family of someone struck by an auto needs to seek a just outcome, and lawyers at 20 paces with a large corporation is going to be an issue. You can't treat it the same as a plane crash due to a firmware error: you can't take the cars off the road, and you can't spend millions on a crash investigation.

The only solution I can see is to have each vehicle carry a designated driver who will accept the legal responsibility for the firmware choices. Perhaps they insert a card to load ethical parameters to customise the firmware's programming. Which obviously means you can't set the car to circle the block whilst you pick up your dry cleaning. Nor can you use it to freight children to school or Saturday morning sports. Nor can you get dropped off at the door and have the car go and find a parking spot.
--- End quote ---
I think we will mainly see taxi services for quite some time, not privately owned self-driving cars. If one of the cars causes a problem it would be the taxi company that is responsible (and they in turn would hold the manufacturers of the cars responsible).

You never need to worry about parking: a car picks you up when you need it and drops you off at your destination. It will be like a taxi, but without the driver.


--- Quote from: wilfred on May 01, 2019, 12:45:09 am ---If autos are the solution then I don't think the problem is well understood. Buses and trains will solve congestion better. Clustering people closer to work and community facilities will work better. Children will be better off walking to school close to home or socialising with friends on a bus or train.

--- End quote ---
Buses and trains can also be made self-driving. Buses and trains are great as long as they are filled with people, but they also drive around almost empty for a large part of the day. Proponents of PRT (podcars) have always claimed that on-demand services with smaller cars are more efficient than traditional mass-transit systems. The problem with cars today is that everyone wants their own car, so we have one person driving around with three or more seats empty, and most of the time the car is parked somewhere. We normally don't take taxis today because it's too expensive to pay a human chauffeur, but if you don't need a human driver that changes.
apis:

--- Quote from: apis on May 01, 2019, 06:42:47 pm ---For the most part safety means don't crash into something or drive off the road, which you can do deterministically quite well using good sensors (lidar), good maps and traditional control theory. That's what Waymo is doing. Tesla has taken a very different approach: they use only cheap sensors (cameras) and then use machine learning algorithms to convert the video feed into a 3D representation of the world. Such machine-learning-based computer vision systems have made really amazing progress in recent decades, but as we have seen, the Tesla system sometimes makes fatal mistakes. The old-school, but more expensive, Waymo approach is much more robust. Because Waymo uses lidar, they don't have to convert a 2D representation into 3D; they get a very accurate 3D representation directly from the lidar sensor (and on top of that they add sensor information from cameras and radar).

--- End quote ---
I might add that Waymo also uses machine learning, and Tesla has radar sensors as well as cameras (but no expensive lidar). But from what I know there is a large difference in their approaches, and Tesla relies much more on machine learning.

In the answer to the last question of this talk, the Waymo engineer also mentions that they use a hybrid approach, only using machine learning when it improves safety (even if they have to use the more expensive sensors).

"You want to be safe in the environment, so you don't want to make errors in perception, prediction and planning. And the state of machine learning is not at the point where it never makes errors."

https://youtu.be/Q0nGo2-y0xY?t=3774
Marco:
The problem with radar is that, for the moment, it's just Doppler over a huge FOV, not ToF distance-based imaging... which makes it almost completely useless.
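For reference, a Doppler radar recovers only a target's radial speed from the frequency shift of the echo, v = f_d * c / (2 * f0); a minimal sketch with illustrative numbers (the 77 GHz carrier is typical of automotive radar, the shift value is made up):

```python
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(doppler_shift_hz, carrier_hz):
    """Radial speed of a target from the Doppler shift of its echo:
    v = f_d * c / (2 * f0). This is what plain Doppler radar measures
    directly; range imaging needs ToF or FMCW processing on top."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# 77 GHz automotive radar, 5.14 kHz Doppler shift -> ~10 m/s closing speed
print(radial_velocity(5_140, 77e9))
```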
maginnovision:
I'm pretty sure TI has mmWave radar parts that can give you size, speed, and direction of travel for detected objects. Seems OK to me. Of course, all the inputs are fed into a filter to determine what's most likely happening; you never accept everything at face value. Except for Tesla: they seem to be ML or nothing.
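The "fed into a filter" step can be illustrated with the one-step core of such a filter: inverse-variance weighting of two noisy estimates of the same quantity (a toy sketch, not any vendor's implementation; the sensor noise figures are made up):

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighting, the single-step core of sensor fusion:
    the less noisy measurement gets the larger weight, and the fused
    variance is lower than either input's."""
    w_a = var_b / (var_a + var_b)
    fused = w_a * est_a + (1.0 - w_a) * est_b
    fused_var = (var_a * var_b) / (var_a + var_b)
    return fused, fused_var

# Camera says the car ahead is 52 m away (noisy, variance 4);
# radar says 50 m (variance 1): the fused estimate leans on the radar.
est, var = fuse(52.0, 4.0, 50.0, 1.0)
print(est, var)  # 50.4, 0.8
```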