I reckon it will be a *lot* longer than 10 years before we see autonomous cars - if ever. Don't forget the engineering maxim: "the first 90% of a project takes 10% of the time; the last 10% takes the other 90%". I.e. the easiest parts are tackled first and the remaining issues get progressively harder - autonomous driving being a perfect example.
Hell, in the UK the 'smart' meter development programme has been running for over 10 years and the costs could rise to £20B - for a bloody meter that only has to count kilowatt-hours and periodically send the numbers to a data centre. And it's only just started rolling out meters that meet the current standard. That should be way simpler than autonomous driving - though like most developments benefitting from government direction (interference) it's way more complicated than it should have been.
Double hell - I'm still waiting for a house robot that will fetch me a beer and find the TV remote, after 50+ years of promises from 'Tomorrow's World' etc. How hard can that be?
The basics of driving, navigation and situational awareness are relatively straightforward (but not easy). The problem is the myriad of corner cases and unexpected situations that have to be dealt with - the traffic sign that has slipped and is now upside down; the washing machine that fell off the back of a truck; the breakdown requiring a short reverse down a one-way street; etc., ad infinitum. If a car stops every time it encounters a situation it doesn't know how to deal with, gridlock will quickly follow.
You either have to have a team of very clever people designing the hardware and software to reliably deal with every conceivable possibility, plus some way to handle the inconceivable ones (or, much more likely, the conceivable ones that didn't occur to the developers because, being hip and eco, they only use electric scooters and public transport and don't drive). Or you have to come up with something that can work out the best course of action for itself in every situation that arises. The first would require vast engineering resources, the latter extremely good AI. That may be the answer - I don't know how good the state of the art is, but I'm not holding my breath.
Some of the more interesting problems will arise from the non-autonomous vehicles co-occupying the roads. Human drivers will have a great deal of fun exploiting the safety-first characteristics of self-driving cars by, for example, forcing their way into traffic from a side street, knowing that the autonomous car will be programmed to give way to avoid a collision. Or pranksters competing to come up with the funniest/cleverest scheme to halt or misdirect autonomous cars by the subtle application of bits of tape to road signs, etc. Changes to the law will probably be the answer, along with automated reporting of dash-cam footage showing aggressive human driving behaviour, but these will take time to implement and both sides will adapt and evolve; how long will that take?
For as long as I can remember there has been no shortage of headlines trumpeting major scientific breakthroughs every few months. But almost invariably, after the initial excitement, it turns out that they've only developed the first 90%, and all the last 10% requires is a few minor wrinkles to be ironed out, somebody to decide the best shade of blue to paint it, a solution to the Goldbach conjecture, and the discovery of some new materials that can withstand 5,000°C whilst being a bit stronger than graphene at half the weight.
Humans drive when drunk and when copulating (and probably both at once); they also drive while using their smartphones, or when they're too tired or distracted by other things. They also cause accidents when at their best. But in the end it just comes down to who kills the fewest people. I don't care whether I get run over by a human or a self-driving car; what I care about is how likely it is to happen.
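To put that "who kills the fewest people" point in concrete terms, the comparison that matters is deaths per unit of exposure (say, per billion miles driven), regardless of who or what is at the wheel. A minimal sketch of that arithmetic, using entirely made-up numbers for illustration only:

```python
# Compare human vs autonomous drivers by fatality rate per exposure.
# ALL figures below are hypothetical placeholders, not real statistics.

def fatality_rate(deaths: float, miles_driven: float) -> float:
    """Deaths per billion miles driven."""
    return deaths / miles_driven * 1e9

# Hypothetical inputs: deaths observed over a billion miles of driving.
human_rate = fatality_rate(deaths=11, miles_driven=1e9)
autonomous_rate = fatality_rate(deaths=7, miles_driven=1e9)

safer = "autonomous" if autonomous_rate < human_rate else "human"
print(f"Human: {human_rate:.1f} deaths/bn miles, "
      f"autonomous: {autonomous_rate:.1f} deaths/bn miles")
print(f"On these made-up numbers, the {safer} driver is the lower risk.")
```

The point of the sketch is only that the rational comparison is rate against rate, which is exactly what the next paragraph argues most observers won't make.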
You might not care, but almost everybody else will. People tend to get very excited about deaths caused by things outside people's control and very quickly start baying for blood/something to be done/heads to roll. Rational consideration of cost-benefit tradeoffs won't be at the forefront of most observers' minds (unless they happen to be engineers). When a human f**ks up and kills someone, it's an accident. When something manufactured f**ks up and kills someone because of a design defect, it's a company/corporation head that has to roll. Laws will have to change radically to allow autonomous vehicles to operate without the threat of company-destroying liability costs arising from every incident involving human injury or death.
Case in point: it's my suspicion that railways in the UK are much too safe. The death rate on the railways is extremely low compared to the roads - a result of public demands for change after every major accident. That almost invariably means more cost, and as a result we have a fabulously safe but hideously expensive railway network that hardly anyone uses because it's too expensive. E.g. a child gets killed at an uncontrolled crossing, resulting in all such crossings being closed and fences being erected alongside every inch of track to prevent any inadvertent access. When I say hardly anyone, that's relative to road usage of course - our trains are regularly overcrowded but account for less than 9% of passenger miles. (Yes, I do understand that many of the most important safety improvements don't incur significant costs - but many do.)
For those interested, there are plenty of YouTube videos showing the difficulties and problems that self-driving cars get into.