That's just down to how insurance works, though. An at-fault claim means your insurer had to pay out. Since you came off your bike, presumably damaging it in the process, and there was no other party to take the blame for the accident, your insurer pays out.
My point is, there's a really important subtlety to the terminology here.
Was I, in practical engineering terms, "at fault"?
No. You were not, in the strictest sense, legal or otherwise, at fault. But...
In this case, there really wasn't one. The fault was with whoever covered the road in diesel. Yet, as you point out, the way insurance works is that they insist on this idea that whichever named person is "at fault" is the one whose insurer pays out - and in the absence of a name, as in this case, they fall back to an error handler which intentionally breaks the database.
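To make that metaphor concrete, here's a minimal Python sketch of the fallback as I understand it; the `Claim` shape and the function name are my own invention, not anything a real claims system exposes:

    # Hypothetical sketch of the insurer's fault-assignment fallback.
    # All names here are made up for illustration.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Claim:
        claimant: str
        third_party: Optional[str]  # named person who caused the loss, if any

    def assign_fault(claim: Claim) -> str:
        if claim.third_party is not None:
            return claim.third_party      # recover costs from their insurer
        # The "error handler": no name to pin it on, so the claimant is
        # recorded as at fault, even if their riding exhibited no fault.
        return claim.claimant

    print(assign_fault(Claim("rider", None)))  # -> "rider"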
The list of individuals labelled "at fault" no longer corresponds to the list of individuals whose driving exhibited a fault, and if information from the database were used to inform developers of self-driving equipment, somewhere there would be an engineer tasked with making a bike not fall over when it loses all grip at one of its wheels. Good luck with that.
... as far as the insurer is concerned, you cost them £x more that day than the day before. They had to cut you a cheque, or pay for repairs to the bike. Injuries could have occurred. As far as the insurer is concerned, you are now a higher risk.
The granularity of this is poor, but look at it from the insurer's perspective (I know, it's hard to see insurers as reasonable, but work with me here). Before you have an accident, all an insurer knows is your age, your vehicle, where you likely use that vehicle, and maybe some ancillary details like your profession and employer. They build a risk model with that information, bet that they'll only have to pay out, say, £500 on average that year, add their admin cost, and present you with a premium of, say, £550. If you go and have an accident, you change that risk model. Even if it's an accident which isn't fundamentally your *actual* fault, they might argue that if you had been travelling 10 mph slower, or riding at a different time of day, or on different roads, you would have been less likely to be involved and wouldn't have made the claim. It would be nice if the model took more information in, but they've probably only just moved on from giant actuarial tables for risk assessment. Insurance companies move slowly!
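To put rough numbers on that, here's the pricing bet as a toy Python sketch; the figures and the post-claim loading factor are made up for illustration, not real actuarial practice:

    # Toy premium model: premium = expected payout + admin cost.
    # All figures are invented, not real underwriting numbers.
    expected_payout = 500.0   # what they bet they'll pay out on average
    admin_cost = 50.0

    def premium(expected_payout: float, admin_cost: float) -> float:
        return expected_payout + admin_cost

    print(premium(expected_payout, admin_cost))  # 550.0

    # After any claim, at fault or not, the insurer revises the expected
    # payout upward: you're now observed to end up in riskier situations.
    claim_multiplier = 1.4    # made-up loading factor
    print(premium(expected_payout * claim_multiplier, admin_cost))  # 750.0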
If you want better granularity, you get things like black boxes which micro-analyse your driving. This often results in premiums dropping for genuinely careful drivers, but suddenly the drivers who are 'worse' find their premiums going up, because they were previously sitting comfortably in the middle of the risk distribution, benefiting from the safer-than-average drivers who kept the pooled premium low. (Side note: the lack of transparency of these algorithms is one reason I will never have one of these boxes fitted.)
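You can see the redistribution in a toy model, assuming (unrealistically) that each driver's true expected payout is knowable; all figures are invented:

    # Toy model: pooled pricing vs. per-driver telematics pricing.
    # Expected annual payouts per driver; entirely made-up figures.
    expected_payouts = {"careful": 300.0, "average": 500.0, "risky": 700.0}
    admin = 50.0

    # Pooled: everyone pays the mean expected payout plus admin.
    pooled = sum(expected_payouts.values()) / len(expected_payouts) + admin
    print(f"pooled premium: {pooled:.0f}")   # 550 for everyone

    # Black box: each driver priced on their own observed risk.
    for driver, payout in expected_payouts.items():
        print(f"{driver}: {payout + admin:.0f}")
    # careful drops to 350, risky jumps to 750: the worse-than-average
    # drivers lose the subsidy they were getting from the safer ones.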
A friend of mine thinks that sooner or later there will be an incident where a self-driving car kills a bunch of people and then the whole thing will be outlawed. I tend to agree, but I'll watch and see what happens. For a while at least, a lot of companies were jumping in and trying to rush this tech out into the world, and there is a lot of hubris. All it's going to take is an incident where one of these cars gets confused and plows into a big group of cyclists, a parade route, or an event in a park and mows down a dozen people. It doesn't even matter if they're statistically safer, because people fail at statistical analysis. People worry about EV battery fires even though ICE cars catch fire at far greater rates. They worry about shark attacks even though far more swimmers die by drowning than from sharks. They fear flying because the plane could crash, yet the drive to the airport is far more likely to end in a fatal crash.
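The flying-versus-driving one is easy to make concrete. The per-mile rates below are rough ballpark figures from memory, so treat them purely as an illustration of the size of the gap, not authoritative data:

    # Ballpark base-rate comparison; numbers are rough and illustrative.
    driving_deaths_per_100m_miles = 1.2    # US roads, roughly
    flying_deaths_per_100m_miles = 0.01    # commercial aviation, roughly

    ratio = driving_deaths_per_100m_miles / flying_deaths_per_100m_miles
    print(f"per mile, driving is ~{ratio:.0f}x deadlier than flying")  # ~120x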
I think your friend misunderstands just how cautious SDCs are; if anything, they are too cautious compared to human drivers. Teslas, for instance, sometimes phantom-brake when they perceive pedestrians on the road, whether or not anyone is really there; the NN ends up being overly sensitive to these objects because it's hyper-trained on them. The Waymo car will pretty much not move if there is a pedestrian standing near it. I could actually foresee very effective protests against SDCs by, say, taxi drivers now out of a job: you merely need to get close to one to interfere with its operation and it will shut right down. There have been some experiments with fake stop signs on billboards to interfere with SDCs, which will need some interesting workarounds. I actually think we will get to the point where we have remote human operators to 'unstick' vehicles that get stuck, but there will be far fewer of these operators than cars on the road, so it will still work out as a net win from a labour-savings perspective. And that's really what SDCs are about from a commercial perspective: Waymo et al. don't really care that much about safety, other than not wanting to be embroiled in lawsuits; they want to run taxis without needing to employ people to run them.
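One plausible workaround, and this is pure speculation on my part rather than anything Waymo actually does, is to only honour a stop-sign detection if it's near a location where the map says a sign should exist. A toy sketch with invented names and coordinates:

    # Speculative sketch: only honour a detected stop sign if it's near a
    # location where the map says a sign should exist. Names are invented.
    import math

    KNOWN_SIGNS = [(51.5007, -0.1246), (51.5014, -0.1419)]  # lat/lon priors
    MAX_DIST_M = 15.0

    def metres_between(a, b):
        # Rough equirectangular distance; fine at street scale.
        lat = math.radians((a[0] + b[0]) / 2)
        dx = math.radians(b[1] - a[1]) * math.cos(lat) * 6_371_000
        dy = math.radians(b[0] - a[0]) * 6_371_000
        return math.hypot(dx, dy)

    def honour_stop_sign(detection_pos) -> bool:
        return any(metres_between(detection_pos, s) < MAX_DIST_M
                   for s in KNOWN_SIGNS)

    print(honour_stop_sign((51.50071, -0.12461)))  # True: matches the map
    print(honour_stop_sign((51.5100, -0.1300)))    # False: billboard spoof?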
I could see freak accidents like that Uber SDC which ran down a pedestrian pushing a bike. Uber tried to pin the blame on the safety driver, who was texting while driving, but looking at the footage, the victim was dressed all in black, pushing a black bike at night with no lights, and it's hard to see how any human driver could have spotted them either. Incidents like that won't stop the rollout of SDCs, though.
SDCs won't be outlawed, because the US knows it needs to stay ahead in this field to remain competitive with the rest of the world. But they will create a huge rift in the labour market, and it's alarming that there seems to be no plan to cope with this sudden loss of employment. I think the statistic is that in about 30 of the 50 states, driving a truck is the most common job.