The max is 400 V with the 10X probe, but shouldn't that be something like 3000 V CAT II input protection (300 V for the scope times 10 on the probe)?
Not sure what you mean by "shouldn't that be". The max input voltage is the minimum of the probe spec and the scope spec times the attenuation factor. So for a 300V scope with a 400V x10 probe, the max voltage is 400V, not 3000V (not sure about the CAT rating, but I would expect the lowest CAT rating to apply). If you exceed the rated probe voltage, the attenuation hybrid (the simplest version would be a resistor with a cap in parallel) could short, which would put the full voltage on the scope input and subsequently kill the input circuit.
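If it helps, here's that rule as a minimal Python sketch; the numbers are just the example values from this thread, not specs to rely on:

def max_safe_input(probe_rating_v, scope_rating_v, attenuation):
    # Max voltage at the probe tip is the lower of the probe's own rating
    # and the scope's input rating scaled up by the attenuation factor.
    return min(probe_rating_v, scope_rating_v * attenuation)

# 300 V scope input with a 400 V x10 probe -> 400 V, not 3000 V
print(max_safe_input(probe_rating_v=400, scope_rating_v=300, attenuation=10))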
If you expect higher voltages, get probes with a higher rated input voltage (e.g. x100 probes, like the Tek P5100). Also keep in mind that the voltage rating is derated with frequency: a 300V probe won't withstand 300V at 100MHz.
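To get a feel for what derating means in practice, here's a toy lookup over a made-up curve (the breakpoints below are purely illustrative; the real curve is in your probe's datasheet):

import bisect
import math

# Hypothetical derating curve: (frequency in Hz, max voltage) breakpoints.
# These numbers are invented for illustration only.
CURVE = [(1e3, 300.0), (1e6, 300.0), (1e7, 150.0), (1e8, 30.0)]

def derated_max_voltage(freq_hz):
    freqs = [f for f, _ in CURVE]
    if freq_hz <= freqs[0]:
        return CURVE[0][1]
    if freq_hz >= freqs[-1]:
        return CURVE[-1][1]
    i = bisect.bisect_right(freqs, freq_hz)
    (f0, v0), (f1, v1) = CURVE[i - 1], CURVE[i]
    # Interpolate linearly on log frequency between the two breakpoints.
    t = (math.log10(freq_hz) - math.log10(f0)) / (math.log10(f1) - math.log10(f0))
    return v0 + t * (v1 - v0)

print(derated_max_voltage(50e6))  # far below the DC rating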
How can I go about hooking my scope (Rigol 1052e) up to this without having to worry about torching it?
Make sure the voltage is well below the safe maximum. For DC or low frequency, you could check it with a DMM first, which is usually more robust (although you should observe its rated maximum too). I have no experience with automotive electronics, so I can't tell you what to expect. There are some high voltages in the ignition circuit, but I wouldn't expect those on instrumentation connectors.
Also, how much voltage can these scopes read safely?
Safe voltages are usually printed on the equipment or in the documentation. There should be a frequency derating curve in there, too.