I've been reading through Horowitz and Hill and have reached the section on cable impedances: capacitance, inductance, speed-of-light lag. But it seems fairly clear this is only treated as mattering for cables of a few metres or more, and mostly only for coax. Certainly this sort of thing doesn't seem to get considered when connecting up lower-speed equipment with ribbon cables and other multi-wire connectors. And yet it appears to be the speed of the rising and falling edges in a signal that gives it the ability to generate reflected pulses and other nasty effects on the waveform, particularly spikes that double the voltage or push it below ground. And pretty much all chips these days produce very fast edges on their logic transitions, even if the data rate is slow and the time between a rise and the subsequent fall is relatively long.
My understanding is that in practice there is a threshold in system design above which things like terminations are considered necessary, and below which such considerations are ignored. Where does this threshold lie?
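To make my confusion concrete, here's the back-of-envelope I've pieced together (a rough sketch only; the 1/6 fraction and the 0.66 velocity factor are figures I've seen quoted, and sources seem to differ on both):

```python
# Back-of-envelope for the usual rule of thumb: treat an interconnect as a
# transmission line once its one-way propagation delay exceeds some fraction
# of the signal's rise time. The 1/6 fraction and 0.66 velocity factor are
# common textbook choices, not universal constants.

C = 3.0e8  # speed of light in vacuum, m/s

def critical_length_m(rise_time_s, velocity_factor=0.66, fraction=1/6):
    """Rough longest run before termination starts to matter."""
    v = velocity_factor * C          # signal speed in the cable or trace
    return v * rise_time_s * fraction

for t_r in (10e-9, 2e-9, 0.5e-9):    # old slow CMOS .. fast modern logic
    print(f"rise time {t_r * 1e9:4.1f} ns -> ~{critical_length_m(t_r) * 100:5.1f} cm")
```

By that arithmetic a 0.5 ns edge makes even a couple of centimetres of trace suspect, which is exactly what puzzles me, since real boards built with such chips seem to work fine unterminated.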
Clearly power wiring, DC as well as building-scale AC, isn't usually planned with these impedance considerations. Is that just an assumption that there is always enough capacitance in the load, however distant it may be, to slow the rise time and avoid triggering those huge positive and negative spikes?
And ribbon cables, as well as wire-to-board cable assemblies with multiple separate conductors, don't list typical capacitances or impedances in their datasheets, which leads one to assume that designers who use them have effectively decided "transmission line effects aren't going to matter here".
There are clearly quite a lot of scenarios with several metres of wiring (I2C, SPI, UART, CAN, or USB within a larger robotic arm or around a car), and although the data rate might be slow, the edges are however fast they naturally come from the chips being used. Yet one doesn't see much suggestion that transmission line effects play a major role in the design of this sort of thing, while they clearly do matter for very fast signals between test-bench kit. Are the transmission line effects magnified a lot by coax cabling of the type needed for keeping sensitive signals clear of environmental noise, and is that why they mainly crop up for coax rather than other cable types?
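My best guess at why these slow buses get away with it is a timing argument like the following (a sketch under my own assumptions; the 5-round-trip settling figure in particular is made up purely for illustration):

```python
# Sketch: why an unterminated few-metre bus might still work at low data rates.
# A fast edge does launch reflections, but each bounce is attenuated; if the
# ringing dies out long before the receiver samples the bit, the link works.

C = 3.0e8  # speed of light in vacuum, m/s

length_m = 3.0                 # cable run inside a machine, say
velocity_factor = 0.66         # assumed, typical of common dielectrics
t_oneway = length_m / (velocity_factor * C)

round_trips_to_settle = 5      # assumption: ringing negligible after ~5 bounces
t_settle = round_trips_to_settle * 2 * t_oneway

bit_rate = 100e3               # e.g. 100 kbit/s I2C
t_bit = 1.0 / bit_rate

print(f"one-way delay      : {t_oneway * 1e9:7.1f} ns")
print(f"ringing settles in : {t_settle * 1e9:7.1f} ns (assumed)")
print(f"bit period         : {t_bit * 1e9:7.1f} ns")
# ~150 ns of mess at each edge vs a 10,000 ns bit: the reflections are over
# long before the bit is sampled. Is that the whole story?
```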
Can anyone explain where the dividing line lies for considering these effects, and why they don't always get considered? If they always mattered, one couldn't even route a 10 cm PCB trace without fear of voltage steps going well outside the receiving device's voltage limits, because even at slow data rates the rise and fall of almost any modern logic is very fast indeed. What stops transmission line effects from mattering except for high-speed signals in coax, and extraordinarily fast microwave-type signals elsewhere?