What's missing:
Analog bandwidth (in the information-theoretic sense) sets the channel capacity: bandwidth times log2(1 + SNR), per the Shannon-Hartley theorem.
Digital bandwidth is baud rate times bits per symbol.
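To put numbers on both of those (a back-of-the-envelope sketch; the bandwidth and SNR figures below are made up for illustration, not taken from any standard):

    import math

    def shannon_capacity_bps(bandwidth_hz, snr_linear):
        # Shannon-Hartley: C = B * log2(1 + SNR), the hard ceiling for the channel
        return bandwidth_hz * math.log2(1 + snr_linear)

    def digital_bitrate_bps(baud_rate, bits_per_symbol):
        # Raw digital rate: symbols per second times bits per symbol
        return baud_rate * bits_per_symbol

    snr = 10 ** (30 / 10)  # assume 30 dB SNR -> linear power ratio of 1000
    print(shannon_capacity_bps(100e6, snr) / 1e6)  # ~997 Mb/s ceiling for 100 MHz
    print(digital_bitrate_bps(125e6, 2) / 1e6)     # 250 Mb/s at 125 MBd, 2 b/symbol

The first number is what the physics allows; the second is what the line code actually asks for.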
10 Mb Ethernet, for example, is Manchester encoded, so the bit rate (10 Mb/s) is half the transition rate on the wire (up to 20 MHz). The SNR can be very poor indeed, as long as the two signal levels can still be discriminated.
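Manchester encoding is simple enough to sketch in a few lines (using the IEEE 802.3 convention, where a 1 is a rising mid-bit edge; the other convention just inverts this):

    def manchester_encode(bits):
        # Each bit becomes two half-bit levels with a guaranteed mid-bit
        # transition, so a 10 Mb/s stream toggles the line at up to 20 MHz.
        # IEEE 802.3: 1 -> low-then-high, 0 -> high-then-low.
        out = []
        for b in bits:
            out.extend((0, 1) if b else (1, 0))
        return out

    print(manchester_encode([1, 0, 1, 1]))  # [0, 1, 1, 0, 0, 1, 0, 1]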
100 Mb (100BASE-TX) is multilevel: MLT-3, a three-level line code, running on top of 4B5B block encoding and scrambling. That keeps the symbol rate at 125 MBd while pushing the worst-case fundamental down to 31.25 MHz, and the receiver adds adaptive equalization and baseline-wander (DC offset) compensation.
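The MLT-3 trick is easy to demonstrate in isolation (a sketch that ignores the 4B5B and scrambler stages in front of it):

    def mlt3_encode(bits):
        # MLT-3: a 1 advances through the cycle 0, +1, 0, -1; a 0 holds the
        # level. A run of 1s completes a full cycle every 4 symbols, so the
        # worst-case fundamental is 1/4 the symbol rate (31.25 MHz at 125 MBd).
        cycle = [0, +1, 0, -1]
        idx, out = 0, []
        for b in bits:
            if b:
                idx = (idx + 1) % 4
            out.append(cycle[idx])
        return out

    print(mlt3_encode([1, 1, 1, 1, 0, 1]))  # [1, 0, -1, 0, 0, 1]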
Gigabit Ethernet (1000BASE-T) is heavily encoded (trellis-coded error correction, scrambling and everything else), multilevel (five-level PAM-5, about 2 bits per symbol per pair, driven over all four pairs simultaneously at 125 MBd, in both directions at once), and has adaptive equalization plus echo and crosstalk cancellation besides.
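The five-level idea can be pictured without the trellis machinery. Here's a hypothetical 2-bits-to-PAM-5 mapper; the real standard's 4D-PAM5 code chooses levels jointly across all four pairs and folds the redundancy in there, so treat the mapping below as illustration only:

    # Toy mapping: data uses the four outer levels; the fifth level (0 here)
    # is reserved for the trellis code's redundancy in the real standard.
    # Not the actual 1000BASE-T constellation mapping.
    PAM5 = {(0, 0): -2, (0, 1): -1, (1, 0): +1, (1, 1): +2}

    def pam5_encode(bits):
        return [PAM5[pair] for pair in zip(bits[::2], bits[1::2])]

    print(pam5_encode([0, 0, 1, 1, 0, 1]))  # [-2, +2, -1]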
All three of these could potentially have the same analog bandwidth -- they can all run on Cat6 cable and 8P8C connectors. But in the presence of a noisy transmission medium of limited bandwidth (typical of a very long cable run, where the noise arises from thermal noise and unbalanced ambient interference, and the bandwidth is limited by dispersion in the cable), the actual limits become apparent.
When talking about bandwidth outside of communications, analog bandwidth is always the range of frequencies over which a circuit passes a signal, "pass" being defined by some threshold: usually -3dB, occasionally -6 or -20dB (or more) where strong rejection is required, or -1dB or even tighter where precision (flatness) is required. The SNR may be large or small; it doesn't matter, because it's not part of that definition.
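To make the -3dB convention concrete, here's a single-pole RC low-pass, whose cutoff is fc = 1/(2*pi*R*C):

    import math

    def rc_lowpass_gain_db(f_hz, r_ohm, c_farad):
        # Single-pole RC low-pass: |H(f)| = 1/sqrt(1 + (f/fc)^2), fc = 1/(2*pi*R*C)
        fc = 1 / (2 * math.pi * r_ohm * c_farad)
        return 20 * math.log10(1 / math.sqrt(1 + (f_hz / fc) ** 2))

    # 1 kohm and 159 nF put fc near 1 kHz; the gain there is -3.01 dB by
    # definition, regardless of how much noise rides on the signal.
    print(rc_lowpass_gain_db(1e3, 1e3, 159e-9))  # ~ -3.0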
Tim