There's an art to setting bias, and the physics behind it are complicated. I've worked on a couple big 2" tape 24-track machines in recording studios.
Ampex's 1967 paper "Biasing in Magnetic Tape Recording" is a good discussion.
Optimum bias amplitude depends on tape formulation, tape speed, and the record head gap.
It affects distortion, noise, and frequency response. It's difficult because you can trade off noise against distortion or high-frequency response - all three parameters interact.
Recording engineers can optimize bias per track - for high-frequency response on the drum kit, or low noise for vocals, etc. So you will find all kinds of personal rules and preferences for setting bias. Pros will use an audio analyzer and check THD versus frequency to determine optimum bias, checking noise as well.
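As a rough sketch of the THD-versus-frequency check an analyzer automates (Python; `capture` and `fs` are placeholders for a digitized capture of the repro output and its sample rate, not any particular analyzer's API):

```python
# Hedged sketch: estimate THD of a played-back test tone from a digitized
# capture of the repro output. `capture` and `fs` are placeholders for
# your own recording chain.
import numpy as np

def thd_percent(capture, fs, f0, n_harmonics=5):
    """Ratio of the RMS sum of harmonics to the fundamental, in percent."""
    windowed = capture * np.hanning(len(capture))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(capture), 1.0 / fs)

    def peak(f):
        # hunt a small band around the nominal frequency for the true peak
        band = (freqs > 0.95 * f) & (freqs < 1.05 * f)
        return spectrum[band].max()

    fundamental = peak(f0)
    harmonics = [peak(k * f0) for k in range(2, n_harmonics + 1)]
    return 100.0 * np.sqrt(sum(h * h for h in harmonics)) / fundamental

# Sweep the test frequency and tabulate THD at each bias setting, e.g.:
# for f0 in (100, 1000, 5000, 10000):
#     print(f0, "Hz:", thd_percent(capture, fs, f0), "%")
```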
With a DC bias, such as the permanent magnet Sony uses on their cheap cassette products, you end up operating on one half of the B-H curve, and the result is high distortion with low sensitivity. It's better than no bias at all, but you can still see a distorted sine wave.
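As a toy illustration only - a tanh() saturation curve with a made-up DC offset, not a real tape model - you can see where that lopsided sine wave comes from:

```python
# Toy model, not real tape physics: approximate the transfer curve as
# tanh(). A DC offset parks the signal on one half of the curve, so the
# two half-cycles see different gain and even harmonics appear.
import numpy as np

fs = 48000
t = np.arange(fs) / fs
tone = 0.3 * np.sin(2 * np.pi * 1000 * t)

dc_bias = 0.8  # made-up permanent-magnet operating point
out = np.tanh(tone + dc_bias) - np.tanh(dc_bias)  # subtract the static offset

spectrum = np.abs(np.fft.rfft(out * np.hanning(len(out))))
freqs = np.fft.rfftfreq(len(out), 1.0 / fs)
fund = spectrum[np.argmin(np.abs(freqs - 1000))]
for k in (2, 3):  # a strong 2nd harmonic = the visibly distorted sine wave
    h = spectrum[np.argmin(np.abs(freqs - k * 1000))]
    print(f"H{k}: {20 * np.log10(h / fund):.1f} dB relative to fundamental")
```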
For AC bias, basically you set bias amplitude to give maximum output (highest tape sensitivity) at 1kHz.
This is not the best for low noise though, and a popular method is to apply slightly too much bias (overbias) to the tape.
With the overbias method, you set bias for maximum output level with a 1kHz tone, then, using a 10kHz tone, dial the bias up until the output drops a few dB - how much depends on the head gap and tape formulation. For Ampex 456 it was 3-4dB. Then you have to check and adjust frequency response, another 48 adjustments.
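Written out as a loop, the procedure looks like this. `set_bias()` and `measure_level()` are hypothetical stand-ins for the bias trimmer and an analyzer reading the repro level in dB - not a real API:

```python
# Sketch of the overbias procedure. bias_steps is an ascending list of
# bias amplitudes to try; set_bias() and measure_level(freq_hz) are
# hypothetical hooks into the machine and the analyzer.

def find_overbias(set_bias, measure_level, bias_steps, overbias_db=3.0):
    # Step 1: sweep bias and find the setting that peaks 1kHz sensitivity.
    best_bias, best_level = None, float("-inf")
    for b in bias_steps:
        set_bias(b)
        level = measure_level(1000)
        if level > best_level:
            best_bias, best_level = b, level

    # Step 2: note the 10kHz level at that peak, then keep dialing bias
    # up until the 10kHz output has dropped by overbias_db (3-4 dB for 456).
    set_bias(best_bias)
    ref_10k = measure_level(10000)
    for b in (b for b in bias_steps if b > best_bias):
        set_bias(b)
        if measure_level(10000) <= ref_10k - overbias_db:
            return b
    raise RuntimeError("bias range exhausted before hitting the overbias point")
```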

Too much bias will give you tape noise and a poor S/N ratio.
Mag tape noise is hiss plus low-frequency "bias rocks," as they are called in the recording industry - a bumping, crackling noise that sounds like rocks being knocked together.
The "bias rocks" method is to record a low-frequency sine wave at 20-50Hz, then on playback run the output through an HPF so you hear only the bias rocks, and adjust bias for minimum noise.