Very widely used to signify the amount of red, green, or blue light in a pixel; it's the format of 99.99999% of images and videos on the web and elsewhere. Yes, for really high image quality, high-dynamic-range stuff, 8 bits is on the low side and 10-bit display formats are used instead, but 256 levels is enough for most cases and there appears to be no widespread shift towards higher bit depths in the near future.
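To make that concrete, here's a rough C sketch of pulling the usual 8-bit channels out of one pixel (the 0xAARRGGBB packing order here is just an assumption for illustration; real formats vary):

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* One pixel packed as 0xAARRGGBB -- each channel gets 8 bits, 0..255. */
        uint32_t pixel = 0xFF3366CC;

        uint8_t a = (pixel >> 24) & 0xFF;  /* alpha */
        uint8_t r = (pixel >> 16) & 0xFF;  /* red   */
        uint8_t g = (pixel >>  8) & 0xFF;  /* green */
        uint8_t b =  pixel        & 0xFF;  /* blue  */

        printf("A=%u R=%u G=%u B=%u\n", a, r, g, b);
        return 0;
    }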
Anything large and high-bandwidth requires optimizing storage size, and arbitrary bit counts get used: it's not odd at all to see some value stored in, for example, 3 bits if the expected range is 0..7. A byte (8 bits) is more convenient and efficient, since fewer instructions are needed to extract and use the value than with, say, 3-bit fields. But most of the performance penalty comes from memory access, so if you can squeeze more into memory, you save time even if your CPU has to do some bit shifting and masking, as in the sketch below.
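Here's a rough C sketch of what that shifting and masking looks like (the layout of ten 3-bit values packed into one 32-bit word is made up for illustration):

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* Ten 3-bit values (range 0..7) packed into one 32-bit word,
           value i stored at bit offset 3*i. */
        uint8_t values[10] = {7, 0, 3, 5, 1, 6, 2, 4, 7, 0};
        uint32_t packed = 0;

        for (int i = 0; i < 10; i++)
            packed |= (uint32_t)(values[i] & 0x7) << (3 * i);

        /* Extracting a field costs a shift and a mask -- a few extra
           instructions compared to loading a whole byte, but ten values
           now fit in 4 bytes instead of 10. */
        for (int i = 0; i < 10; i++) {
            uint8_t v = (packed >> (3 * i)) & 0x7;
            printf("%u ", v);
        }
        printf("\n");
        return 0;
    }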
And yes, half a byte (4 bits) is a nibble. That isn't some esoteric joke but a normal, widely used term; you can find it in instruction set manuals, for example, whenever they have instructions like "swap nibbles".
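A "swap nibbles" instruction does in one step what this small C sketch does with shifts (the function name is mine, just for illustration):

    #include <stdint.h>
    #include <stdio.h>

    /* Swap the high and low nibbles of a byte, e.g. 0xA5 -> 0x5A. */
    static uint8_t swap_nibbles(uint8_t x) {
        return (uint8_t)((x << 4) | (x >> 4));
    }

    int main(void) {
        printf("0x%02X -> 0x%02X\n", 0xA5, swap_nibbles(0xA5));
        return 0;
    }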