As an example, I have a text routine that's basically Huffman coding, so it's efficient for sparse data (graphics with a lot of background color and few other colors -- fonts, for instance). It takes up 215 words on the AVR, including a lot of calls to I/O routines that obviously wouldn't be necessary for a memory init.
Also, a somewhat more powerful decoder (an area-based RLE) comes to 178 words, so I don't think a compact decoder is too far out.
The same build these functions are in uses only 0x58 bytes of .data total, so it would indeed be a tough sell.
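For a sense of how small such a decoder can be, here's a minimal sketch in C -- a plain byte-oriented RLE (count, value pairs), not the area-based variant mentioned above; the function name and format are just illustrative:

```c
#include <stddef.h>
#include <stdint.h>

/* Minimal byte-oriented RLE decoder: `src` is a sequence of
 * (count, value) pairs; expands them into `dst`, never writing
 * past `dst_cap`. Returns the number of bytes written. */
size_t rle_decode(const uint8_t *src, size_t src_len,
                  uint8_t *dst, size_t dst_cap)
{
    size_t out = 0;
    for (size_t i = 0; i + 1 < src_len; i += 2) {
        uint8_t count = src[i];
        uint8_t value = src[i + 1];
        for (uint8_t n = 0; n < count && out < dst_cap; n++)
            dst[out++] = value;
    }
    return out;
}
```

Compiled for AVR, something like this comes out to only a few dozen words, which is the point: the decoder can be far smaller than the data it saves.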

On a related note, .text could be compressed too. With no RAM execution there's no point trying on the AVR, of course, but it's relevant to everything else; in fact I ran across this just the other day: a lot of N64 games used Yaz0 (an LZ77/LZSS variant) to expand data and code overlays into RAM. And compressed (and encoded, encrypted, or obfuscated) code is not unfamiliar on PCs either, though probably not as common as it once was (malware aside).
On another note, since the build I mentioned above uses a fair amount of PROGMEM -- I wonder how it would compare with compression applied there, instead of storing it raw. Pieces could be expanded into RAM as needed (into local function stack frames, or the heap?). That could be written into the code by hand (probably with great pain...). Maybe some tools and macros could help automate it; it's unclear whether it could ever be as syntactically convenient as PROGMEM already is, or more so. Which is already on the annoying side, so...
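The expand-into-a-stack-frame idea might look something like this sketch. The table name, (count, value) format, and zero terminator are all made up for illustration; the `READ_BYTE` macro lets the same code build on a host for testing, falling back to plain loads where AVR's `pgm_read_byte` isn't available:

```c
#include <stddef.h>
#include <stdint.h>

#ifdef __AVR__
#include <avr/pgmspace.h>
#define READ_BYTE(p) pgm_read_byte(p)
#else
/* Host build: PROGMEM is a no-op and flash reads are plain loads. */
#define PROGMEM
#define READ_BYTE(p) (*(p))
#endif

/* Hypothetical compressed blob in flash: (count, value) RLE pairs,
 * terminated by a zero count. */
static const uint8_t packed_table[] PROGMEM = {
    4, 0x20,   /* four spaces */
    2, 0x2A,   /* two asterisks */
    0          /* terminator */
};

/* Expand a flash-resident RLE blob into a caller-provided RAM
 * buffer (e.g. a local array in the caller's stack frame).
 * Returns the number of bytes written, capped at `cap`. */
size_t unpack_to_ram(const uint8_t *flash_src, uint8_t *dst, size_t cap)
{
    size_t out = 0;
    for (;;) {
        uint8_t count = READ_BYTE(flash_src++);
        if (count == 0)
            break;
        uint8_t value = READ_BYTE(flash_src++);
        while (count-- && out < cap)
            dst[out++] = value;
    }
    return out;
}
```

A caller would then do `uint8_t buf[16]; size_t n = unpack_to_ram(packed_table, buf, sizeof buf);` and use `buf` directly -- the RAM cost is only the buffer's lifetime on the stack, rather than a permanent .data allocation.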
But I digress. Neat that some have tried it, or offer it; not surprised it's uncommon, of course.
Tim