The name "Delta Lambda" makes me think that this could be some slant on incremental deduped backup algorithms (or in simple terms, rsync).
Suppose all users of the system shared a consistent, large corpus of data. If you could encode some lambda function that operates over that corpus (or a portion of it) to rip out bitstreams, and optionally apply a delta to them to produce the desired target bitstream, then your "compressed" data would consist of a series of lambda functions over a known data set, plus any residual data that couldn't be described that way, compressed with some standard algorithm.
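A toy sketch of what I mean (all names here are hypothetical, and the shared corpus stands in for the "consistent large amount of data"; a real version would also apply deltas to near-matches rather than only exact ones):

```python
# Both sides hold the same shared corpus. "Compressed" output is a list of
# ("ref", offset, length) references into that corpus, plus ("lit", bytes)
# literals for residue that couldn't be found in it.

CORPUS = b"the quick brown fox jumps over the lazy dog " * 4

def compress(data: bytes, chunk: int = 8) -> list:
    """Encode data as corpus references where possible, literals otherwise."""
    ops = []
    for i in range(0, len(data), chunk):
        piece = data[i:i + chunk]
        off = CORPUS.find(piece)
        if off >= 0:
            # The "lambda" here is trivially: read corpus[off : off + len]
            ops.append(("ref", off, len(piece)))
        else:
            # Residue: store verbatim (in practice, deflate this)
            ops.append(("lit", piece))
    return ops

def decompress(ops: list) -> bytes:
    out = []
    for op in ops:
        if op[0] == "ref":
            _, off, n = op
            out.append(CORPUS[off:off + n])
        else:
            out.append(op[1])
    return b"".join(out)

# Round trip: one chunk ("cat") isn't in the corpus and falls back to a literal.
msg = b"the quick brown cat jumps over the lazy dog "
assert decompress(compress(msg)) == msg
```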
I feel like there's something here that would make Kurt Gödel turn in his grave, though.