A Copilot snippet is a small portion of code, a handful of lines. A streamed work or a full software program, by comparison, runs to thousands of lines.
Except that there are examples where entire functions, easily recognizable even after abstraction and filtration (per the abstraction-filtration-comparison test used in US software copyright practice), are reproduced.
Copilot serves those couple-of-second snippets, drawn from all over a copyrighted work, to different people. If each person receives only a small snippet, but the entire work ends up being shared, has the copyright been violated?
The same argument applies to torrents, where an uploader never provides a full copy to any one downloader, only small pieces. The only real difference is that each uploader acts as its own Copilot, and downloaders have a way to verify that they have assembled a full copy from many different sources. For Copilot-provided code, the snippet successfully performing its function is a roughly equivalent check. The trick is that in many jurisdictions downloading is entirely legal, and in many others it carries no penalty; so if each individual uploader is doing exactly what Copilot does, why would file sharing be illegal if Copilot is legal?
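To make the torrent analogy concrete, here is a minimal, hypothetical sketch (all names and data invented for illustration) of the mechanism described above: a work is split into pieces, no single uploader serves more than a fraction of it, yet a downloader can reassemble and verify a complete copy from many sources, much as BitTorrent does with per-piece hashes.

```python
import hashlib

# Hypothetical stand-in for a copyrighted work, as one byte string.
WORK = b"All the source code of a copyrighted work, as one byte string."
PIECE_SIZE = 8

# Split the work into fixed-size pieces, as BitTorrent does.
pieces = [WORK[i:i + PIECE_SIZE] for i in range(0, len(WORK), PIECE_SIZE)]

# Each uploader holds, and will serve, only a couple of pieces;
# none of them ever transfers the whole work on its own.
uploaders = [{0, 1}, {2, 3}, {4, 5}, {6, 7}, {1, 4}]

# The downloader collects pieces from many uploaders.
collected = {}
for held in uploaders:
    for idx in held:
        collected[idx] = pieces[idx]

# Completeness check: real BitTorrent compares per-piece hashes from the
# .torrent metadata; here we compare the reassembled bytes to the original.
assembled = b"".join(collected[i] for i in sorted(collected))
complete = len(collected) == len(pieces)
print("complete copy:", complete)
print("hash matches:",
      hashlib.sha256(assembled).digest() == hashlib.sha256(WORK).digest())
```

The point of the sketch is that each uploader's individual contribution is as partial as a Copilot snippet, yet the downloader still ends up with a verifiably complete copy.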
At present, it is not illegal to read copyrighted source code, independently describe how it works to another engineer, and then have that other engineer re-implement it (this is known as the "Chinese wall technique").
Except that ChatGPT and other language models do not "describe how it works"; they summarize the language itself. There is no real understanding, just a mechanical translation of that language, which does not cross the "novel and copyrightable" boundary.
IOW, to make it legal, somebody needs to prove that what ChatGPT does is actually analogous to the "Chinese wall technique" or "clean room implementation", and not a purely mechanical translation of the language. The same goes for Copilot.