Yep, this is the reason I find USB-C a complete standardization trainwreck.
It is now possible to run almost anything imaginable over USB-C, but there is no way to know whether it will actually work until you plug it in and try, since both devices must support that one specific feature out of many. It brings back the ol' phrase "Plug and Pray", despite USB 1.1 being the port that truly pulled us into the "Plug and Play" paradigm, where things simply worked once plugged in.
The only thing a USB-C port guarantees is USB 2.0 support. All the other features, like USB 3.0, USB 3.1 Gen 2, HDMI, DisplayPort, Thunderbolt, etc., are entirely optional, so a host PC may support any combination of them it likes, and it's impossible to know which, since there is no requirement to place standardised icons for the supported features next to the port.
To make things even more confusing, host machines have the same USB-C port as USB devices. So the new symmetrical cables let you connect two host PCs (where nothing will happen, because each expects to see a USB device) or two USB slave devices (where again nothing will happen, because both are waiting for a host to talk to them).
Your case of a USB-C camera and a USB-C monitor combines all of these ways USB-C can go wrong. As a consumer you would expect it to work, because they have the same port and you have a cable that fits both. However, the camera is likely a USB 3.0 slave device, while the monitor is likely an HDMI/DisplayPort slave device. So you have two slaves that will not talk to each other, and even if they did, they do not speak the same protocol and thus could not understand each other.
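To make the failure matrix concrete, here is a toy model of the connection outcomes described above. This is purely illustrative (the function, role names, and mode sets are made up; real USB-C role and alt-mode negotiation over the CC pins is far more involved), but it captures the two ways a plug-in can silently fail:

```python
# Toy model of USB-C connection outcomes: a link only works when exactly
# one side is a host, the other a device, and they share a protocol.
# (Illustrative only; not real USB-C / USB-PD negotiation.)

def try_connect(a_role, a_modes, b_role, b_modes):
    """Return a human-readable outcome for plugging one port into another."""
    if a_role == b_role == "host":
        return "nothing happens: both ends expect to see a device"
    if a_role == b_role == "device":
        return "nothing happens: both ends wait for a host"
    common = a_modes & b_modes
    if not common:
        return "link dead: no shared protocol"
    return f"works via {sorted(common)[0]}"

# The camera/monitor case from above: two slaves with disjoint protocols.
camera = ("device", {"USB 3.0"})
monitor = ("device", {"DisplayPort Alt Mode", "HDMI Alt Mode"})
print(try_connect(*camera, *monitor))
# -> nothing happens: both ends wait for a host

# A hypothetical laptop host that supports DisplayPort Alt Mode:
laptop = ("host", {"USB 2.0", "USB 3.0", "DisplayPort Alt Mode"})
print(try_connect(*laptop, *monitor))
# -> works via DisplayPort Alt Mode
```

Note that even fixing the role mismatch (say, by putting a laptop in the middle) only helps if the host also happens to support the mode the peripheral needs, which is exactly the optional-feature lottery described above.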