You would think so. Even big Hollywood movies are made in Full HD and upscaled later. Like the new Avengers movie: billions of dollars spent on it, CGI rendered in Full HD. They did a breakdown of it; 96% of the movie is CGI.

Do you have a source for that? I'm pretty sure that's not correct. It looks like 2K and 2.8K were common standards and 4K is quickly becoming the new standard. The last Avengers movies seem to have been filmed on Arri cameras, which I think are 2.8K, but possibly the newer models are higher-resolution cameras. Note that I just Googled all of this, so I may be way off the mark.
The advantage of a well-made 4K YouTube upload is that, if you watch it downsampled to 1080p, it looks almost as good as an authentic 25-50 Mbit Blu-ray. 1080p YouTube uploads are smeary, and the difference is clearly visible on my 90-inch 1080p video projector.
Now, at that size, YouTube 4K video played at 4K is actually smeary compared to an authentic 4K UHD Blu-ray (which was filmed in at least 4K, which most Hollywood productions aren't), so if you want true 4K quality, once again you might need to watch an 8K YouTube video downsampled to 4K. However, with any display below 60 inches, I rarely see a use for true full 50 Mbit 4K video.
They used a custom 6K IMAX camera:
https://www.popsci.com/camera-avengers-imax/
https://en.wikipedia.org/wiki/Production_of_Avengers:_Infinity_War_and_Avengers:_Endgame
If you watch it in IMAX, you get more in the frame thanks to the 1.90:1 aspect ratio.
At 90" that is just shy of 1 px/mm at 1080p. Obviously you would be standing a good distance away, like 3 m. But essentially what you are saying is that at any resolution the bitrate for that resolution is too low, so you have to use a resolution that gets you the data rate you need, because fuzzy 4K is as good as sharp 1080p, which is what I have been trying to explain. As 4K is already more resolution than the eye can see, particularly in a moving image, even if it's not pin sharp it does not matter. So why use 8K or 32 MP frames? I mean, that is insane. It won't help on YouTube if the limit is 4K, and as 4K is more than the eye can see, then do it in 4K. Again, we are only talking about talking-head shooting here, not filming the last example of a natural landscape before it is destroyed forever.
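For what it's worth, the "just shy of 1 px/mm" figure checks out. A quick sketch, assuming a 16:9 panel with a 90-inch diagonal at 1920x1080:

```python
import math

# Horizontal pixel density of a 90-inch diagonal 16:9 display at 1920x1080.
DIAG_IN = 90
MM_PER_IN = 25.4

diag_mm = DIAG_IN * MM_PER_IN                            # 2286 mm diagonal
width_mm = diag_mm * 16 / math.hypot(16, 9)              # width of a 16:9 screen
px_per_mm = 1920 / width_mm

print(f"width: {width_mm:.0f} mm, density: {px_per_mm:.2f} px/mm")
# → width: 1993 mm, density: 0.96 px/mm
```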
I shoot my stuff in a mix of 4K (60Mbps) and 1080p (28Mbps).
4K for teardown and other detailed stuff like my Apollo 50th outside talking head shots.
But things like the Apollo 50th panel discussions were 1080p, both for the lower file sizes and because 4K is useless for low-light, indoor, talking-head-from-a-distance shots.
Recent scope bench tutorial videos like 1228 and 1223 for example are only 1080p worthy.
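To put those two bitrates in perspective, here is a rough sketch of the resulting video file size per hour of footage (audio and container overhead ignored):

```python
# Approximate video file size per hour at the bitrates mentioned above.
def gb_per_hour(mbps: float) -> float:
    """Gigabytes (decimal) of video per hour at a given Mbit/s rate."""
    return mbps * 3600 / 8 / 1000  # Mbit/s * s/h -> MB/h -> GB/h

for label, mbps in [("4K @ 60 Mbps", 60), ("1080p @ 28 Mbps", 28)]:
    print(f"{label}: ~{gb_per_hour(mbps):.1f} GB/hour")
# → 4K @ 60 Mbps: ~27.0 GB/hour
# → 1080p @ 28 Mbps: ~12.6 GB/hour
```

So dropping to 1080p for the long panel-discussion recordings cuts storage to well under half.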
And that makes perfect sense. If you suddenly announced you were filming in 8K and putting everything out in 4K, it would not make me want to watch stuff more.
oh but the results show in the viewer numbers
well do gaming videos and you will get one
Wouldn't you have liked to see the moon landing stuff in a decent res with HDR?
Are you seriously comparing videos of people babbling about hardware to a moon landing?
I am reminded of looking at SD video and wondering WTF this 4K rubbish is about.
Not everything is worth the storage costs. I'm pretty sure you don't scan advertisements and spam, for example.
I could actually go back to the raw footage and redo higher quality versions of old videos if I wanted to.
@Simon, it's not the pixel resolution that's important here; it's that YouTube allows 4x the bandwidth for a 4K upload, which leads to less compression smearing. However, Blu-ray allows for up to a 50 Mbit/s burst during rapid scene changes, with a sustained 35 Mbit/s. This is why even YouTube 4K downsampled to 1080p is still said to be worse than a 1080p Blu-ray with that 50 Mbit/s bandwidth, which would even reveal the background noise grain of the camera's CCD in that scene, not to mention some of the compression smearing which is still present in the demo image I uploaded.
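The bandwidth argument can be made concrete as bits available per pixel per frame. The YouTube bitrates below are rough assumptions (actual rates vary by codec and upload); only the 35 Mbit/s sustained Blu-ray figure comes from the post above:

```python
# Bits available per pixel per frame for various resolution/bitrate combos.
# YouTube bitrates are rough assumptions; Blu-ray sustained rate is 35 Mbit/s.
def bits_per_pixel(mbps: float, width: int, height: int, fps: float = 24) -> float:
    """Average encoded bits per pixel per frame at a given bitrate."""
    return mbps * 1e6 / (width * height * fps)

combos = [
    ("YouTube 1080p (~8 Mbps, assumed)",   8, 1920, 1080),
    ("YouTube 4K (~40 Mbps, assumed)",    40, 3840, 2160),
    ("Blu-ray 1080p (35 Mbps sustained)", 35, 1920, 1080),
]
for label, mbps, w, h in combos:
    print(f"{label}: {bits_per_pixel(mbps, w, h):.2f} bits/pixel")
```

Under these assumptions, the 4K upload gets more bits per pixel than the 1080p upload (hence the cleaner downsampled image), but still far fewer than the Blu-ray, which matches the claim that even downsampled YouTube 4K falls short of a 1080p Blu-ray.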
Right now, because I have a 1920x1080 display, I prefer 1080p content. That may change in the future; it definitely will if I get a larger-than-22" display. However, as I mentioned before, color reproduction etc. are more important to me, so I tend to look at panel quality instead of resolution first.

Sometimes, when I watch videos, I drop the resolution because I want to reduce the bandwidth used, usually because I'm doing something else (say, downloading large software packages or whatever) and not really watching the video. It annoys the heck out of me that I cannot keep the audio quality high while dropping the video quality. I especially like technical stuff where I can switch to looking at a very high resolution photo of the thing at hand (outside YouTube, of course) while still listening. (I do not like looking at talking heads at all, btw; they annoy me.)
Yup, exactly. You do it if you think it is worth it, not because it is possible.