I’m not sure we can manage the same for video, though; hosting it costs a lot more.
Maybe we don’t need 4K 60FPS video to show Mr. Beast giving away more crap. Just because we can up the quality, doesn’t mean we should. Or maybe client-side real-time AI upscaling will make this a non-issue.
Call me old-fashioned, but I’d rather see high native quality available for when it’s relevant. If I’m watching gameplay footage (as one example), I would look at the render quality.
With more and more video games already using frame generation and upscaling within the engine, at what point does the data loss become too much? Depending on upscaling again during playback means that your video experience might depend on which vendor you have - for example, an Nvidia desktop may upscale differently from an Intel laptop with no dGPU, which may differ again from an Android phone running on 15% battery.
That would become even more prominent if you’re evaluating how different upscaling technologies look in a given video game, perhaps with an intent to buy different hardware. I use a similar research method to check how different hardware encoders keep up with each other. That’s a problem that native high-resolution video doesn’t have.
I recognize this is one example, and that there is content where quality isn’t paramount and frame generation and upscaling are relevant - but I’m not ready to throw out an entire sector of media for this kind of gain on some of it. Not to mention that not everyone will have access to the kind of hardware required to cleanly upscale, and adding upscaling hardware to everything (for everyone who isn’t using their PS5/Xbox/PC as a set-top media player) will just drive up the cost of already very expensive consumer electronics and add yet another point of failure to a TV that didn’t need to be smart to begin with.
How much quality matters depends on the content. If the video is just someone talking, 4K is overkill. And not every gameplay video has to be kept forever - only the good ones. Even the ones that are kept can be downscaled after some time if nobody watches them.
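A retention policy like that could be sketched as a simple rule. To be clear, the function name and thresholds below are my own made-up illustration, not anything a real platform documents:

```python
# Hypothetical cold-storage policy: downscale old, rarely watched videos.
# Both thresholds are illustrative assumptions, not real platform values.

MIN_AGE_DAYS = 365        # only consider videos older than a year
MAX_RECENT_VIEWS = 100    # "nobody sees them": under 100 views in 90 days

def should_downscale(age_days: int, views_last_90_days: int) -> bool:
    """Return True if a stored video is a candidate for re-encoding
    at a lower resolution to save storage."""
    return age_days >= MIN_AGE_DAYS and views_last_90_days <= MAX_RECENT_VIEWS

# A three-year-old video with 12 recent views gets downscaled, while
# recent or still-popular uploads keep their native quality.
print(should_downscale(3 * 365, 12))    # True
print(should_downscale(30, 5))          # False (too recent)
print(should_downscale(2 * 365, 5000))  # False (still watched)
```

The actual re-encode would then be a one-way transcode job, which is why a platform would want the thresholds conservative: the original quality can’t be recovered afterwards.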
I mean, didn’t Vine fail even with mostly low-quality videos? I’m assuming even 720p could be a challenge for a decentralized site.

EDIT: Apparently I was misremembering.
It didn’t fail; Twitter shut it down.
I distinctly remember reading that somewhere reputable, but it seems you’re right. Thanks for the fact check.
Is there some reason you can’t start up a decentralized content hosting platform? Just let anyone with a spare hard drive and a spare PC at home join up?
Like I guess I don’t really want anything illegal on my PC… Maybe this plan is awful.
This exists. For example, for general decentralized storage there’s storj.io, and for video specifically there’s PeerTube. But I guess there’s a reason it’s not more widespread. I’d happily be proven wrong, though.
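Mechanically, the “spare hard drive at home” model usually rests on content addressing: each chunk of a file is identified by the hash of its bytes, so a downloader can verify whatever a stranger’s PC sends back without trusting it. A minimal sketch in Python - the chunk size and function names here are my own illustration, not any specific protocol:

```python
import hashlib

CHUNK_SIZE = 256 * 1024  # 256 KiB per chunk - an arbitrary illustrative choice

def chunk_and_address(data: bytes) -> list[tuple[str, bytes]]:
    """Split a blob into fixed-size chunks and label each with the
    SHA-256 of its contents. Peers store chunks keyed by hash, and a
    downloader can verify every chunk it receives independently."""
    chunks = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        chunks.append((hashlib.sha256(chunk).hexdigest(), chunk))
    return chunks

def verify_chunk(expected_hash: str, chunk: bytes) -> bool:
    """A downloader re-hashes what an untrusted peer sent back."""
    return hashlib.sha256(chunk).hexdigest() == expected_hash

video = b"x" * (CHUNK_SIZE * 2 + 100)  # stand-in for a video file
table = chunk_and_address(video)
print(len(table))                                 # 3
print(all(verify_chunk(h, c) for h, c in table))  # True
```

Torrent-style distribution works on roughly this principle (and, as I understand it, PeerTube leans on peer-to-peer delivery of this kind for playback); the hard parts that hashing alone doesn’t solve are peer discovery, availability, and incentives - which may be exactly why it isn’t more widespread.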