AI, in its current primitive form, is already benefiting a wide array of industries, from healthcare to energy to climate prediction, to name just a few. But...
If the visuals are performant and consistent, why do we care? I have always been baffled by the obsession with “real pixels” in some benchmarks and user commentary.
AI upscales are so immediately obvious and look like shit. Frame “generation” too. Not sour grapes, my card supports FSR and fluid motion frames, I just hate them and they are turned off.
Yeah this is “we have DLSS at home”. As someone who tested both, DLSS is the actually good one, FSR is a joke of an imitation that’s just slightly fancier TAA. Try DLSS Quality at 1440p or DLSS Balanced at 4K and you’ll see it’s game-changing.
That’s fine, but definitely not a widespread stance. Like somebody pointed out above, most players are willing to lose some visual clarity for the sake of performance.
Look, I don’t like the look of post-process AA at all. FXAA just seemed like a blur filter to me. But there was a whole generation of games out there where it was that or somehow finding enough performance to supersample a game and then endure the spotty compatibility of having to mess with custom unsupported resolutions and whatnot. It could definitely be done, particularly in older games, but for a mass market use case people would turn on SMAA or FXAA and be happy they didn’t have to deal with endless jaggies on their mid-tier hardware.
This is the same thing, it’s a remarkably small visual hit for a lot more performance, and particularly on higher resolution displays a lot of people are going to find it makes a lot of sense. Getting hung up on analyzing just “raw” performance instead of weighing the final results independently of the method used to get there makes no sense. Well, it makes no sense industry-wide; if you happen to prefer other ways to claw back that performance, you’re more than welcome to deal with bilinear upscaling, lower in-game settings or whatever you think your sweet spot is, at least on PC.
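To put rough numbers on the supersampling tradeoff mentioned above, here’s a back-of-envelope sketch; the resolutions are just illustrative assumptions, and shading cost only roughly scales with pixel count:

```python
# Back-of-envelope pixel math for supersampling vs. native rendering.
# Illustrative numbers only, not benchmarks.

def pixels(width: int, height: int) -> int:
    """Pixels shaded per frame at a given render resolution."""
    return width * height

native = pixels(1920, 1080)    # render at the display resolution
ssaa = pixels(3840, 2160)      # 2x2 supersample, downscaled to 1080p

# Shading work scales roughly with pixel count, so 2x2 SSAA means
# shading ~4x the pixels for the same 1080p output frame.
print(f"native 1080p: {native:,} px/frame")
print(f"2x2 SSAA:     {ssaa:,} px/frame ({ssaa / native:.0f}x the work)")
```

That 4x is the whole story: mid-tier hardware never had that kind of headroom, so a nearly free post-process pass won by default.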
That’s because FXAA also sucks. MSAA and downsampling are so far superior. Also AI-generated “frames” aren’t performant, it’s fake performance, because as previously mentioned they look like shit, particularly in the way that they make me think about how well I’m running the game instead of playing the game in front of me.
MSAA is pretty solid, but it has its own quirks and it’s super heavy for how well it works. There’s a reason we moved on from it and towards TAA eventually. And DLSS is, honestly, just very good TAA, Nvidia marketing aside.
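For anyone curious what “TAA” actually means mechanically, the core is just blending each new frame into a running history buffer. Here’s a toy 1D sketch of that accumulation; it skips the motion-vector reprojection and history clamping real TAA needs, and the blend weight is a typical-ish assumption:

```python
import numpy as np

# Toy temporal accumulation: the heart of TAA. Real implementations
# also reproject the history with motion vectors and clamp it against
# the current frame's neighborhood; this sketch skips all of that.

rng = np.random.default_rng(0)
clean = np.linspace(0.0, 1.0, 64)           # the "true" image row

alpha = 0.1                                 # weight given to each new frame
history = clean + rng.normal(0.0, 0.2, 64)  # start from one noisy frame

for _ in range(32):
    # Each raw frame is the clean signal plus aliasing-like noise
    # (sample jitter, in a real renderer).
    noisy = clean + rng.normal(0.0, 0.2, 64)
    # Exponential moving average: blend the new frame into the history.
    history = alpha * noisy + (1.0 - alpha) * history

print(f"single-frame error: {np.abs(noisy - clean).mean():.3f}")
print(f"accumulated error:  {np.abs(history - clean).mean():.3f}")
```

The accumulated error comes out far lower, which is the whole trick; the ghosting and blur people complain about come from the reprojection and clamping steps this toy version leaves out. DLSS effectively swaps the hand-tuned blend for a learned one.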
I am very confused about the concept of “fake performance”. If the animation looks smooth to you then it’s smooth. None of it exists in real life. Like every newfangled visual tech, it’s super in-your-face until you get used to it. Frankly, I’ve stopped thinking about it in the games where I do use it, and I use it whenever it’s available. If you want to argue about increased latency we can talk about it, but I personally don’t notice it much in most games as long as it’s relatively consistent.
I do understand the feeling of having to worry about performance and being hyper-aware of it being annoying, but as we’ve litigated up and down this thread, that ship sailed for PC gaming. If you don’t want to have to worry, the real answer is getting a console, I’m afraid.
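On the latency point above, this is the back-of-envelope I carry around; the numbers are illustrative assumptions, not measurements:

```python
# Rough latency math for 2x interpolation-based frame generation.
# Illustrative numbers; real pipelines (Reflex, queue depth, etc.)
# shift these around.

base_fps = 60.0
frame_ms = 1000.0 / base_fps      # ~16.7 ms per rendered frame

# An interpolated frame sits between two rendered frames, so frame N
# can't be shown until frame N+1 exists: roughly one extra rendered
# frame of delay, even though the FPS counter doubles.
presented_fps = base_fps * 2
added_latency_ms = frame_ms

print(f"rendered:  {base_fps:.0f} fps ({frame_ms:.1f} ms/frame)")
print(f"presented: {presented_fps:.0f} fps, input latency up by ~{added_latency_ms:.0f} ms")
```

Which is why it feels fine on a high base framerate and rough on a low one: the penalty is roughly a whole base frame, and a whole base frame at 30 fps is a lot.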
This feels like it’s establishing a precedent for widespread adoption/implementation of AI in consumer devices. Manufactured consent.
“We compute one pixel… we hallucinate, if you will, the other 32.”
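For scale, here’s one way a figure like that can fall out of upscaling plus multi-frame generation; the internal resolutions and the 4x generation factor are my assumptions, not from the quote:

```python
# Back-of-envelope: "compute 1 pixel, hallucinate the other 32".
# Output resolution, generation factor, and internal resolutions
# are assumptions for illustration.

output_px = 3840 * 2160                 # one 4K frame: ~8.3M pixels
gen_factor = 4                          # 1 rendered + 3 generated frames
displayed_px = output_px * gen_factor   # ~33.2M pixels shown

for name, w, h in [("1080p internal", 1920, 1080),
                   ("720p internal", 1280, 720)]:
    rendered = w * h
    print(f"{name}: 1 computed pixel per ~{displayed_px / rendered:.0f} displayed")
```

Depending on the internal resolution you assume, that lands anywhere from 1-in-16 to 1-in-36; the quoted 1-in-33 sits in that range.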
Between this and things like Sora, we are doomed to drown in illusions of our own creation.
DLSS and FSR are not comparable.
“FSR looks like shit” is not the same thing as “upscaling looks like shit”.
HEYY!!!
thanks for the real discussion and not memeing. i appreciate you. lemmy appreciates you.
Ahh, so you don’t really know what you’re talking about.