The minimum recommended GPU for 540p 60FPS on medium settings is an RTX 3070.
Yes, 540p. Without ray tracing.
Tell me again how that’s a realistic expectation.
You’ll be able to play a new game with state of the art graphics at medium settings with a mid range GPU from 3 years ago. Seems fine to me.
We don’t know what medium settings actually mean. It may still look amazing.
For Control (2019), the first minimum system requirements that were released listed an Nvidia GeForce GTX 1060 from 2016. This was later lowered, but that’s pretty much in line with what we see now. They’re aiming for next-gen graphics. It worked for Control: the game was used as a benchmark for raytracing for years.
At 540p. That’s less than 720p. How the fuck is that even remotely acceptable? Medium settings at 1080p would be barely acceptable. But literally half the resolution? No way.
You still don’t know what medium means. Until you do, it’s pretty pointless to get worked up about linguistics.
I’m not worked up about linguistics, I’m worked up about the resolution. 540p anything will look like ass.
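For anyone checking the arithmetic here: “half the resolution” is only true per axis; in raw pixel terms 540p is a quarter of 1080p. A quick sketch, assuming the standard 16:9 dimensions for each label:

```python
# Pixel counts for the standard 16:9 resolutions being argued about.
resolutions = {
    "540p": (960, 540),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")
# 540p: 518,400 / 720p: 921,600 / 1080p: 2,073,600

# Half of 1080p per axis, but only a quarter of the pixels:
print(pixels["540p"] / pixels["1080p"])  # 0.25
```

So if anything the complaint understates it: at 540p the GPU is pushing a quarter of the pixels of native 1080p.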
Maybe the graphics are that intense, even at whatever they chose to call “medium”. Try “low” then and see how that works out.
You are worked up about linguistics because, right now, a label is all you have to go on. Just because you’re used to running every other game at “high” or whatever doesn’t mean this different game has to be the same.
I refuse to agree with “my midrange GPU has to be able to run everything at Ultra for #random number# of years or I am going ballistic”. I want progress. That’s what I buy new hardware for every few years. If you want graphics to stay stuck and never advance in any meaningful way, get a console.
What are you trying to imply here? We know that “medium” doesn’t mean any raytracing etc. So what techniques do you think they implemented that could make “medium” so unbelievably good?
My guy, what are you on about? I’m not saying that this game has to run on “high” on a specific configuration, I’m saying that 540p on medium is unacceptable for a last-gen GPU.
Why not go a step further and insinuate that I want Ultra-Mega-Hyper graphics on my 20 year old toaster? You’re already misrepresenting my words to hell and back.
Ah, I see. Graphics never got more complex or needed stronger hardware before raytracing was introduced. There isn’t any way to make a game demanding and pretty without raytracing.
If you think it’s not acceptable, then… don’t accept it, I guess?
It has been incredibly rare for a game to be released that was so far ahead of its time that it needed the highest-end hardware just to reach the commonly accepted minimum fidelity for the market. Of course, there have always been a bunch of games that ran equally badly on anything but the highest-end hardware, because they were badly optimized.
Raytracing is something that actually needs hardware improvements. Pretty much anything else can - and should - be optimized to run on older hardware at lower fidelity. You’re free to wait and see if you’re right, but everyone else can already see that this is a case of bad optimization.
Isn’t that because it’s being upscaled? So it’s not really 540p, is it?
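If so, the 540p would be the internal render resolution feeding a temporal upscaler, and the image on screen would be higher. A rough sketch of that relationship, assuming the per-axis scale factors FSR 2 / DLSS 2 use for their standard quality modes (which modes this game actually offers, and whether its figure means internal or output resolution, are assumptions here):

```python
# Per-axis render scale for common temporal-upscaler quality modes.
# These match FSR 2 / DLSS 2 defaults; whether this game uses the same
# ratios is an assumption.
MODES = {
    "quality": 1 / 1.5,            # ~0.667x per axis
    "balanced": 1 / 1.7,           # ~0.588x per axis
    "performance": 1 / 2.0,        # 0.5x per axis
    "ultra_performance": 1 / 3.0,  # ~0.333x per axis
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the game actually renders at before upscaling."""
    scale = MODES[mode]
    return round(out_w * scale), round(out_h * scale)

# 1080p output with the upscaler in performance mode:
print(internal_resolution(1920, 1080, "performance"))  # (960, 540)
```

On that reading, “RTX 3070 for 540p” would mean 1080p output with the upscaler in performance mode, which is a much less dramatic headline.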