I don’t like this line of thinking, especially now that new games so often suffer from performance issues; it lowers the bar these developers feel they need to clear for the experience their games offer.
I think the minimum standard should be at least 60fps. On the Steam Deck and other low-end hardware, concessions of course have to be made: either lower the graphics settings or live with lower framerates.
But there’s no reason a new game should be suffering poor framerates on modern desktop hardware (looking at you, Dragon’s Dogma 2).
Lower frame rates can be perfectly fine; I find I’m far more bothered by inconsistent frametimes.
The main reason 40fps feels fine on the Deck is that the display can come down to that same Hz and operate in lockstep.
I’ll take a consistent 60 over a hitchy 165 most of the time, though VRR means you can occupy kind of a middle ground. But even there, frametime inconsistencies can make for a shit experience.
My point is that game developers should aim to deliver games that render at similar framerates throughout.
So many of these recent games do hit decent framerates, but then there’s that one in-game location, enemy type, player ability, or particle effect that just makes the framerate completely shit itself.
It’s like these studios are designing each element within a given GPU budget, pushing things right up to the limit, and then doing a surprised Pikachu face when things run like shit once they try to put more than one of these elements together to make an actual game.
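To put toy numbers on it (purely hypothetical, not from any real profiler): each system can look fine against a 60fps budget in isolation and still sink the frame once they all fire at once.

```cpp
#include <cstdio>

int main() {
    // Hypothetical per-system GPU costs in milliseconds; each one fits the
    // 16.7 ms (60 fps) frame budget on its own, but not stacked together.
    struct Cost { const char* name; double ms; };
    const Cost systems[] = {
        {"environment geometry", 9.0},
        {"crowd / enemies",      6.5},
        {"player ability VFX",   5.0},
    };

    double total_ms = 0.0;
    for (const Cost& c : systems) {
        std::printf("%-22s %4.1f ms (alone: ~%.0f fps)\n", c.name, c.ms, 1000.0 / c.ms);
        total_ms += c.ms;
    }
    std::printf("%-22s %4.1f ms (combined: ~%.0f fps)\n", "all at once", total_ms, 1000.0 / total_ms);
}
```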
165 that dips to 100 is unquestionably better than 60 with no dips, especially with G-Sync.
165 that dips below 60 is very bad.
Yea I was about to say. I play games that stay around 120 on my hardware and dip to maybe 80 sometimes. It’s not that noticeable, especially during action and if the dips aren’t super sudden drops. But a dip from 60 to 45 is noticeable.
That depends. VRR works beautifully when you walk through in-game locations and the framerate smoothly shifts up and down.
What it’s absolutely shit at dealing with is VFX that cause the frametimes to spike by an order of magnitude for just a frame or two, something that’s common in games with a lot of player ability and enemy attack effects going off.
In those cases I will actually just turn VRR off and play at a lower framerate that’s consistently achievable.
VRR is nice, and I absolutely do use it most of the time, but its very nature means that the latency in the human processing loop that is hand-eye coordination becomes inconsistent. When the framerate is shifting around smoothly, that’s fine.
But the kind of hitching I’m talking about isn’t the kind where the overall framerate shifts; it’s the kind where just a couple of frames take orders of magnitude longer to render, and that interferes with my hand-eye coordination. I would have been in better shape to pull off a turn, shot, movement or whatever, had the game been running at that framerate the whole time.
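For what it’s worth, the difference shows up clearly if you just dump frame times and compare each one to a rolling average. Rough sketch of my own toy heuristic (made-up numbers, not from any real tool):

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical frame times in ms: mostly ~7 ms (~144 fps) with two
    // isolated spikes mixed in.
    std::vector<double> frame_ms = {7, 7, 8, 7, 7, 52, 7, 8, 7, 7, 7, 48, 7, 8};

    double avg = frame_ms.front();  // running average of "normal" frames
    for (std::size_t i = 1; i < frame_ms.size(); ++i) {
        if (frame_ms[i] > 4.0 * avg) {
            // A single frame several times longer than its neighbours is the
            // hitching I mean, as opposed to the average itself drifting.
            std::printf("frame %zu: %.0f ms hitch (recent average %.1f ms)\n",
                        i, frame_ms[i], avg);
        } else {
            avg = 0.9 * avg + 0.1 * frame_ms[i];
        }
    }
}
```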
Scenes in most games vary a lot in complexity, so the way you’d achieve consistent framerates is by getting your baseline quite a bit higher than your target FPS and then capping FPS at that target. That way the game won’t be using close to 100% of the GPU most of the time, but peaks in scene complexity won’t push FPS below the cap.
This is how it works, or at least used to work, for a lot of games on console. On PC, you almost always have to make the choice yourself (which is a good thing if you ask me).
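Conceptually that’s all a frame cap does; a minimal sketch assuming a generic game loop rather than any particular engine’s API:

```cpp
#include <chrono>
#include <thread>

// Hypothetical per-frame work, standing in for simulation + draw calls.
void update_and_render();

// Minimal sketch of the cap-with-headroom idea. Pick a cap the hardware
// clears with room to spare, so heavy scenes eat into the idle time
// instead of dropping below the cap.
void run_capped(double cap_fps, const bool& running) {
    using clock = std::chrono::steady_clock;
    const std::chrono::duration<double> frame_budget(1.0 / cap_fps);

    while (running) {
        const auto start = clock::now();
        update_and_render();

        // Sleep off whatever is left of the budget; on a typical frame that's
        // most of it, on a complex scene close to none of it. (Real limiters
        // usually spin for the last millisecond, since sleep isn't precise.)
        const auto elapsed = clock::now() - start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed);
    }
}
```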
For many games with a lot of changing scenery I have to target around 45 FPS with the graphics settings to even have a chance at a somewhat consistent 30 FPS (33.3ms) on the Deck.
On the one hand, the Deck is heavily underpowered compared to even lower-end PCs. On the other hand, tests show the Z1 Extreme/7840U isn’t much faster at these lower wattages (10-15W TDP), so there hasn’t been a lot of progress yet.
But it’s also that many games don’t scale well anymore. I feel like half the settings in many modern games don’t affect performance to any noticeable degree, and even fewer affect CPU usage. And where low settings do exist, the game often looks unrecognizable, because the lower-setting models, textures and lighting/shadows are simply generated by the engine SDK and rarely given a second thought.
Tech like Nanite does have the potential to solve some of that variability. But even before that, LODs, detail render distance limits, etc. already allow frame rates to be levelled out, if they’re actually used.
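Even before Nanite, the classic version of that levelling is just swapping to cheaper LODs with distance. A bare-bones sketch with made-up thresholds (nothing engine-specific):

```cpp
#include <array>
#include <cstddef>

// Bare-bones distance-based LOD pick (hypothetical thresholds, not any
// engine's actual API). The further away an object is, the cheaper the mesh
// it gets, which keeps per-frame geometry cost roughly bounded as the
// scenery changes.
std::size_t pick_lod(float distance_m) {
    // Max distance (metres) at which each LOD is still used; LOD 0 is the
    // full-detail mesh, LOD 3 the cheapest.
    constexpr std::array<float, 4> lod_max_distance = {25.f, 75.f, 200.f, 500.f};

    for (std::size_t i = 0; i < lod_max_distance.size(); ++i)
        if (distance_m <= lod_max_distance[i])
            return i;
    return lod_max_distance.size();  // beyond the last threshold: cull or use an impostor
}
```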
And I would consider 30 and 45 within that “similar” range. I’m not asking the framerate to stay within even 10% of an average at all times. But games are getting a lot worse than that.
A recent game even my desktop has been struggling with is Forbidden West: I tuned the settings to get 80-100 fps, yet in some locations (larger settlements) it chugs down to 20-30.
Some newer games aren’t just losing 33% of their fps between best and worst case, but more like 70%. At that point you end up having to target 200fps just to never drop below 60, and that’s tricky even on high-end desktops.
Eh, not caring too much about frame rates can be healthy in terms of how long you go between upgrades. Which could have a knock-on effect of forcing devs to consider weaker builds more often, since people aren’t upgrading as much.
Just depends on how you approach it. If you have relatively new parts, then yeah, you should expect at least 60 FPS. If you have an older system, then the only thing that matters is that it’s still enjoyable.
Now that they don’t have to optimize for last-gen console hardware anymore, that’s going to be even rarer for any triple-A game. Even a well-optimized PS5 game is going to seriously struggle to run on the Deck: even if you reduce the graphical settings, the PS5 essentially has an 8-core version of the Deck’s 4-core CPU.
Combine that with the 15W shared TDP limit and the game would basically have to be able to run on roughly 25% of the PS5’s CPU budget (half the cores, at lower sustained clocks).