• Lucy :3@feddit.org · 48 minutes ago

    Most games made in UE are AAA games, where every A stands for more scam, more jankiness, and less value overall. Very rushed, no love, made to barely work on “my machine” (4090). Many Unity games are smaller cash grabs.

    Most of the devs that fulfill at least one criterion well (e.g. gameplay, performance, stability) are either small studios with their own engine (4A Games, Croteam, Mojang with Minecraft), or publishers with one banger per 5 years or so: Valve (lost it with CS2 tho), Rockstar. That’s because those devs put love, time, or both into their games.

  • MangoPenguin · 45 minutes ago (edited)

    I rarely have a good time with UE4/UE5 games: performance is often rough, and while the graphics are ‘better’ on a technical level, I often don’t think they look as pleasant or feel as immersive as older games.

  • drosophila · 5 hours ago (edited)

    I’m going to sound a little pissy here, but I think most of what’s happening is that console hardware was so limited for such a long time that PC gamers got used to being able to max out their settings and still get 300 FPS.

    Now that consoles have caught up and cranking the settings actually lowers your FPS like it used to, people are shitting themselves.

    If you don’t believe me then look at these benchmarks from 2013:

    https://pcper.com/2013/02/nvidia-geforce-gtx-titan-performance-review-and-frame-rating-update/3/

    https://www.pugetsystems.com/labs/articles/review-nvidia-geforce-gtx-titan-6gb-185/

    Look at how spiky the frame time graph was for Battlefield 3. Look at how, even with triple SLI Titans, you couldn’t hit a consistent 60 FPS in maxed-out Hitman: Absolution.
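
    To put rough numbers on why a spiky frame time graph matters (illustrative arithmetic only, not data from the linked reviews): at 60 FPS every frame has a ~16.7 ms budget, so a single 50 ms frame is a visible hitch even when the average over the window still reads as a healthy framerate. A minimal C++ sketch of that:

    ```cpp
    // Made-up frame times (ms) for illustration: mostly ~100 FPS with one 50 ms hitch.
    #include <cstdio>
    #include <numeric>
    #include <vector>

    int main() {
        std::vector<double> frame_ms = {10, 11, 10, 10, 50, 10, 11, 10, 10, 11};
        double total_ms = std::accumulate(frame_ms.begin(), frame_ms.end(), 0.0);
        double avg_fps  = 1000.0 * frame_ms.size() / total_ms;      // average over the window
        int over_budget = 0;
        for (double ms : frame_ms)
            if (ms > 1000.0 / 60.0) ++over_budget;                   // frames missing the 16.7 ms budget
        // Prints roughly: average FPS: 69.9, frames over 60 FPS budget: 1 of 10
        std::printf("average FPS: %.1f, frames over 60 FPS budget: %d of %zu\n",
                    avg_fps, over_budget, frame_ms.size());
    }
    ```

    The average looks fine; the single spike is what you actually feel as a stutter.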

    And yeah, I know high end graphics cards are even more expensive now than the Titan was in 2013 (due to the ongoing parade of BS that’s been keeping GPU prices high), but the systems in those reviews are close to the highest end hardware you could get back then. Even if you were a billionaire you weren’t going to be running Hitman much faster (you could put one more Titan in SLI, which had massively diminishing returns, and you could overclock everything maybe).

    If you want to prioritize high and consistent framerate over visual fidelity / the latest rendering tech / giant map sizes then that’s fine, but don’t act like everything was great until a bunch of idiots got together and built UE5.

    EDIT: the shader compilation stuff is an exception. Games should not be compiling shaders during gameplay. But that problem isn’t limited to UE5.

    • ILikeBoobies@lemmy.ca · 50 minutes ago

      EDIT: the shader compilation stuff is an exception. Games should not be compiling shaders during gameplay. But that problem isn’t limited to UE5.

      You can preload them if you want, but that leads to load screens. It’s a developer issue, not an Unreal one.

      • The_Decryptor@aussie.zone · 24 minutes ago

        No matter what, you’ve got to compile the shaders at some point, either on launch or when they’re first needed. The game should be caching the results of that step though, so the next time they’re needed the compile can be skipped entirely.
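
        A minimal sketch of that compile-once-then-cache pattern (generic C++ with a stubbed compile step and a made-up on-disk layout, not Unreal’s actual shader/PSO cache):

        ```cpp
        // Toy shader cache: hash the source, reuse a cached binary if present,
        // otherwise "compile" and write the result to disk for next launch.
        // Generic illustration only, not Unreal's shader/PSO cache.
        #include <cstdio>
        #include <filesystem>
        #include <fstream>
        #include <functional>
        #include <iterator>
        #include <string>

        // Placeholder for the real (slow) driver/engine compile step.
        std::string CompileShaderSource(const std::string& source) {
            return "BINARY(" + source + ")";  // stand-in for compiled bytecode
        }

        std::string GetOrCompile(const std::string& source, const std::filesystem::path& cache_dir) {
            std::filesystem::create_directories(cache_dir);
            auto path = cache_dir / (std::to_string(std::hash<std::string>{}(source)) + ".bin");

            if (std::ifstream in{path, std::ios::binary}) {           // cache hit: skip compilation
                return std::string(std::istreambuf_iterator<char>(in),
                                   std::istreambuf_iterator<char>());
            }
            std::string binary = CompileShaderSource(source);         // cache miss: compile once...
            std::ofstream{path, std::ios::binary} << binary;          // ...and persist for next time
            return binary;
        }

        int main() {
            // First run compiles and caches; subsequent runs reuse the file on disk.
            std::printf("%s\n", GetOrCompile("frag_main() { ... }", "shader_cache").c_str());
        }
        ```

        On a first run the compile cost is paid once per shader; later runs hit the cache file and skip the compile entirely.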

  • verdigris@lemmy.ml · 8 hours ago

    I don’t agree with this at all. I’m sure there are projects where it wasn’t a great choice, but I’ve had no consistent problems with UE5 games, and in several cases the games look and feel better after switching – Satisfactory is a great example.

    • RightHandOfIkaros@lemmy.world · 5 hours ago (edited)

      Dead by Daylight switched to UE5 and immediately had noticeably bad performance.

      Silent Hill 2 Remake is made in UE5 and also has bad performance stuttering, though Bloober is famously bad at optimization, so it’s possible that’s just Bloober being Bloober.

      STALKER 2 is showing questionable performance even on high-end PCs, and that is also made in UE5.

      Now, just because the common denominator in all these examples is UE5 doesn’t mean that UE5 is the cause, but it is certainly quite the coincidence.

      • a1studmuffin@aussie.zone · 4 hours ago

        It’s the responsibility of the game developer to ensure their game performs well, regardless of engine choice. If they release a UE5 game that suffers from poor performance, that just means they needed to spend more time profiling and optimising their game. UE5 provides a mountain of tooling for this, and developers are free to make engine-side changes, since the full engine source is available.

        Of course Epic should be doing what they can to ensure their engine is performant out of the box, but they also need to keep pushing technology forward, which means things may run slower on older hardware. They don’t define a game’s min-spec hardware; the developer does.

  • S_H_K@lemmy.dbzer0.com · 8 hours ago

    I think the main problem is how the industry became a crunching machine. Unreal has been sold as a one-size-fits-all solution, whereas obviously there are things it does well and others it doesn’t.