You heard him, 4090 users: upgrade to a more powerful GPU.

    • Scrubbles@poptalk.scrubbles.tech · 13 points · 1 year ago

      I mean, there isn’t one thing you can point to and say “aha, that’s causing all the lag”; things just take up more space, more compute power, and more memory as a game grows. As hardware capabilities grow, software will find a way to utilize them. But if you want a few examples:

      • Textures are larger. 4K was just getting rolling in 2017 (pre-RDR2, after all), and textures had to be scaled up to accommodate it (remember that’s width and height, so 4x the memory and 4x the space on the drive; see the quick math after this list).
      • Engines have generally grown more high-fidelity: more particles, more fog, raytracing (not in Starfield, but it’s younger than 2017), etc. All of these higher-fidelity features require more compute power. Take anti-aliasing, for example: it’s always something like 8x, but that’s 8x the resolution, and resolutions themselves have only gone up over time.
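
      Quick napkin math on the texture point, assuming a plain uncompressed RGBA8 texture with a full mip chain (real games use block compression, but the “double the resolution, quadruple the memory” ratio is the same):

      ```python
      # Rough sketch: memory for a square, uncompressed RGBA8 texture including
      # a full mip chain (~1/3 extra). Real games compress textures, but the
      # "double the resolution -> ~4x the memory" scaling holds either way.

      def texture_mib(size_px: int, bytes_per_pixel: int = 4) -> float:
          base = size_px * size_px * bytes_per_pixel
          return base * 4 / 3 / (1024 ** 2)

      for size in (1024, 2048, 4096):
          print(f"{size}x{size}: ~{texture_mib(size):.0f} MiB")
      # 1024x1024: ~5 MiB, 2048x2048: ~21 MiB, 4096x4096: ~85 MiB
      ```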

      I don’t know what you want. A list of everything that’s happened since then? Entire engines have come and gone in that time. Engines we used back then are at least a full version newer now, Starfield’s included. I don’t understand what you’re asking, because to me it comes off as “Yeah, well, Unreal 5 has the same settings as 4, so it’s basically the same.”

      • Edgelord_Of_Tomorrow@lemmy.world (OP) · 4 points · edited · 1 year ago

        Textures are larger. 4K was just getting rolling in 2017 (pre-RDR2, after all), and textures had to be scaled up to accommodate it (remember that’s width and height, so 4x the memory and 4x the space on the drive).

        Texture resolution has not considerably affected performance since the 90s.

        Changing graphics settings in this game barely affects performance anyway.

        Take anti-aliasing, for example: it’s always something like 8x, but that’s 8x the resolution, and resolutions themselves have only gone up over time.

        Wtf are you talking about? Nobody uses SSAA these days. TAA has basically no performance penalty, and FSR actually improves performance when it’s used.
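
        To put rough numbers on that (my sketch, assuming a 1440p target and FSR’s quality mode rendering at about 67% per axis): supersampling shades many times more pixels, TAA shades the native count and reuses previous frames, and FSR-style upscaling shades fewer pixels and scales the result up.

        ```python
        # Shaded-pixel counts per frame at 1440p. Pixel count is only a proxy
        # for cost, but it shows why nobody brute-force supersamples any more.

        WIDTH, HEIGHT = 2560, 1440
        native = WIDTH * HEIGHT

        workloads = {
            "native, no AA": native,
            "8x SSAA": native * 8,                  # brute-force supersampling
            "TAA": native,                          # shades native res, reuses history
            "FSR quality": (WIDTH * 2 // 3) * (HEIGHT * 2 // 3),  # ~67% render, upscaled
        }

        for name, pixels in workloads.items():
            print(f"{name:14s} {pixels / native:5.2f}x native shading work")
        ```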

        If you’re going to try and argue this point, at least understand what’s going on.

        The game is not doing anything that other games haven’t achieved in a more performant way. They have created a teetering mess of a game that barely runs.

        • Scrubbles@poptalk.scrubbles.tech · 10 points · 1 year ago

          Texture resolution has not considerably affected performance since the 90s.

          If this were true, there wouldn’t be low-resolution textures at lower settings; higher resolutions take up dramatically more space, memory, and compute time (every doubling of texture resolution quadruples the memory). I’m definitely not going to be re-learning what I know about games from Edgelord here.
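
          For what it’s worth, one common way engines implement a lower texture setting is simply not loading the top mip level(s); skipping one cuts that texture’s memory to roughly a quarter. A minimal sketch (generic illustration, not anything specific to Starfield’s engine):

          ```python
          # Memory for a square RGBA8 texture when the highest mip level(s) are
          # skipped, which is one common way a "texture quality" slider works.

          def mip_chain_mib(top_size: int, skip_top_mips: int = 0) -> float:
              size = top_size >> skip_top_mips   # each skipped mip halves width and height
              total = 0
              while size >= 1:
                  total += size * size * 4       # RGBA8, 4 bytes per pixel
                  size //= 2
              return total / (1024 ** 2)

          for skip in (0, 1, 2):
              print(f"skip {skip} mip(s): ~{mip_chain_mib(4096, skip):.0f} MiB")
          # skip 0: ~85 MiB, skip 1: ~21 MiB, skip 2: ~5 MiB
          ```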

        • avater@lemmy.world · 3 points · 1 year ago

          Texture resolution has not considerably affected performance since the 90s.

          lol. Try to play a game with 4K textures at 4K on an NVIDIA graphics card without enough VRAM and you’ll see how it affects your performance 😅
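
          Rough illustration of why running out of VRAM hurts so much (ballpark spec-sheet numbers, not measurements from this game): anything that spills into system RAM has to come across PCIe, which is an order of magnitude slower than on-card memory.

          ```python
          # Ballpark: time to pull 1 GiB of texture data from VRAM vs. from
          # system RAM over PCIe. Spec-sheet figures; real behaviour varies.

          VRAM_GBPS = 512        # rough memory bandwidth of a high-end card
          PCIE4_X16_GBPS = 32    # theoretical PCIe 4.0 x16 throughput

          spilled_gib = 1.0      # assume 1 GiB of textures didn't fit in VRAM

          print(f"from VRAM: ~{spilled_gib / VRAM_GBPS * 1000:.1f} ms")
          print(f"over PCIe: ~{spilled_gib / PCIE4_X16_GBPS * 1000:.1f} ms (a 60 fps frame is ~16.7 ms)")
          ```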

          I wouldn’t say Starfield is optimized as hell, but I think it runs reasonably, and many people will fall flat on their asses in the coming months when they realize their beloved “high-end rig” is mostly dated as fuck.

          To run games on newer engines (like UE5) with acceptable framerates and details, you need a combination of modern components, not just a “beefy” GPU…

          So yeah, get used to low framerates if your components are from like 4 years ago.

          Changing graphics settings in this game barely affects performance anyway.

          That sounds like you’re CPU bound…
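
          A quick way to sanity-check that (generic rule of thumb, not specific to Starfield): drop the render resolution or resolution scale a lot; if the frame rate barely moves, the GPU wasn’t the bottleneck. Minimal sketch:

          ```python
          # Crude bottleneck check: if dropping the render resolution barely
          # changes the frame rate, the GPU wasn't the limiter -- the CPU (or
          # driver overhead) is.

          def likely_bottleneck(fps_native: float, fps_low_res: float) -> str:
              return "CPU / driver bound" if fps_low_res / fps_native < 1.15 else "GPU bound"

          print(likely_bottleneck(fps_native=60, fps_low_res=63))   # CPU / driver bound
          print(likely_bottleneck(fps_native=60, fps_low_res=95))   # GPU bound
          ```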

            • avater@lemmy.world · 4 points · 1 year ago

              I don’t know and I don’t care what’s wrong with your system, but the AMD driver tells me I’m averaging 87 fps at high details on a 5800X and a Radeon 6900, a system that is now two years old, and I think that’s just fine for 1440p.

              So yeah, the game is not unoptimized. Sure, it could use a few patches and performance will get better (remember, it’s a fucking Bethesda game, for Christ’s sake…), but for many people the answer will be to upgrade their rig or play on Xbox.

          • regbin_@lemmy.world · 1 point · 1 year ago

            The game might be much more CPU bound on Nvidia cards, probably due to shitty Nvidia drivers.

            I have a 5800X paired with a 3080 Ti and I can’t get my frame rate any higher than the 60s in cities.

            • avater@lemmy.world · 1 point · 1 year ago

              Sorry to hear that. No problems here with my AMD card, but I’ve been team AMD all my life, so I have no experience with NVIDIA cards and their drivers.