• Smorty [she/her]

    Gaming on lower resolution because it’s a pixel art game

  • @Maggoty@lemmy.world
    • Gaming on lower resolution because electricity prices are high.

    • Gaming on lower resolution because it removes obstacles and gives a tactical advantage.

    • @CucumberFetish@lemm.eeOP
      • Gaming on a new GPU because it’s faster
      • Gaming on an old GPU because its VRAM is dying and sometimes the walls disappear
  • @antidote101@lemmy.world

    Gaming on low resolution because the game is premised on low poly fun, not high end graphics covering for mediocre gameplay.

    • @RageAgainstTheRich@lemmy.world

      It ticks me off when I see Twitch streamers with the fps counter displayed and it's running at 300+ fps. What a waste of electricity, money, and hardware.

      • Miss Brainfarts

        Eh, more frames mean lower latency, but up to what point that still makes sense is a whole other story.

        • Norah - She/They

          for pro players

          Exactly. Most people aren’t pro players however. I’ve also seen this with games like Baldur’s Gate fwiw.

          Not only that, but people seem to do it sometimes just to flex, when turning on VSync is an objectively better experience with less screen tearing. There's ZERO reason to drive 300 fps when your monitor is only capable of 144 Hz.

            • Norah - She/They

              You know the VSync setting has to be enabled in a lot of games for VRR to work, right?

    • @CucumberFetish@lemm.eeOP

      Already set. I'm not that competitive a player, and my reflexes are worse than a sloth's, so I didn't even bother buying a monitor with a refresh rate higher than 60 Hz.

  • @sploosh@lemmy.world

    My 7800xt pulls about 230 watts at full bore, giving me my monitor’s refresh rate in FPS, 144. Limiting the framerate to 72 results in no tearing and drops the GPU watts to 170. Worth it.
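    A quick back-of-the-envelope check of what that cap saves, using the wattages quoted above (the electricity price below is just an assumed placeholder):

    ```python
    # Rough savings from capping 144 fps down to 72 fps, using the wattages above.
    uncapped_w = 230        # GPU draw at 144 fps (watts)
    capped_w = 170          # GPU draw at 72 fps (watts)
    price_per_kwh = 0.30    # assumed electricity price in $/kWh; adjust for your area

    saved_kwh_per_hour = (uncapped_w - capped_w) / 1000
    print(f"~{saved_kwh_per_hour:.2f} kWh saved per gaming hour, "
          f"about ${saved_kwh_per_hour * price_per_kwh:.3f}/h at ${price_per_kwh}/kWh")
    ```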

        • @sploosh@lemmy.world

          Longevity. I build a computer once every 6-10 years and don’t do much upgrading in between. Buying a beefier GPU means it’ll play new games well for a good long while.

          And it's not like I'm doing this all the time. I was curious about the power usage, so I made a script to monitor it and started tinkering with settings to see what the delta would be. I'm at 100% ultra, max resolution 95% of the time.
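          Not the script from the post, but a minimal sketch of how such monitoring can be done on Linux with an AMD card, assuming the amdgpu driver exposes a power1_average hwmon sensor (reported in microwatts); the sysfs path can vary between systems:

          ```python
          # Minimal GPU power logger for Linux/amdgpu (sketch, not the original script).
          # Assumes the driver exposes power1_average via hwmon, in microwatts.
          import glob
          import time

          def find_power_sensor() -> str:
              paths = glob.glob("/sys/class/drm/card*/device/hwmon/hwmon*/power1_average")
              if not paths:
                  raise FileNotFoundError("no amdgpu power sensor found")
              return paths[0]

          def main() -> None:
              sensor = find_power_sensor()
              while True:
                  with open(sensor) as f:
                      microwatts = int(f.read().strip())
                  print(f"GPU power draw: {microwatts / 1_000_000:.1f} W")
                  time.sleep(1)

          if __name__ == "__main__":
              main()
          ```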

        • @dlok@lemmy.world

          I'm just thinking of using what I already have… I also suspect there's an efficiency sweet spot somewhere below running at full capacity.

          • Miss Brainfarts

            Oh, absolutely. The higher-end you go, the more the product is tuned for balls-to-the-wall power with no consideration for efficiency.

            As the extreme example, the 4090 can deliver almost 94% of its full potential when limiting its power draw to about 70% and/or undervolting it.

            (Not exactly sure about the numbers, but you get the idea)
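            For what it's worth, on NVIDIA cards a cap like that is usually applied with nvidia-smi. A rough sketch (the 70% factor is just the ballpark from the comment above, and changing the limit needs root):

            ```python
            # Sketch: cap an NVIDIA GPU's power limit to ~70% of its maximum via nvidia-smi.
            import subprocess

            def query_watts(field: str) -> float:
                out = subprocess.run(
                    ["nvidia-smi", f"--query-gpu={field}", "--format=csv,noheader,nounits"],
                    capture_output=True, text=True, check=True,
                )
                return float(out.stdout.strip().splitlines()[0])

            max_limit_w = query_watts("power.max_limit")
            target_w = round(max_limit_w * 0.70)   # ballpark factor from the comment above
            print(f"Setting power limit to {target_w} W (hardware max {max_limit_w:.0f} W)")
            subprocess.run(["nvidia-smi", "-pl", str(target_w)], check=True)  # needs root
            ```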

  • @taiyang@lemmy.world

    Retro gaming… Low res… CRT filter with the warped edges and blooooooom effects… this is the ideal way to play your SNES games. Try it with Super Metroid, that shit is straight up unnerving and beautiful.

  • Smorty [she/her]

    I have a watt meter right on my PC plug next to my monitor so I can always see how much I consume. It's crazy how much the monitors alone take up, it's like 40 KW/h each. I'm considering removing one of them.

      • @azertyfun@sh.itjust.works

        Neither hopefully. The former at least is a unit of power, but 40 kW is enough to heat up a whole apartment building.

        In reality a large and older monitor might use a couple hundred watts. A small modern 24" will probably use closer to 50 W (guesstimating), which is still a decent chunk of the power draw of a budget build.

    • @sploosh@lemmy.world

      kWh is a measure of total energy, not instantaneous power. Your watt meter was saying that since its last reset it has measured 40 kWh of energy use. That's not an insignificant amount: a Chevy Bolt can go around 180 miles on 40 kWh. Watts, or kilowatts, are instantaneous power. That same Bolt can easily pull 100 kW while accelerating, and if it could somehow do that for an hour, it would have used 100 kWh. It could never make it the whole hour, though, as it has a 65 kWh battery, so it would run out after 39 minutes.

      • @bob_lemon@feddit.de

        What you're describing is kWh, not kW/h. You need to multiply power by time to get energy. An appliance using 1 kW of power for 1 h "uses" 1 kWh of energy. The same appliance running for 2 h uses 2 kWh instead.

        kW/h doesn’t really make sense as a unit, although it could technically describe the rate at which energy consumption changes over time.
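        In code form it's just multiplication; the numbers below are the Bolt figures quoted in the comment above:

        ```python
        # energy (kWh) = power (kW) * time (h)
        power_kw = 100      # the Bolt pulling 100 kW while accelerating
        battery_kwh = 65    # its battery capacity

        hours_until_empty = battery_kwh / power_kw
        print(f"{battery_kwh} kWh / {power_kw} kW = {hours_until_empty:.2f} h "
              f"(about {hours_until_empty * 60:.0f} minutes)")
        ```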

        • @sploosh@lemmy.world

          Autocorrect seems to disagree, but autocorrect doesn’t know shit about power vs energy. Fixed it.

    • @saigot@lemmy.ca

      A typical wall outlet can only supply about 1800 W (1.8 kW), so there's no way it's drawing 40 kW (and kW/h is a nonsense unit in this context). If it's drawing 40 W, that's actually quite low; a typical monitor is closer to 80-100 W while powered on.

      Where I live electricity is about 10 ¢/kWh (cheap, I know), so a 100 W monitor costs me about a cent an hour. More than worth it imo, but you make your own decisions.
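      The same arithmetic for that monitor estimate, with the wattage and price straight from the comment:

      ```python
      # Hourly cost of a monitor: watts -> kW, times hours, times price per kWh.
      monitor_w = 100        # typical powered-on draw quoted above (watts)
      price_per_kwh = 0.10   # ~10 cents per kWh

      cost_per_hour = (monitor_w / 1000) * 1 * price_per_kwh
      print(f"${cost_per_hour:.2f} per hour")   # about a cent an hour
      ```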

  • @Hootz@lemmy.ca

    This is why I do love that my PC is powered by renewable energy. It blows my mind how expensive power is everywhere else, plus I don’t wanna game if it means I gotta roll coal like huge parts of the world.

  • Ms. ArmoredThirteen

    I game on lower resolution because a lot of modern games are too hyper-detailed for me and I get lost in the crisp information density. That, and I hate the sound of computer fans.

    • @morrowind@lemmy.ml

      I get lost in the crisp information density.

      Man you’re gonna hate this thing called the real world. I hear the pixels are mere nanometers across

      • Ms. ArmoredThirteen

        That's the crisp part. A lot of modern games seem to be obsessed with making every single pixel pop out at you. Rummaging around outside is not like that; it's softer. A real-world comparison would be something like malls, which are obsessed with making every inch of visible space distinctly pop. I also hate being in malls.

    • VaultBoyNewVegas

      I've been replaying FFXIV recently, and I had the game running at the maximum refresh rate of my monitor, which was making the fans run harder. It took me way too long to realise that I should just turn the refresh rate setting down instead of altering fan speeds.

  • Miss Brainfarts

    But seriously though, I cut a good 30-40% off my GPU's power limit and would you look at that, I still enjoy the games I play.

    Have to run most of them at low to medium settings anyway, so might as well.

    Only thing left to improve it further would be undervolting, I should try that at some point.

    • @CucumberFetish@lemm.eeOP

      Undervolt it too. Depending on what GPU you have, you might drop an additional 20-40% without a performance hit. An older GTX 1070 I used to have dropped its power consumption by 40%. The energy savings weren't that big, but it was nice and quiet.

      • Miss Brainfarts

        Currently using a GTX 970 on EndeavourOS. I believe tuxclocker can do undervolting; haven't tried it yet.

        • @CucumberFetish@lemm.eeOP

          Undervolting has the added benefit of reducing silicon degradation. I don't know how much it'll help your GPU, considering its age, but it's something.

          • Miss Brainfarts

            I’ve put it through absolute overclocking hell, and it still runs as happily as it always did, so there’s that.

  • @Jourei@lemm.ee

    Games that drop their framerate when the game loses focus are fun. Something like Wurm or RS doesn't need a fancy framerate when I'm not even looking!