I use a 1080p monitor, and I’ve noticed that once creators start uploading 4K content, the 1080p version I watch in fullscreen has more artifacting than when they only uploaded in 1080p.

Did you notice that as well?

Watching in 1440p on a 1080p monitor results in a much better image, at the cost of a theoretically less sharp picture (from the downscaling) and a lot higher CPU usage.

  • Maxy · 5 points · 6 days ago

    About the “much higher CPU usage”: I’d recommend checking that hardware decoding is working correctly on your device, as that should ensure that even 4K content barely hits your CPU.
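
    If you’re on Linux with VA-API, here’s a rough sketch of one way to check what your GPU driver can actually decode in hardware. This is just my own illustration: it assumes `vainfo` from libva-utils is installed, and the exact profile names depend on your driver.

    ```python
    # Rough sketch: parse `vainfo` (libva-utils) output to see which codec
    # profiles the driver exposes. "VAEntrypointVLD" means the GPU offers
    # full hardware decode for that profile.
    # Assumption: a Linux system with working VA-API drivers and vainfo installed.
    import subprocess

    def hw_decode_profiles():
        out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout
        profiles = {}
        for line in out.splitlines():
            if "VAProfile" in line and ":" in line:
                profile, entrypoint = (p.strip() for p in line.split(":", 1))
                profiles.setdefault(profile, set()).add(entrypoint)
        return profiles

    if __name__ == "__main__":
        found = hw_decode_profiles()
        for codec in ("H264", "VP9", "AV1"):
            decodable = any(codec in name and "VAEntrypointVLD" in eps
                            for name, eps in found.items())
            print(f"{codec}: {'hardware decode' if decodable else 'no hardware decode'}")
    ```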

    About the “less sharp image”: this depends on your downscaler, but a proper downscaler shouldn’t make higher-resolution content any more blurry than the lower-resolution version. I do believe integer scaling (e.g. 4K -> 1080p) is a lot less dependent on having a proper downscaler, so consider bumping the resolution up even further if the video, your internet, and your client allow it.
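
    To make the integer-scaling point a bit more concrete: 3840×2160 -> 1920×1080 is an exact 2× ratio, so each output pixel can simply be the average of a 2×2 block of source pixels, while 2560×1440 -> 1920×1080 is a non-integer 1.33× ratio that forces fractional resampling. A minimal sketch of the 2× case (my own illustration, not what any particular player does internally):

    ```python
    # Minimal illustration of 2x integer (box) downscaling, e.g. 4K -> 1080p.
    # Each output pixel is the mean of a 2x2 block of input pixels, so no
    # fractional resampling is needed (unlike 1440p -> 1080p, a 1.33x ratio).
    import numpy as np

    def box_downscale_2x(frame: np.ndarray) -> np.ndarray:
        """frame: (height, width, channels) with height and width divisible by 2."""
        h, w, c = frame.shape
        blocks = frame.reshape(h // 2, 2, w // 2, 2, c)
        return blocks.mean(axis=(1, 3)).astype(frame.dtype)

    frame_4k = np.random.randint(0, 256, size=(2160, 3840, 3), dtype=np.uint8)
    frame_1080p = box_downscale_2x(frame_4k)
    print(frame_1080p.shape)  # (1080, 1920, 3)
    ```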

    • sexy_peach@feddit.org (OP) · 1 point · 5 days ago

      I was just guessing about the higher CPU usage. You’re probably right that it doesn’t matter.

    • Peter1986C@lemmings.world · 2 points · 6 days ago

      YouTube pushes the AV1 “format” heavily these days, which is hard to decode using hardware acceleration, given that a lot of devices out there still do not support it.

      • kevincox@lemmy.ml · 3 points · 6 days ago

        which is hard to decode using hardware acceleration

        This is a little misleading. There is nothing fundamental about AV1 that makes it hard to decode, support is just not widespread yet (mostly because it is a relatively new codec).

        • Peter1986C@lemmings.world · 1 point · 6 days ago

          I mean, given that many devices do not support accelerating it, it is in practice “hard to accelerate” unless you add a new gfx card or buy a new device.

          I may not have worded it optimally (I’m a second-language speaker), but I am sure it was fairly clear what I meant. 🙂

          • kevincox@lemmy.ml · 3 points · 6 days ago

            I wouldn’t call a nail hard to use because I don’t have a hammer. Yes, you need the right hardware, but there is no difference in the difficulty. But I understand what you are trying to say, just wanted to clarify that it wasn’t hard, just not widespread yet.

      • Maxy · 1 point · 5 days ago

        Good point, though I believe you have to explicitly enable AV1 in Firefox for it to advertise AV1 support. YouTube on Firefox should fall back to VP9 by default (which is supported by a lot more accelerators), so not being able to decode AV1 shouldn’t be a problem for most Firefox users (and by extension most Lemmy users, I assume).
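
        (For reference: if I remember correctly, the relevant about:config preference is `media.av1.enabled`; setting it to false should stop Firefox from advertising AV1, so YouTube falls back to VP9 or H.264. You can check which codec a video is actually using via right-click -> “Stats for nerds”, which shows something like av01, vp09 or avc1.)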

        • Peter1986C@lemmings.world · 1 point · 4 days ago

          I am running mostly Firefox or LibreWolf on Linux these days, but I do not remember having to enable it. Not all of my systems support accelerating AV1 in hardware, but they do play 1080p (though with frame drops above 30 fps on the unaccelerated computer). But yeah, I do hope YT keeps VP9 around because of the acceleration.