For three years there has been a bug report around 4K@120Hz being unavailable via HDMI 2.1 on the AMD Linux driver.

The wait continues…
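
As a sanity check while the fix is pending, the kernel lists each connector’s detected modes under /sys/class/drm, so you can see what the amdgpu driver is actually offering over HDMI. A minimal sketch in Python (connector names like card0-HDMI-A-1 vary per system, and the modes file only lists resolutions; refresh-rate details need something like modetest or a look at the EDID):

    from pathlib import Path

    # Walk every HDMI connector the DRM subsystem knows about and print
    # its connection status plus the mode list the driver exposes.
    for connector in sorted(Path("/sys/class/drm").glob("card*-HDMI-A-*")):
        status = (connector / "status").read_text().strip()
        modes = (connector / "modes").read_text().split()
        print(f"{connector.name}: {status}")
        for mode in dict.fromkeys(modes):  # de-duplicate, keep order
            print(f"  {mode}")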

  • TurboWafflz@lemmy.world · 73 points · 9 months ago

    I don’t understand why any hardware uses HDMI anymore anyway. What does it have that DisplayPort doesn’t?

    • Dudewitbow@lemmy.zip · 68 points · edited · 9 months ago

      The HDMI Forum is made up of companies that own the home theatre environment (mainly movie companies and television makers), who put DRM on HDMI to make it harder to illegally copy content like movies, so they will always want to be anti open source because that’s what streaming services/movie businesses demand. It’s why, for example, mobile devices have Widevine levels: those levels basically determine how “unlocked” the device is, and services will refuse to offer full functionality to unlocked devices because of it, be it audio or video.

      Members of VESA, which controls the DisplayPort standard, are generally computer companies that are mostly not in the business of media, so they value specs over DRM when making changes. One example use case is that DisplayPort allows for daisy-chaining displays.

      • n3m37h@lemmy.dbzer0.com · 12 points · 9 months ago

        I don’t know a single person who has ever used HDMI to steal copyrighted content. Seriously, who would rip a 2 hr movie by capturing it in real time vs the 10 min it takes to rip a movie digitally?

        Like shit ya got CAM, WebRIP, BRRIP and SCENE. I doubt HDMI was used in any of these scenarios.

        • Dudewitbow@lemmy.zip · 9 points · 9 months ago

          Technically speaking, every gamer who uses a capture card to get around the PlayStation’s recording block is an example: some games have an explicit mode that disables the built-in recording when a cutscene is active, and capturing over HDMI bypasses that.

    • MiltownClowns@lemmy.world · 58 points · 9 months ago

      Decades of being the standard in a/v. That’s like asking, why don’t we get rid of gas stations and just install electric chargers? Well, everybody’s got gas powered cars.

      • TurboWafflz@lemmy.world · 20 points · 9 months ago

        AV gear, sure, since it sticks around longer, but computers? When was the last time you saw a high-end GPU with VGA or DVI? And they already have mostly DisplayPort, with just one or two HDMI ports.

        • MiltownClowns@lemmy.world · 22 points · edited · 9 months ago

          Well, I wasn’t referring to that ecosystem. That ecosystem is already on display port. The reason HDMI is so prevalent is because it’s the standard in audio-visual equipment. Why would I talk about computer equipment when it’s not the standard there?

          The point still stands. Everybody has equipment that has HDMI, and to phase out that standard in equipment going forward is phasing out equipment people already own.

      • TimeSquirrel@kbin.social · 9 points · 9 months ago

        HDMI only had about four good years to itself before DisplayPort showed up. In contrast, the RCA port stuck around for damn near 100 years.

    • Flaky@iusearchlinux.fyi · 22 points · 9 months ago

      Probably a lot more hardware using HDMI than DisplayPort? Just throwing a guess, tbh.

      That being said, I might consider looking towards DisplayPort when I can get a new monitor…

    • virr@lemmy.world · 9 points · 9 months ago

      CEC (technically I think DisplayPort could support it, but it generally isn’t implemented) and Ethernet up to 100 Mbps.
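
      For what it’s worth, CEC is scriptable on Linux too. A rough sketch that shells out to libcec’s cec-client (assuming the binary is installed and a CEC-capable adapter or GPU is present; logical address 0 is the TV):

          import subprocess

          def cec(command: str) -> None:
              # -s: read a single command from stdin and exit; -d 1: log errors only
              subprocess.run(["cec-client", "-s", "-d", "1"],
                             input=command.encode(), check=True)

          cec("on 0")        # power the TV on over HDMI-CEC
          # cec("standby 0") # ...or put it back into standby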

    • narc0tic_bird@lemm.ee · 8 points · 9 months ago

      Feature-wise probably next to nothing, and it’s usually behind one or two generations in terms of bandwidth. HDMI is often the only port available on TVs though, so GPU makers likely can’t afford to just leave it out.

      • Grass@sh.itjust.works · 9 points · edited · 9 months ago

        They should anyway. New TVs are all smart these days, and the dumb ones are built like it’s two decades ago. At this point we’re better off with a PC monitor and separate speakers; built-in speakers are shit seemingly as a requirement. I use a video port switch for extra inputs so I don’t have to dig through the on-screen menus or run out of built-in ports.

      • Hyperreality@kbin.social · 2 points · edited · 9 months ago

        Yep. Very common.

        A lot of people use their PC like a console or media server, i.e. to watch/play stuff from their bed or couch.

      • SuperIce@lemmy.world · 3 points · 9 months ago

        Your info is outdated. DP 2.0 is 80 Gbps and can do 4K@240Hz without Display Stream Compression. It can do up to 16K@60Hz using DSC.
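
        Rough numbers back that up. A back-of-the-envelope sketch that ignores blanking intervals and link-layer encoding overhead, so real requirements are somewhat higher than these raw figures:

            def raw_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
                # Uncompressed pixel data rate, ignoring blanking and encoding overhead.
                return width * height * hz * bits_per_pixel / 1e9

            links = {"HDMI 2.1 FRL": 48.0, "DP 2.0 UHBR20": 80.0}  # max link rate, Gbps
            modes = [("4K@120, 10-bit RGB", 3840, 2160, 120, 30),
                     ("4K@240, 10-bit RGB", 3840, 2160, 240, 30)]

            for name, w, h, hz, bpp in modes:
                need = raw_gbps(w, h, hz, bpp)
                verdicts = ", ".join(f"{link}: {'fits' if need < cap else 'needs DSC'}"
                                     for link, cap in links.items())
                print(f"{name}: ~{need:.1f} Gbps raw ({verdicts})")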