many people seem to be excited about nVidia's new line of GPUs, which is reasonable, since at CES they really made it seem like these new bois are insane for their price.

Jensen (the CEO guy) said that with the power of AI, the 5070, at sub-$600, is in the same class as the 4090, which sits above the $1,500 price point.

Here's my idea: they talk a lot about upscaling, generating frames and pixels, and so on. I think what they mean by the two having similar performance is that the 4090 with no AI upscaling achieves roughly the same frame rate as the 5070 with DLSS frame generation and whatever else (rough math sketched below).

So yes, for pure “gaming” performance, in games that support it, the two GPUs will deliver similar frame rates. But there will be artifacts.
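Just to make that concrete, here's some back-of-the-envelope math for how multi frame generation scales the displayed frame rate (the fps numbers are made up for illustration, not benchmarks):

```python
# Rough sketch: multi frame generation inserting N AI frames per rendered frame.
# The fps values below are illustrative assumptions, not benchmark numbers.

def displayed_fps(rendered_fps: float, ai_frames_per_rendered: int = 3) -> float:
    """Frame rate shown on screen when frame generation is enabled."""
    return rendered_fps * (1 + ai_frames_per_rendered)

# Hypothetical: a 5070 rendering ~30 fps natively would display ~120 fps with
# 4x frame gen -- the kind of number a 4090 might hit rendering every frame.
print(displayed_fps(30))  # 120.0
```

Same number on a chart, very different latency and artifact behaviour.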

For ANYTHING besides these “gaming” use cases, it will probably be closer to the 4080 or whatever (idk GPU naming…).

So if you care about inference, Blender, or literally anything non-gaming: you probably shouldn't care about this.

i'm totally up for counterarguments. maybe i'm missing something here, maybe i'm being a dumdum <3

imma wait for AMD to announce their stuff and just get the top one, for the open drivers. not an nvidia person myself, but their research seems spicy. currently still slogging along with a 1060 6GB

  • AmazingAwesomator@lemmy.world

the fine print on their comparison charts said the cards were not tested equally; they just used different benchmarking conditions for each card to make a first-party slide. you are absolutely right to call out their BS <3

always wait for 3rd party benchmarks. if you are looking for accurate ones, Gamers Nexus and Hardware Unboxed have extremely good standards for benchmarking (YouTube)

  • .Donuts@lemmy.world

I was ready to do some due diligence, but the specs don't lie: the 5070 is lower in all the specs that matter, like CUDA cores, shader cores, Tensor cores, VRAM, and even base clock speed.

    There might be some improved use cases because of more modern architecture and offloading certain tasks to a powerful CPU, but it’s looking bleak, yeah.
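To put rough numbers on that, here's a quick theoretical-FP32-throughput sketch (cores × 2 FLOPs per clock × boost clock); the core counts and clocks are the roughly announced figures, so treat them as approximate rather than gospel:

    ```python
    # Theoretical FP32 throughput: cores * 2 FLOPs/clock (FMA) * boost clock in GHz.
    # Core counts and clocks are approximate announced figures, not measurements.

    def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
        return cuda_cores * 2 * boost_ghz / 1000

    print(f"RTX 4090: ~{fp32_tflops(16384, 2.52):.1f} TFLOPS")  # ~82.6
    print(f"RTX 5070: ~{fp32_tflops(6144, 2.51):.1f} TFLOPS")   # ~30.8
    ```

    That's roughly a 2.7x gap in raw shader throughput, which is why the “5070 = 4090” claim only works with frame generation doing the heavy lifting.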

    Minor pet peeve: it’s NVIDIA, full caps.

    • deranger@sh.itjust.works

      Regarding your pet peeve, when was the change? I always want to write it as nVidia too, or maybe now Nvidia. Was that something back from the early GeForce days or am I just imagining things?

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com

> CUDA cores, Shader cores, Tensor cores

You should never compare those across architectures. Just like CPUs, GPUs can do more or less per clock per core. Within an architecture you can use them to get an idea, but across architectures it's apples to oranges.

e.g. the GTX 680 had 3x the cores of the GTX 580, but performed only 2x as fast at best, closer to 1.5x in practice.
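A tiny sketch of that point using the numbers above (core counts are 1536 vs 512; the 1.5-2x speedup figures are the rough ones quoted here, not measurements):

      ```python
      # Implied per-core throughput of Kepler (GTX 680) vs Fermi (GTX 580),
      # given 3x the cores but only ~1.5-2x the observed performance.
      cores_680, cores_580 = 1536, 512
      for speedup in (1.5, 2.0):
          per_core = speedup / (cores_680 / cores_580)
          print(f"{speedup}x overall -> ~{per_core:.2f}x throughput per core vs Fermi")
      # roughly 0.50x-0.67x per core: 3x the cores, nowhere near 3x the performance.
      ```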

  • Vik@lemmy.world

    I feel like they’ve been doing this with product releases since Ampere

  • noride@lemm.ee

    I will wait for more in-depth reviews of DLSS4 before I make my choice, but from what I’ve seen thus far, I think I may finally replace my trusty 1080. The 1080 was my first Nvidia card and I have always assumed I’d switch back, but I am cautiously optimistic about their 5 series changes.

This is only their first iteration moving from a convolutional neural net to a transformer model for frame-gen/upscaling, and I think it is already significantly improved. There is still likely a lot of headroom for improving the model further to reduce artifacts and blurring. I also like the idea of improving the performance of existing hardware through software, and personally respect that they made many of the new features available to older generation cards as well.

    We’ll see though… AMD may clap back.

  • Boomkop3@reddthat.com

    I mostly noticed the 5070 matched my overclocked 3080 in performance. I won’t need to upgrade anytime soon