many people seem to be excited about Nvidia's new line of GPUs, which is reasonable, since at CES they really made it seem like these new bois are insane for their price.

Jensen (the CEO guy) said that with the power of AI, the 5070, priced under $600, is in the same class as the 4090, which sits at a $1500+ price point.

Here's my idea: they talk a lot about upscaling, generating frames and pixels and so on. I think what they mean by the two having similar performance is that the 4090 with no AI upscaling and such achieves similar frame rates as the 5070 with DLSS and whatever else turned on.
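To make that reading concrete, here's a rough back-of-the-envelope sketch. All the frame rates below are made up for illustration; the only real assumption is that multi frame generation inserts several AI-generated frames per rendered frame, which multiplies the *reported* FPS:

```python
# Hypothetical illustration of how frame generation can make a slower card
# "match" a faster one in reported FPS. All numbers are invented.

def effective_fps(native_fps: float, generated_per_rendered: int) -> float:
    """Reported FPS when each rendered frame is followed by N generated frames."""
    return native_fps * (1 + generated_per_rendered)

fps_4090_native = 80.0   # assumed: 4090 running without any DLSS
fps_5070_native = 20.0   # assumed: 5070 raw rendering, same scene

# With 3 generated frames per rendered one (4x frame generation),
# the 5070's reported number catches up to the 4090's native number:
print(effective_fps(fps_5070_native, 3))  # 80.0
```

The catch, of course, is that only one in four of those frames was actually rendered, which is where the artifacts and latency questions come in.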

So yes, for pure "gaming" performance, in games that support it, the GPU will deliver similar performance. But there will be artifacts.

For ANYTHING besides these “gaming” usecases, it will probably be closer to the 4080 or whatever (idk GPU naming…).

So if you care about inference, blender or literally anything not-gaming: you probably shouldn’t care about this.

i’m totally up for counter arguments. maybe i’m missing something here, maybe i’m being a dumdum <3

imma wait for amd to announce their stuff and just get the top one, for the open drivers. not an nvidia person myself, but their research seems spicy. currently still slogging along with a 1060 6GB

  • noride@lemm.ee
    1 day ago

    I will wait for more in-depth reviews of DLSS4 before I make my choice, but from what I’ve seen thus far, I think I may finally replace my trusty 1080. The 1080 was my first Nvidia card and I have always assumed I’d switch back, but I am cautiously optimistic about their 5 series changes.

    This is only their first iteration moving from their older neural net to the new AI model for frame-gen/upscaling, and I think it is already significantly improved. There is still likely a lot of headroom for improving the model even further to reduce artifacts and blurring. I also like the idea of improving the performance of existing hardware through software, and personally respect that they made many of the new features available to older generation cards as well.

    We’ll see though… AMD may clap back.