• MudMan@fedia.io · 3 months ago

    Yep. The thing is, even if you’re on high-end hardware doing offline CGI, you’re using these techniques for denoising. If you’re doing academic research, you’re probably upscaling with machine learning.

    People get stuck on the “AI” nonsense, but ultimately you need some form of upscaling and denoising to render a certain tier of visuals. You want the highest-quality version of that you can fit in your budgeted frame time. If that uses machine learning, great. If it doesn’t, also great. It’s all tensor math anyway; it’s about using your GPU compute in the most efficient way you can.
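To illustrate the “it’s all tensor math” point: here’s a minimal, hypothetical sketch (not any real renderer’s code) of non-ML upscaling and denoising as plain array operations, using NumPy. An ML-based version (DLSS, OptiX denoiser, etc.) replaces these fixed kernels with learned ones, but the data flow is the same kind of tensor work.

```python
import numpy as np

def upscale_nearest(img: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour upscale: pure tensor indexing, no ML involved."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

def denoise_box(img: np.ndarray, k: int = 3) -> np.ndarray:
    """Naive k*k box-filter denoise: average each pixel's neighbourhood."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

frame = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "render"
big = upscale_nearest(frame, 2)   # 4x4 -> 8x8
smooth = denoise_box(frame)       # same shape, locally averaged
print(big.shape, smooth.shape)    # (8, 8) (4, 4)
```

The frame-time trade-off the comment describes is exactly the choice between cheap fixed kernels like these and heavier learned ones that produce better images per millisecond of GPU time.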