For years I’ve had a dream of building a rack-mounted PC capable of splitting its resources to host multiple GPU-intensive VMs:

  • a few gaming VMs
  • a VM for work that can run Davinci Resolve and Blender renders
  • an LLM server
  • a Stable Diffusion server
  • a media server

Just to name a few possibilities…

Every time I’ve looked into it, it seemed like the technology just wasn’t there yet. I remember Linus TT taking a shot at it a few years ago, but in the end he suggested the technology (for non-commercial users) just wasn’t in a comfortable spot yet.

So how far off are we? Obviously the AI-focused companies make it work, but what options exist for us self-hosters who might also want to drive multiple physical displays in addition to web-GUI LLM servers? And without forking out crazy money for GPU virtualization software licenses?
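For what it’s worth, on Linux the piece of this that’s visible to us is the kernel’s mediated device (mdev) interface, which is what the vendor vGPU stacks (NVIDIA vGPU, Intel GVT-g) plug into. A rough sketch of how you could check what a card exposes from sysfs, assuming a vGPU-capable driver is already loaded (the PCI address below is just a placeholder for your own GPU):

    #!/usr/bin/env python3
    # Rough sketch: list the vGPU (mediated device) types a GPU exposes via
    # the kernel's mdev interface. Assumes a vGPU-capable driver is loaded;
    # the PCI address is a placeholder for your own card.
    from pathlib import Path

    GPU_PCI_ADDR = "0000:01:00.0"  # placeholder: substitute your GPU's address

    def list_vgpu_types(pci_addr: str) -> None:
        types_dir = Path(f"/sys/bus/pci/devices/{pci_addr}/mdev_supported_types")
        if not types_dir.is_dir():
            print("No mdev_supported_types: driver missing or GPU not vGPU-capable")
            return
        for t in sorted(types_dir.iterdir()):
            name_file = t / "name"
            name = name_file.read_text().strip() if name_file.exists() else t.name
            avail = (t / "available_instances").read_text().strip()
            print(f"{t.name}: {name} (available instances: {avail})")

    if __name__ == "__main__":
        list_vgpu_types(GPU_PCI_ADDR)

Creating an instance is then (roughly) writing a fresh UUID into that type’s create node, after which libvirt/QEMU can attach it to a VM; the licensing pain is on the driver side, not this plumbing.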

  • brownmustardminion@lemmy.ml (OP) · 5 months ago

    Have you tried, or do you know anything about, using the DisplayPort outputs on the GPU while virtualizing, either in lieu of or in tandem with streaming the displays?

    • Decipher0771@lemmy.ca · 5 months ago

      No, but I think you’d run into some problems. Only the host has access to the physical DisplayPort outputs; all the vGPUs get virtual displays, and I don’t think there’s a way to make them use the physical outputs.
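      You can actually see that split from the host side: the physical connectors stay bound to the host’s DRM driver, so a quick check like this (rough sketch; connector names like card0-DP-1 vary per system) will only ever show them owned by the host:

        #!/usr/bin/env python3
        # Rough sketch: list the physical display connectors owned by the
        # host's DRM driver. With vGPU these stay on the host; guest vGPUs
        # only get virtual displays. Connector names vary per system.
        from pathlib import Path

        for status_file in sorted(Path("/sys/class/drm").glob("card*-*/status")):
            connector = status_file.parent.name       # e.g. card0-DP-1
            status = status_file.read_text().strip()  # "connected" / "disconnected"
            print(f"{connector}: {status}")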