ARNiM,

Not possible AFAIK, and even if it were, the added network latency would degrade performance. IMO it's not feasible and not the best way to leverage your GPU's horsepower.

You'll need to keep the transcoding on the storage server; the rest (the viewer, manager, etc.) you could move to the Turing Pi 2.

thisisawayoflife OP,

Transcoding on download seems like the easiest use case: I could use Tdarr and have one node with the GPU. But for apps like Immich that use the GPU for both transcoding (RAW to JPEG?) and ML (facial recognition), I'm guessing the container will have to run on the hardware where the GPU is, which means Plex and Jellyfin will also have to follow the GPU (see the Compose sketch below).

I’ve definitely thought about moving the GPU to a dedicated mini-ITX box. Wonder if I could find something rack-friendly…
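
For reference, the way I'd expect to pin a GPU-dependent container to the host that actually has the card is a device reservation in its Compose definition; anything without that reservation is free to live on the Turing Pi nodes. A minimal sketch, assuming Docker with the NVIDIA Container Toolkit installed on the GPU host (the Jellyfin image is just an example here, and the media path is hypothetical):

```yaml
# Minimal sketch, assuming Docker + NVIDIA Container Toolkit on the GPU host.
# The service name, image, and media path are placeholders for whatever
# actually needs the card (Jellyfin, a Tdarr node, the Immich ML container, etc.).
services:
  jellyfin:
    image: jellyfin/jellyfin:latest
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia      # expose the NVIDIA GPU to this container
              count: 1
              capabilities: [gpu]
    volumes:
      - /srv/media:/media:ro      # hypothetical media path on the storage server
```

Compose will error out at startup if the host can't provide the requested device, which keeps the GPU dependency explicit; the same reservation block would apply to a Tdarr node or the Immich machine-learning container.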
