y0shi

joined 1 week ago
[–] y0shi@lemm.ee 2 points 6 minutes ago

This sounds like something I’d consider for my homelab. Do you mind elaborating on the pipeline? What does pruning watched content look like? Or do you keep it forever?

[–] y0shi@lemm.ee 2 points 3 days ago

That sounds like a great way of leveraging existing infrastructure! I host Plex together with other services on a server with an Intel CPU capable of hardware transcoding. I’m quite sure I’d get much better performance with the GPU machine; I might end up following this path!

[–] y0shi@lemm.ee 3 points 3 days ago (3 children)

I’ve got an old gaming PC with a decent GPU lying around and I’ve thought of doing that (I currently use it for Linux gaming and GPU-related tasks like photo editing, etc.). However, I’m currently stuck using LLMs on demand locally with Ollama. The energy cost of having it powered on all the time for on-demand queries seems a bit overkill to me…
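
For context, an on-demand query against a local Ollama instance is just a small HTTP call. Rough sketch below, assuming a stock install listening on localhost:11434 and a model that’s already been pulled (the model name is just an example):

```python
import requests  # assumes the requests package is installed

# Default Ollama REST endpoint for single-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"


def ask(prompt: str, model: str = "llama3") -> str:
    """Send one non-streaming generation request to the local Ollama API."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    # With stream=False the full completion comes back in one JSON object.
    return resp.json()["response"]


if __name__ == "__main__":
    print(ask("Summarise why idle power draw matters for a homelab GPU box."))
```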