this post was submitted on 27 Jun 2023
14 points (100.0% liked)
Gaming
Sure, the card would either need to offer great value or have an accessible price. It probably also depends on how "heavy" the tasks get. But seeing e.g. OpenAI struggling to keep up with requests, it may be useful to decentralize the processing by running the model locally on the user's PC.
Maybe my statement was a bit confusing. What I meant was that, in a transition phase, developers could choose to support a dedicated accelerator card so everything runs locally and offline, and for people who don't have or don't want such a card, they could offer a cloud-based subscription model where the processing is done on remote servers.
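Roughly what I picture (just a hypothetical Python sketch, none of these class or function names are a real game or vendor API): the game picks a local backend if an accelerator card is detected, and otherwise falls back to the developer's cloud endpoint, while the game code itself stays the same.

```python
# Hypothetical sketch: prefer a local accelerator backend if one is present,
# otherwise fall back to a cloud subscription endpoint.
# LocalAcceleratorBackend, CloudBackend and detect_accelerator are made up
# for illustration, not any real SDK.

class LocalAcceleratorBackend:
    """Runs the game's dialogue model on a dedicated accelerator card, offline."""
    def generate(self, prompt: str) -> str:
        # In a real game this would call into the card's inference runtime.
        return f"[local] reply to: {prompt}"


class CloudBackend:
    """Sends the prompt to the developer's servers (subscription model)."""
    def __init__(self, api_url: str, api_key: str):
        self.api_url = api_url
        self.api_key = api_key

    def generate(self, prompt: str) -> str:
        # In a real game this would be an HTTPS request to the remote service.
        return f"[cloud] reply to: {prompt}"


def detect_accelerator() -> bool:
    # Placeholder for whatever hardware check the engine would actually do.
    return False


def make_backend():
    if detect_accelerator():
        return LocalAcceleratorBackend()
    return CloudBackend(api_url="https://example.invalid/npc", api_key="...")


backend = make_backend()
print(backend.generate("Where can I find the blacksmith?"))
```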
Yeah, that makes more sense.
By the way, Microsoft just announced that they are trying to do the opposite and move users to thin clients, with Windows itself becoming a cloud service.
That said, much less processing power is required to run a model than to train it. Games also wouldn't require big models, since they only need to know the game's lore, not all of the world's knowledge.
There are smaller, more limited LLMs out there that can already run on a phone.
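For example, a small quantized model can be run locally with something like llama-cpp-python. A minimal sketch, assuming the library is installed and you have some quantized GGUF model on disk (the file name below is just a placeholder):

```python
# Minimal local-inference sketch using llama-cpp-python
# (pip install llama-cpp-python).
# "game-lore-7b.Q4_K_M.gguf" is a placeholder for whatever small
# quantized model you actually use.
from llama_cpp import Llama

llm = Llama(model_path="game-lore-7b.Q4_K_M.gguf", n_ctx=2048)

out = llm(
    "You are the blacksmith NPC. The player asks: 'Can you repair my sword?'\nAnswer:",
    max_tokens=64,
    stop=["\n"],
)
print(out["choices"][0]["text"])
```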