I’m still really interested in which devices OpenAI/users are running the new open-weight models on… the 20B model is certainly not gonna run on an iPhone, and I don’t think a 4090 (or better) counts as a “consumer GPU” lol
One thing I missed about the gpt-oss launch is that Ollama quietly launched a new “Ollama Turbo” subscription. interesting that a company focused on local AI is offering cloud inference, but I’m curious what you actually get for the $20/mo. hope someone covers this!
lol wish me luck, I’m remotely SSHing into my Mac mini at my house over super slow WiFi from another country to try to run the gpt-oss 20B model on an M2 Mac mini with only 8GB of RAM
what a confusing sentence to type