also I love how multiple free Wi-Fi networks I’ve been on make you log in with a Windows Live or Facebook account 😭
I don’t use either of these lol
Interesting, Europe 🤔
I’m still really interested in which devices OpenAI/users are running the new open-weight models on… the 20B model is certainly not gonna run on an iPhone, and I don’t think a 4090 (or better) counts as a “consumer GPU” lol
One thing I missed about this gpt-oss launch is that Ollama quietly launched a new “Ollama Turbo” subscription. Interesting that a company focused on local AI is offering cloud inference, but I’m curious what you actually get for the $20/mo. Hope someone covers this!
lol wish me luck, i’m remotely SSHing over super slow WiFi from another country into the M2 Mac mini at my house (only 8GB of RAM) to try to run the gpt-oss 20B model on it
what a confusing sentence to type
rip, just from looking at the parameter sizes of the new OSS models from OpenAI, I don’t think I’ll be able to get either one to run on my hardware 😭
(They have a 20B model & a 120B one)
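For anyone else eyeballing whether these fit: a rough back-of-envelope sketch. The ~21B/~117B total parameter counts and the ~4-bit (MXFP4) quantization are my assumptions from the launch coverage, and this ignores KV cache and activation overhead, so treat the numbers as lower bounds.

```python
# Back-of-envelope: weight memory ≈ params × bits-per-param / 8.
# Assumed totals: gpt-oss-20b ≈ 21B params, gpt-oss-120b ≈ 117B params.
# gpt-oss reportedly ships ~4-bit (MXFP4) weights; fp16 shown for comparison.
def weight_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate weight memory in GB (ignores KV cache & activations)."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

for name, b in [("gpt-oss-20b", 21), ("gpt-oss-120b", 117)]:
    print(f"{name}: ~{weight_gb(b, 4):.1f} GB at 4-bit, "
          f"~{weight_gb(b, 16):.1f} GB at fp16")
```

Even at 4-bit, the 20B model’s weights alone come out around 10 GB, which already overflows an 8GB Mac mini’s unified memory before you count the KV cache.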