Observations from a video going viral about AI/LLM-related tech:
- Many people don't understand that 'AI' and 'NPU' chips that can barely run tiny models can't accelerate giant 10+ GB LLMs
- Many people have no clue you can run AI models offline / self-hosted
OpenAI's nightmare: DeepSeek R1 on a Raspberry Pi 5 ;)
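For anyone who hasn't tried it: a small distilled R1 variant runs locally with one tool. A minimal sketch, assuming Ollama is installed and that the `deepseek-r1:1.5b` tag is the size you want (the full R1 is far too large for a Pi); the guard makes it safe to run on machines without Ollama:

```shell
# Run a small distilled DeepSeek R1 model fully offline via Ollama.
# Guarded so this is harmless on machines where Ollama isn't installed.
if command -v ollama >/dev/null 2>&1; then
  ollama pull deepseek-r1:1.5b          # downloaded once, then cached locally
  ollama run deepseek-r1:1.5b "Say hello in one sentence."
else
  echo "ollama not installed; see https://ollama.com"
fi
```

Once the model is pulled, no network connection is needed for inference at all.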
The irony that 'Open'AI is so closed that a Chinese open model can come in and deflate the OpenAI bubble in less than a week...
If anyone else has a newer-model Raspberry Pi NVMe SSD, can you check if it has a Biwin storage controller? (Originals had Samsung.) I want to verify whether TRIM works or not...
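A quick way to check both things from a shell on the Pi; a sketch assuming the drive shows up as `nvme0n1` (adjust the device name for your system):

```shell
DEV=nvme0n1   # assumption: your NVMe drive's block device name

# Controller/model string as reported by the kernel (look for Biwin vs Samsung)
cat /sys/block/$DEV/device/model 2>/dev/null || echo "no $DEV present"

# Non-zero DISC-GRAN / DISC-MAX columns mean the kernel sees TRIM support
lsblk --discard /dev/$DEV 2>/dev/null || true

# Actually issue a trim against the root filesystem; errors if unsupported
sudo -n fstrim -v / 2>/dev/null || echo "fstrim failed (TRIM may be unsupported or sudo needs a password)"
```

`lsblk --discard` only reports what the drive advertises; the `fstrim` run is the real test of whether TRIM actually goes through.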