Good morning, Nostr. Who's running local LLMs? What are the best models for coding that can run at home on a beefy PC? In 2026, I want to dig into local LLMs more and lean less on Claude and Gemini. I know I can use Maple for more private AI, but I prefer running my own model. I also like that there are no restrictions on models run locally. I know hardware is the bottleneck here; hopefully these things keep getting more efficient.
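For anyone wondering what "running it locally" actually looks like for coding help, here's a minimal sketch that asks a local model a question over Ollama's REST chat API. The model tag (qwen2.5-coder) and the default port are assumptions; swap in whatever you've pulled.

```ts
// Minimal sketch: ask a locally hosted model a coding question via Ollama's REST API.
// Assumes Ollama is running on its default port and that a coding model has been pulled
// locally (qwen2.5-coder is just an example tag).

async function askLocalModel(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "qwen2.5-coder", // assumption: any locally pulled model tag works here
      messages: [{ role: "user", content: prompt }],
      stream: false, // return a single JSON object instead of a token stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.message.content; // non-streaming /api/chat replies put the text here
}

askLocalModel("Explain what this borrow checker error means: ...")
  .then(console.log)
  .catch(console.error);
```

Nothing leaves your machine, which is the whole appeal.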
I grew up on AOL Instant Messenger, and I'm tempted to vibe code a Nostr app that's identical: same features, same door opening and closing sounds, lol. Has anyone done this yet? It'd be so cool to use it for private messaging. I haven't kept up; how far have private DMs come on Nostr lately? I know we have a NIP for statuses, which was a signature feature of AIM.
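On the statuses point, that's NIP-38 (kind 30315), which is basically the AIM away message. Here's a minimal sketch of publishing one, assuming a recent nostr-tools (v2-style API); the relay URL is a placeholder and key handling is simplified.

```ts
// Minimal sketch of a NIP-38 "status" event (kind 30315), the AIM-style away message.
// Assumes nostr-tools v2; relay URL and key handling are placeholders for illustration.

import { generateSecretKey, finalizeEvent } from "nostr-tools/pure";
import { Relay } from "nostr-tools/relay";

async function setStatus(status: string) {
  const sk = generateSecretKey(); // a real client would use the user's existing key

  // Kind 30315 is addressable: a new event with the same "d" tag ("general")
  // replaces the previous status, just like updating your away message.
  const event = finalizeEvent(
    {
      kind: 30315,
      created_at: Math.floor(Date.now() / 1000),
      tags: [["d", "general"]],
      content: status,
    },
    sk
  );

  // Placeholder relay; in Node you may need to supply a WebSocket implementation.
  const relay = await Relay.connect("wss://relay.damus.io");
  await relay.publish(event);
  relay.close();
}

setStatus("brb, dialing up").catch(console.error);
```

The door sounds would just be client-side flair on top of events like this.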