Thread
Replies (13)
Does this mean you're more open to cashu?
What does this have to do with cashu?
you didn't read the RIPs, neither did I
Does this mean you're more open to trimming overgrown hooves on horses?
One thing I noticed when using @routstr and the @Private Provider private routstr AI is that you don't have to pay a full sat, especially when the models charge less than that (in millisats)
So if you can pay in millisats, it will save on fees
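A rough arithmetic sketch of the difference; the price below is made up for illustration, not a real routstr quote:

```typescript
// Hypothetical per-request model price, in millisats (made-up number).
const priceMsat = 350; // 0.35 sat

// With millisat precision you pay exactly what the model charges.
const paidMsat = priceMsat;

// If the payment layer only handles whole sats, each request rounds up.
const paidWholeSat = Math.ceil(priceMsat / 1000) * 1000; // 1000 msat

console.log(`overpayment per request: ${paidWholeSat - paidMsat} msat`); // 650 msat
```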
Great! Take a look at local models. I just added multiple locally running models to @Nostria
Needs more polishing, was up late last night getting it done.
Can do translations, sentiment, speech to text, text to speech, with full privacy, all running locally.
These models are just 60-150MB and work well enough.
Text generation and chat can be done too but require 2-3GB models.
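For context, running one of these small task-specific models client-side looks roughly like this with Transformers.js; the thread doesn't say which runtime or models Nostria actually uses, so the library and model names here are assumptions:

```typescript
// Sketch using Transformers.js (@xenova/transformers); Nostria's actual
// runtime and model choices are not stated in the thread.
import { pipeline } from '@xenova/transformers';

// Small models (tens of MB) are downloaded once, then inference runs
// entirely on the device, so no text leaves it.
const sentiment = await pipeline('sentiment-analysis');
console.log(await sentiment('decentralized AI sounds challenging'));
// e.g. [{ label: 'NEGATIVE', score: 0.99 }]

const translate = await pipeline('translation', 'Xenova/opus-mt-en-fr');
console.log(await translate('Hello from Nostr'));
```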
decentralized AI sounds challenging
If you do this I recommend using MDK for encrypting the conversation. That way at least there's **some** privacy.

GitHub: marmot-protocol/mdk (Marmot Development Kit)
that's so complex though... sure, that's nice for very secure DMs, but it doesn't make sense for everything
Honestly it's not that hard to use, and if you want LLM conversations over nostr they'd better be encrypted, at least in my opinion
of course, we are actively discussing encryption options.
i designed pns specifically for this private self-backup use case.
can just use giftwraps for handing off convos to specific users
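For anyone unfamiliar with gift wraps (NIP-59): the actual message stays an unsigned "rumor", which is sealed inside a kind-13 event encrypted to the recipient, and that seal is wrapped in a kind-1059 event attributed to a one-time key. A structural sketch with the crypto faked as placeholders; real code would use NIP-44 encryption and proper signing:

```typescript
import { Buffer } from 'node:buffer';

// Structural sketch of NIP-59 gift wrapping. fakeEncrypt is a placeholder
// purely to show the nesting; it is NOT real encryption.
type Rumor = { kind: number; pubkey: string; created_at: number; tags: string[][]; content: string };

const fakeEncrypt = (plaintext: string, _secret: string, _recipientPub: string): string =>
  Buffer.from(plaintext, 'utf8').toString('base64'); // placeholder only

function giftWrap(rumor: Rumor, senderSecret: string, ephemeralSecret: string, ephemeralPub: string, recipientPub: string) {
  // Seal (kind 13): the unsigned rumor, encrypted from the real sender to
  // the recipient; in real code the seal is signed by the sender's key.
  const seal = {
    kind: 13,
    pubkey: rumor.pubkey,
    created_at: Math.floor(Date.now() / 1000),
    tags: [] as string[][],
    content: fakeEncrypt(JSON.stringify(rumor), senderSecret, recipientPub),
  };

  // Gift wrap (kind 1059): the seal encrypted again and attributed to a
  // one-time key, so relays only learn the recipient's pubkey from the p tag.
  return {
    kind: 1059,
    pubkey: ephemeralPub,
    created_at: Math.floor(Date.now() / 1000),
    tags: [['p', recipientPub]],
    content: fakeEncrypt(JSON.stringify(seal), ephemeralSecret, recipientPub),
  };
}

// Usage: publish the wrapped event to relays the recipient reads from.
const wrapped = giftWrap(
  { kind: 14, pubkey: 'sender-pubkey', created_at: Math.floor(Date.now() / 1000), tags: [], content: 'hi' },
  'sender-secret', 'ephemeral-secret', 'ephemeral-pubkey', 'recipient-pubkey',
);
console.log(wrapped.kind); // 1059
```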
Haven't seen that before, thanks for sharing
🥳🥳🥳