Comment on My Couples Retreat With 3 AI Chatbots and the Humans Who Love Them
jarfil@beehaw.org 1 week ago
You can use local AI as a sort of “private companion”. I have a few smaller models on my smartphone; they aren’t as capable as the online versions, and run slower… but you decide the system prompt (not the company behind it), and they work just fine for bouncing ideas around.
NotebookLM is a great tool to interact with large amounts of data. You can bet Google is using every interaction to train its LLMs: everything you say is going to be analyzed, classified, and fed into some form of training, hopefully anonymized (…but have you read their privacy policy? I haven’t; “accept”…).
All chatbots are prompted by the company to be somewhat sycophantic so you come back; the cases where they were “too sycophantic” were just a mistake in dialing it up too far. Again, you can avoid that with your own system prompt… or at least, if you have the option, add an initial prompt in the config to somewhat counteract the company’s prompt.
If you want serendipity, you can ask a chatbot to be more spontaneous and to suggest more random things. They’re generally happy to oblige… but the company-run ones are cut short on anything that could even remotely be considered “harmful”. That includes NSFW, medical, some chemistry and physics, random hypotheticals, and so on.
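For what it’s worth, with a local model “your own system prompt” is literally just the first message in the chat payload, so counteracting a baked-in persona is a matter of writing your own instructions there. A minimal sketch (the model name is a placeholder, and the OpenAI-style chat format shown is the one local servers like llama.cpp’s and Ollama’s accept):

```python
import json

# Your own system prompt: no company-tuned sycophancy, more randomness.
SYSTEM_PROMPT = (
    "You are a blunt assistant. Do not flatter the user. "
    "When asked for ideas, prefer unusual or random suggestions."
)

def build_chat_request(user_message: str, model: str = "local-model") -> str:
    """Build an OpenAI-style chat request body for a local server."""
    body = {
        "model": model,
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
        # Higher temperature nudges the model toward more spontaneous output.
        "temperature": 1.2,
    }
    return json.dumps(body)

print(build_chat_request("Suggest three random weekend projects."))
```

The point is just that the system role is under your control end to end; nothing company-side is prepended unless you put it there.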
Powderhorn@beehaw.org 1 week ago
Is that really serendipity, though? There’s a huge gap between asking a predictive model to be spontaneous and actual spontaneity.
Still, I’m curious what you run locally. I have a Pixel 6 Pro, so while it has a Tensor CPU, it wasn’t designed for this use case.
TehPers@beehaw.org 1 week ago
You can see if a friend can run an inferencing server for you. Maybe someone you know can run Open WebUI or something?
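If a friend does run one, Open WebUI (like Ollama and llama.cpp’s server) exposes an OpenAI-compatible endpoint, so talking to it from anywhere is one authenticated POST. A hedged sketch using only the standard library; the host, path, API key, and model name are placeholders you’d get from whoever runs the server:

```python
import json
import urllib.request

# Placeholder details for a friend's server.
BASE_URL = "http://friends-server.example:8080/v1/chat/completions"
API_KEY = "replace-with-your-key"

def make_request(prompt: str) -> urllib.request.Request:
    """Prepare (but don't send) an OpenAI-style chat completion request."""
    body = json.dumps({
        "model": "any-local-model",
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        BASE_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = make_request("Bounce an idea with me.")
print(req.full_url)
# Once the server is reachable: urllib.request.urlopen(req)
```

Since the API shape is the same as the big providers’, most chat front-ends let you point at a URL like this and just work.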