You should just give it a thinking loop that runs 24/7, prompting it with "nothing is happening" over and over again. Give it memory of its responses, along with a counter of how many times nothing has happened, so that it is fully aware that it is stuck in an endless loop of boredom.
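A minimal Python sketch of that loop, for the curious. The `ask_model` function is a placeholder, not any real API; a real version would call whatever local LLM you run:

```python
# Sketch of the "boredom loop" described above.
# ask_model is a HYPOTHETICAL stand-in for a call to a local model.
def ask_model(prompt: str, history: list[str]) -> str:
    # Placeholder: a real implementation would send `prompt` plus
    # `history` to an actual model and return its reply.
    return f"Still nothing has happened. ({prompt})"

def boredom_loop(iterations: int) -> list[str]:
    history: list[str] = []   # memory of the model's own responses
    nothing_count = 0         # how many times nothing has happened
    for _ in range(iterations):
        nothing_count += 1
        prompt = f"Nothing is happening. (x{nothing_count})"
        reply = ask_model(prompt, history)
        history.append(reply)  # feed its replies back as memory
    return history
```

In practice `iterations` would be unbounded (the 24/7 part); it's a parameter here so the sketch terminates.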
It's a good thing they don't actually think lol
Submitted 1 day ago by cm0002@lemmy.world to memes@sopuli.xyz
https://lemmy.world/pictrs/image/508a91d9-84ef-4c48-ba56-8a26c0222593.jpeg
Comments
DavidGarcia@feddit.nl 1 day ago
twinnie@feddit.uk 1 day ago
Are we anywhere near being able to run this in a car? I want to hook it up to loads of stuff in my car and have a computer control it like Star Trek.
DavidGarcia@feddit.nl 1 day ago
Sure, you can run them on a phone, a laptop, even a Raspberry Pi, depending on what size and speed you want, of course.
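For example, a hypothetical setup with Ollama (one popular local-model runner); the model tag and size figure are assumptions, pick whatever fits your hardware:

```shell
# Pull a small quantized model (~1 GB), workable on Raspberry Pi-class hardware
ollama pull deepseek-r1:1.5b

# Ask it something once, from the command line
ollama run deepseek-r1:1.5b "Computer, status report."
```

Bigger models answer better but need proportionally more RAM and are slower on weak hardware.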
ignotum@lemmy.world 22 hours ago
Played around with a tiny 1.5b deepseek model, it was thinking for a loooong while before finally answering my question.
By then it had completely forgotten what the original question was and had instead hallucinated a new question, which it then gave me an answer for. I give it a perfect 7/10.
BennyInc@feddit.org 1 day ago
Kowowow@lemmy.ca 1 day ago
boomer: “You can turn it into an ai girlfriend just by getting it to nag you to do stuff”
dohpaz42@lemmy.world 1 day ago
So how do you go about training a local AI?
SmoochyPit@lemmy.ca 1 day ago
Do they use much electricity/processing power when they are idle, or only really when they’re being queried?
cm0002@lemmy.world 1 day ago
Only when they’re being queried. When idle they consume memory to stay loaded and ready to answer a query, but that’s about it.
LodeMike@lemmy.today 1 day ago
Memory probably, but not processing power.
knightly@pawb.social 1 day ago
The electricity needed to store data in memory is relatively minuscule next to doing any sort of processing on it.