Comment on Someone got Gab's AI chatbot to show its instructions
JackGreenEarth@lemm.ee 6 months ago
Yes, but what LLM has a large enough context length for a whole book?
ninjan@lemmy.mildgrim.com 6 months ago
Gemini Ultra will, in developer mode, have a 1 million token context length, so that would fit at least a medium-sized book. No word on what it will support in production mode, though.
JackGreenEarth@lemm.ee 6 months ago
Cool! Are there any other models, even FOSS ones, with a context length longer than 4096 or 8192 tokens?
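(Not from the thread, just a minimal sketch of how one could check this: with Hugging Face's transformers library, a model's configured context window is usually exposed via its config. The model IDs below are illustrative examples, not recommendations from the discussion, and assume the repos are accessible on the Hub.)

```python
# Sketch: inspect the configured context window of open-weight models
# using Hugging Face transformers' AutoConfig. Model IDs are examples only.
from transformers import AutoConfig

model_ids = [
    "mistralai/Mistral-7B-v0.1",          # example open-weight model
    "NousResearch/Yarn-Mistral-7b-128k",  # example long-context variant
]

for model_id in model_ids:
    config = AutoConfig.from_pretrained(model_id)
    # Most decoder-only configs report the trained context window here;
    # fall back gracefully if the attribute is named differently.
    context_len = getattr(config, "max_position_embeddings", "unknown")
    print(f"{model_id}: max_position_embeddings = {context_len}")
```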