Comment on How can I use an LLM to generate a 10k word long coherent story?

xmunk@sh.itjust.works ⁨2⁩ ⁨weeks⁩ ago

LLM generations of that length tend to go off the rails - I think generating it in chunks, where you can try to guide the model back onto the rails, is probably a saner technique.
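A minimal sketch of that chunked approach, assuming a hypothetical `call_llm` function (a stub here - you'd swap in whatever local model or API you're actually using). The idea is to drive generation from an outline and a running summary, so each chunk is re-grounded instead of drifting:

```python
def call_llm(prompt: str) -> str:
    # Placeholder stub: a real implementation would query a model here.
    return f"[generated text for a prompt of {len(prompt)} chars]"

def generate_story(outline: list[str], words_per_chunk: int = 500) -> str:
    """Generate a long story chunk by chunk, re-grounding each chunk
    in the next outline beat plus a summary of what came before,
    to keep the model 'on the rails'."""
    story_parts: list[str] = []
    summary = ""
    for beat in outline:
        prompt = (
            f"Story so far (summary): {summary}\n"
            f"Next plot beat: {beat}\n"
            f"Write about {words_per_chunk} words continuing the story."
        )
        chunk = call_llm(prompt)
        story_parts.append(chunk)
        # Keep the running summary short so the prompt stays manageable.
        summary = call_llm(f"Summarize briefly: {summary} {chunk}")
    return "\n\n".join(story_parts)

outline = ["An inciting incident", "Rising conflict", "Resolution"]
print(generate_story(outline))
```

For a 10k-word target you'd use a longer outline (e.g. 20 beats of ~500 words each) and could have a human review each chunk before continuing.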

There are several open-source LLMs to lean on - but for long generations you’ll need a lot of memory if you’re running one locally.
