I’m interested in automatically generating lengthy, coherent stories of 10,000+ words from a single prompt using an open source local large language model (LLM). I came across the “Awesome-Story-Generation” repository, which lists relevant papers describing promising methods like “Re3: Generating Longer Stories With Recursive Reprompting and Revision”, announced in this Twitter thread from October 2022, and “DOC: Improving Long Story Coherence With Detailed Outline Control”, announced in this Twitter thread from December 2022. However, these papers used GPT-3, and I was hoping to find similar techniques implemented with open source tools that I could run locally. If anyone has experience or knows of resources that could help me achieve long, coherent story generation with an open source LLM, I would greatly appreciate any advice or guidance.
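For context, the kind of pipeline I have in mind is roughly the sketch below: a plan-then-write loop in the spirit of those papers, driven by a local model through Hugging Face transformers. To be clear, this is not the actual Re3 or DOC code; the model checkpoint, prompts, premise, and chapter count are placeholders I made up, and the revision/reranking stages those papers add are omitted here.

```python
# Rough sketch of outline-then-expand story generation with a local model.
# Assumptions: the checkpoint name and prompt wording are placeholders, and the
# Re3/DOC revision and reranking steps are not implemented.
from transformers import pipeline

# Any local instruction-tuned causal LM could go here; this checkpoint is just an example.
generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",
    device_map="auto",
)

def complete(prompt: str, max_new_tokens: int = 600) -> str:
    """Return only the newly generated continuation of the prompt."""
    out = generator(prompt, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.8)
    return out[0]["generated_text"][len(prompt):].strip()

premise = "A lighthouse keeper discovers the light is keeping something out, not guiding ships in."

# 1. Plan: ask for a chapter-by-chapter outline (the DOC idea of writing from a detailed outline).
outline_prompt = (
    f"Premise: {premise}\n"
    "Write a numbered outline of 10 chapters for this story, one sentence per chapter.\n"
)
outline = [line.strip() for line in complete(outline_prompt).split("\n") if line.strip()]

# 2. Draft: expand each outline item in order, re-prompting with a rolling summary of what has
#    already been written so later sections stay consistent (the Re3 recursive reprompting idea).
story_sections, summary = [], "Nothing yet."
for item in outline:
    section = complete(
        f"Premise: {premise}\n"
        f"Summary of the story so far: {summary}\n"
        f"Write the next section of the story, covering this outline point: {item}\n"
        "Section:\n"
    )
    story_sections.append(section)
    # 3. Compress everything written so far for the next iteration. A real setup would summarize
    #    incrementally or in chunks so a 10,000+ word draft doesn't overflow the context window.
    summary = complete(
        "Summarize the following story so far in under 200 words:\n"
        + "\n\n".join(story_sections) + "\n\nSummary:\n",
        max_new_tokens=250,
    )

print("\n\n".join(story_sections))
```

Even if this rough shape is right, I'd still like pointers to existing open source implementations rather than rolling my own from scratch.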
Why do you want to do this? What is your end goal?
If it’s to read a story, there are already more stories in the world than you could hope to read in your entire lifetime. Written by humans, with actual intention behind them, guaranteed to be coherent.
If it’s to create a story, well, you’re not creating anything by having an LLM do it for you.
Deestan@lemmy.world 3 weeks ago
This concerns me:
An LLM excels at making passable derivative work. It does not, by definition, come up with original ideas.
What are you going to do with 100,000+ words of 100% derivative writing where anything potentially original can be summed up in a prompt of a few dozen words?
Will this be published or sold somewhere? Undercutting or crowding out original works?
hisao@ani.social 3 weeks ago
This is a cool way to put it, but I think even just errors and randomness in reproducing source ideas can sometimes count as original ideas. That said, I also don’t think it fully captures the whole range of mechanisms by which humans come up with original ideas.
Deestan@lemmy.world 3 weeks ago
Randomness can give novel combinations, sure, but we shouldn’t call that an original idea.
As for the various ways humans come up with original ideas, those rest on reflection, reasoning and thought processing. We know that’s not possible for an LLM: while they are complex in their details, the way they work is well defined. They imitate.
BlameThePeacock@lemmy.ca 3 weeks ago
You think Humans aren’t pumping out 100% derivative works all the time?
Like every shitty romance novel published. There are only so many ways a man can woo a woman: they just change the location, randomize the set of actions from a list of things men can do to turn women on, throw in something to harm the relationship, and then come up with a set of names.
Deestan@lemmy.world 3 weeks ago
Don’t worry. I don’t think that.
A big hope I have for AI is that 100% derivative work by humans is now easier to call out. If a rock with a 9V battery could produce it, why should we value it?