I’m interested in automatically generating lengthy, coherent stories of 10,000+ words from a single prompt using an open source local large language model (LLM) on low-spec hardware, such as a laptop without a GPU, with an i5-8250U and 16 GB of DDR4-2400 RAM. I came across the “Awesome-Story-Generation” repository, which lists relevant papers describing promising methods like “Re3: Generating Longer Stories With Recursive Reprompting and Revision”, announced in this Twitter thread from October 2022, and “DOC: Improving Long Story Coherence With Detailed Outline Control”, announced in this Twitter thread from December 2022. However, these papers used GPT-3, and I was hoping to find similar techniques implemented with open source tools that I could run locally. If anyone has experience or knows of resources that could help me achieve long, coherent story generation with an open source LLM on low-spec hardware, I would greatly appreciate any advice or guidance.
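For anyone wondering what the Re3/DOC approach roughly looks like in practice, here is a minimal sketch of the outline-then-expand idea using llama-cpp-python, which can run quantized GGUF models on CPU-only hardware like the laptop described above. The model file, prompts, chunk sizes, and the three-sentence rolling summary are all my own assumptions for illustration, not the papers' actual implementation.

```python
# Sketch of outline-then-expand story generation (in the spirit of Re3/DOC),
# using llama-cpp-python to run a quantized GGUF model on CPU.
# Model path, prompts, and token budgets below are placeholder assumptions.
from llama_cpp import Llama

llm = Llama(
    model_path="mistral-7b-instruct.Q4_K_M.gguf",  # any CPU-friendly quantized model
    n_ctx=4096,    # context window; coherence comes from the outline and rolling summary,
    n_threads=8,   #   not from fitting the whole story into context at once
)

def generate(prompt, max_tokens=512):
    out = llm(prompt, max_tokens=max_tokens, temperature=0.8)
    return out["choices"][0]["text"].strip()

premise = "A lighthouse keeper discovers the light is signalling to something under the sea."

# 1. Plan: turn the premise into a numbered chapter outline (DOC-style detailed outline).
outline = generate(
    f"Write a numbered outline of 10 chapters for a novel with this premise:\n{premise}\n\nOutline:\n",
    max_tokens=600,
)
chapter_items = [line for line in outline.splitlines() if line.strip()]

# 2. Draft: expand each outline item, re-prompting with the premise, the full outline,
#    and a rolling summary of what has been written so far (Re3-style reprompting).
story, summary = [], "Nothing has happened yet."
for item in chapter_items:
    passage = generate(
        f"Premise: {premise}\nFull outline:\n{outline}\n"
        f"Summary of the story so far: {summary}\n"
        f"Write the next chapter, covering: {item}\n\nChapter:\n",
        max_tokens=900,
    )
    story.append(passage)
    # 3. Compress: summarize the new chapter so later prompts stay within the context window.
    summary = generate(
        f"Summarize this chapter in 3 sentences:\n{passage}\n\nSummary:\n",
        max_tokens=150,
    )

print("\n\n".join(story))
```

This omits the revision/reranking passes that Re3 uses to filter bad continuations, but it shows the core loop: plan an outline, expand it chunk by chunk, and keep a compressed memory of what came before.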

  • Julian@lemm.ee · 6 months ago

    You can get a really cool, coherent story of any length you want by writing one or hiring a writer.

    • AlligatorBlizzard@sh.itjust.works · 6 months ago

      I’ve won NaNoWriMo twice and I can confirm that writing your own does not necessarily result in a cool or coherent story. One of the two is likely better than an LLM could come up with, though.