submitted 4 days ago* (last edited 3 days ago) by Womble@lemmy.world to c/localllama@sh.itjust.works

I've recently been writing fiction and using an AI as a critic/editor to help me tighten things up (as I'm not a particularly skilled prose writer myself). So far I've tried two approaches: writing in a basic editor and then either uploading files to a hosted LLM or copy-pasting into a local one, or using PyCharm with AI integration plugins for it.

Neither is particularly satisfactory, and I'm wondering if anyone knows of a good setup for this (preferably open source, but that's not necessary). Integration with at least one of Ollama or OpenRouter would be needed.

Edit: Thanks for the recommendations everyone, lots of things for me to check out when I get the time!

top 11 comments
[-] nibby@sh.itjust.works 6 points 4 days ago

If you're up for learning Emacs, it has several packages for integrating with Ollama, such as ellama. It has worked satisfactorily for me.

[-] Womble@lemmy.world 2 points 4 days ago

I actually already use Emacs, I just find configuring it a complete nightmare. Good to know it's an option though.

[-] tal@lemmy.today 1 points 2 days ago* (last edited 2 days ago)

I installed the emacs ellama package, and I don't think that it required any configuration to use, though I'm not at my computer to check.

[-] brucethemoose@lemmy.world 3 points 3 days ago* (last edited 3 days ago)

Mikupad is incredible:

https://github.com/lmg-anon/mikupad

I think my favorite feature is the 'logprobs' mouseover, i.e. showing the probability of each token as it's generated. It's like a built-in thesaurus, a great way to dial in sampling, and you can regenerate from that point.

Once you learn how instruct formatting works (and how it auto inserts tags), it's easy to maintain some basic formatting yourself and question it about the story.

It's also fast. It can handle 128K context without being too laggy.

I'd recommend the llama.cpp server or TabbyAPI as backends (depending on the model and your setup), though you can use whatever you wish.

I'd recommend exui as well, but seeing how exllamav2 is being deprecated, it's probably not the best idea to use anymore... Another strong recommendation is kobold.cpp (which can use external APIs if you want).

[-] hendrik@palaver.p3x.de 2 points 4 days ago* (last edited 4 days ago)

I'm not sure if this is what you're looking for, but for AI-generated novels there's Plot Bunni. It's made specifically for storywriting: drafting, generating an outline and chapters and then the story, organizing ideas... It has a lot of rough edges, though. I had some very limited success with it, and it's not an editor, but it's there and caters to storywriting.

[-] copacetic@discuss.tchncs.de 2 points 4 days ago

I recently started with Zed. It works with ollama.

Too early for me to give more of an assessment than "it works".

[-] Womble@lemmy.world 1 points 4 days ago

I'll give it a try thanks.

[-] Smokeydope@lemmy.world 1 points 3 days ago* (last edited 3 days ago)

In an ideal world, what exactly would you want an AI-integrated text editor to do? Depending on what needs to happen in your workflow, you can automate the copy-pasting and output logging with Python scripts and your engine's API.

Editing and auditing stories isn't that much different from auditing codebases. It all boils down to the understanding and correct use of language to convey abstraction. I bet tweaking some agentic personalities and goals in VSCode + Roo could get you somewhere.

[-] Womble@lemmy.world 1 points 3 days ago

Things like highlighting sections, asking the LLM to review something about them, including other files as context (worldbuilding, lore material, backstory, etc.), and easily inserting bits of the text back into the main body. As I said, I've used PyCharm with AI integration for this, but then you're using a code editor, which doesn't really have the features that would be nice for writing prose. I was wondering if there was anything off the shelf (or close to it) that combined the two.

[-] Smokeydope@lemmy.world 1 points 3 days ago* (last edited 3 days ago)

Have you by chance checked out the kobold.cpp Lite web UI? It allows some of what you're asking for: RAG for worldbuilding, adding images for the LLM to describe and work into the story, easy editing of input and output, and lots of customization in settings. I have a public instance of the Kobold web UI set up on my website, and I'm cool with fellow hobbyists using my compute to experiment with things. If you're interested in trying it out to see if it's more what you're looking for, feel free to send me a PM and I'll send you the address and an API key/password.

[-] Womble@lemmy.world 1 points 3 days ago

I haven't tried Kobold. I have tried SillyTavern, which I think is similar, but that wasn't really what I wanted, as I don't want to use the LLM as a character but as an editor.

this post was submitted on 23 Jun 2025
17 points (87.0% liked)

LocalLLaMA

2978 readers

Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge open source neural network technology together.

Get support from the community! Ask questions, share prompts, discuss benchmarks, get hyped at the latest and greatest model releases! Enjoy talking about our awesome hobby.

As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive constructive way.

Rules:

Rule 1 - No harassment or personal character attacks of community members, i.e. no name-calling, no generalizing entire groups of people that make up our community, no baseless personal insults.

Rule 2 - No comparing artificial intelligence/machine learning models to cryptocurrency, i.e. no comparing the usefulness of models to that of NFTs, no claiming the resource usage required to train a model is anything close to that of maintaining a blockchain or mining crypto, no implying it's just a fad/bubble that will leave people with nothing of value when it bursts.

Rule 3 - No comparing artificial intelligence/machine learning to simple text prediction algorithms, i.e. statements such as "LLMs are basically just simple text predictors like your phone keyboard's autocorrect, and they're still using the same algorithms as <over 10 years ago>."

Rule 4 - No implying that models are devoid of purpose or potential for enriching people's lives.

founded 2 years ago