For a local chatbot, you could download Ollama and then pull the various models through it (it has DeepSeek and Qwen, among others). Ollama runs the models locally. Then you can use either Chatbox or Msty as a front-end for your LLMs. Msty has a feature called 'Knowledge Stacks' that lets you load your core rulebooks, plus PDFs or EPUBs of other stories you want inspiration from, into a database your models can reference across different conversations. Chatbox has a similar feature called Knowledge Bases that allegedly does the same thing, but I could not get it to work.
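For reference, the basic Ollama workflow from the terminal looks roughly like this (the model tags below are just examples of names Ollama has offered; check Ollama's model library for what's currently available):

```shell
# Rough sketch, assuming Ollama is already installed.
ollama pull deepseek-r1   # download a DeepSeek model
ollama pull qwen2.5       # download a Qwen model
ollama run qwen2.5        # chat with it right in the terminal
```

Front-ends like Msty and Chatbox then talk to Ollama's local API (by default at http://localhost:11434), so as long as Ollama is running in the background, they should see whatever models you've pulled.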