How I feed my files to a local AI for better, more relevant responses
![How I feed my files to a local AI for better, more relevant responses](https://www.zdnet.com/a/img/resize/cb374b1dfd5371fcc10ec771d583da731116fa90/2025/02/11/9cb2a95c-f639-4db9-9945-dac913fdf8b6/how-to-use-local-files-to-train-ollama-with-msty-knowledge-stacks.jpg?auto=webp&fit=crop&height=675&width=1200)
For a few weeks, I’ve been using Msty for research purposes. One of the main reasons I chose this route is that I like the idea of keeping those interactions isolated to my local machine. Thanks to Msty and Ollama, that’s a fairly easy task.
One Msty/Ollama feature that has intrigued me is the ability to add your own content to what’s called a Knowledge Stack, which lets you integrate local data sources so the AI can deliver more relevant, contextual responses. Once you’ve created a Knowledge Stack, Msty indexes it with a separate model, and from then on it can serve as a smart assistant with knowledge of whatever you deem worth knowing.
Also: I tried Sanctum’s local AI app, and it’s exactly what I needed to keep my data private
Think about it this way: You’ve written several documents about a particular topic and you want to use them to fuel your AI chats. With Knowledge Stacks, this is not only possible — it’s easy.
You can add different items to a Knowledge Stack, such as:
- Files (PDF, CSV, MD, JSON, EPUB, DOCX, RTF, and TXT)
- Folders
- Obsidian vaults (from the Obsidian note-taking app)
- Notes
- YouTube transcripts
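Conceptually, a Knowledge Stack works like retrieval-augmented generation (RAG): your documents are indexed, and at chat time the chunks most similar to your question are handed to the model as context. Here's a toy, purely illustrative Python sketch of that retrieval step; it uses simple word overlap instead of a real embedding model, and it is not Msty's actual implementation:

```python
# Illustrative only: a toy retrieval step, NOT Msty's actual implementation.
# Msty indexes with a real embedding model; this uses bag-of-words overlap.
from collections import Counter
import math

def score(query: str, chunk: str) -> float:
    """Cosine similarity between bag-of-words vectors."""
    q, c = Counter(query.lower().split()), Counter(chunk.lower().split())
    dot = sum(q[w] * c[w] for w in set(q) & set(c))
    norm = math.sqrt(sum(v * v for v in q.values())) * math.sqrt(sum(v * v for v in c.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, chunks: list[str], top_k: int = 1) -> list[str]:
    """Return the top_k chunks most similar to the query."""
    return sorted(chunks, key=lambda ch: score(query, ch), reverse=True)[:top_k]

# A miniature "knowledge stack" of two document chunks
stack = [
    "Ollama runs large language models locally on your machine.",
    "The Linux kernel scheduler balances tasks across CPU cores.",
]
context = retrieve("How do I run a model locally with Ollama?", stack)
prompt = f"Context: {context[0]}\n\nQuestion: How do I run a model locally?"
print(context[0])
```

Msty's compose step builds a far more capable index with a dedicated embedding model, but the retrieve-then-prompt flow is the same basic idea.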
The process is much simpler than you might think and the results are impressive.
Let me show you how it works.
How to create your first Knowledge Stack
What you’ll need: To make this work, you’ll need both Ollama and Msty installed. You’ll also need some supported files to add (see the list above).
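Before you dive in, it's worth confirming Ollama is actually installed and on your PATH. A quick sanity check (assuming a default install; `ollama list` is a standard Ollama CLI command):

```shell
# Confirm the Ollama CLI is available and see which models you've pulled
if command -v ollama >/dev/null 2>&1; then
    ollama list    # shows locally pulled models
else
    echo "Ollama not found - install it from ollama.com first"
fi
```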
The first thing to do is open Msty and then click the Knowledge Stack button in the sidebar. On the first page of the pop-up, click Use Local AI Model and then, in the resulting window, click Add Your First Knowledge Stack.
Make sure to select Use Local AI Model for this.
Jack Wallen/ZDNET
In the next pop-up, give your Knowledge Stack a name, leave the model drop-down as is, and click Add.
Creating a Text Stack Knowledge Stack for ZDNET.
Jack Wallen/ZDNET
In the resulting window, you can either drag and drop documents or click Browse Files and locate the files in your default file manager. You can also add Obsidian Vaults, Folders, Notes, and/or YouTube transcripts. Once you’ve done that, click Save as Draft and then click Compose.
You can drag and drop or add documents manually.
Jack Wallen/ZDNET
Depending on how many documents you’ve added to your Knowledge Stack, the compose process can take some time. Wait for it to complete, then close the Knowledge Stack pop-up.
How to add your Knowledge Stack to a chat
With your Knowledge Stack added, it’s time to start chatting. It’s actually very simple. Click the New button (or hit Ctrl+N) to start a new chat. In the chat window, click the Knowledge Stacks button above the chat window (not the one in the sidebar).
Make sure to click the folder icon in the bottom toolbar.
Jack Wallen/ZDNET
From the pop-up menu, select your new Knowledge Stack to add it to the chat. Once you’ve done that, start chatting. Ask a question that would be answered in one of your documents, and you should get the expected answer.
Also: How I made Perplexity AI the default search engine in my browser (and why you should too)
I’ve run a few tests with Msty Knowledge Stacks, and every time I came away impressed. This feature is a great way to use your own documents/data for an AI chat, while ensuring it remains on your local machine and isn’t saved or used by a third party. It’s also pretty cool to see how many of your documents are cited in a chat.
As you can see, my work is being cited in the chat.
Jack Wallen/ZDNET
This should get you started with Msty Knowledge Stacks. Once you’re up to speed with how they work, I would suggest you poke around some of the more advanced features (like Similarity and Chunks).
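As a taste of what those settings control, here's a hypothetical sketch of fixed-size chunking with overlap, a common technique in RAG systems. Msty's actual chunking logic may differ, and the function name here is my own:

```python
# Hypothetical illustration of fixed-size chunking with overlap, a common
# RAG technique. Msty's real chunking settings may work differently.
def chunk_text(text: str, chunk_size: int = 50, overlap: int = 10) -> list[str]:
    """Split text into word-based chunks of chunk_size, each overlapping
    the previous chunk by `overlap` words, so context isn't cut mid-thought."""
    words = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
        if start + chunk_size >= len(words):
            break
    return chunks

# A 120-word document splits into three overlapping chunks
doc = " ".join(f"word{i}" for i in range(120))
chunks = chunk_text(doc, chunk_size=50, overlap=10)
print(len(chunks))  # 3
```

Smaller chunks make similarity matching more precise but give the model less surrounding context per match; the overlap keeps sentences from being split cleanly in two.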