u/Alexandiego

r/ollama

Hi! I’m building a small local app for my work and I’m a bit unsure about the best approach.

The app is a simple Python CRUD that stores and exports notes. I want to add a local LLM (Llama 3 with Ollama) to help rewrite and structure text.

My question is where to put the instructions for the model.

Option 1:

Send a long prompt from Python on every request, containing all the rules (tone, style, format, etc.) plus the text to process.
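Roughly, Option 1 could look like this (a minimal sketch assuming the official `ollama` Python client; the rule text and note are just placeholders):

```python
# Option 1 sketch: the full rule set travels with every request
# as a system message, followed by the note to process.

RULES = (
    "You rewrite user notes.\n"
    "- Keep a neutral, professional tone.\n"
    "- Return Markdown with a short title and bullet points.\n"
    "- Never invent facts that are not in the note.\n"
)

def build_messages(note: str) -> list[dict]:
    """Bundle the permanent rules and the note into one chat payload."""
    return [
        {"role": "system", "content": RULES},
        {"role": "user", "content": note},
    ]

# With the `ollama` package installed and the server running:
# import ollama
# reply = ollama.chat(model="llama3", messages=build_messages("raw note text"))
# print(reply["message"]["content"])
```

The cost here is that the same rule block is re-sent (and re-tokenized into the context) on every single call.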

Option 2:

Create a Modelfile in Ollama with all the permanent instructions, and then send smaller task-specific prompts from Python.
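And Option 2 could look something like this (a sketch; the model name `notes-rewriter` and the rules are placeholders):

```
# Hypothetical Modelfile: permanent instructions baked into a custom model.
FROM llama3
PARAMETER temperature 0.3
SYSTEM """
You rewrite user notes.
- Keep a neutral, professional tone.
- Return Markdown with a short title and bullet points.
- Never invent facts that are not in the note.
"""
```

After `ollama create notes-rewriter -f Modelfile`, the Python side only sends the note itself, e.g. `ollama.chat(model="notes-rewriter", messages=[{"role": "user", "content": note}])`.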

For people who have built real apps with Ollama, what works better in practice?

Is the Modelfile approach worth it for maintainability and performance, or is it overkill for a small local project?

I’d really appreciate any real-world advice or examples.

u/Alexandiego — 8 days ago