A Self-Hosted AI Automation Stack Running n8n, Local LLMs, and Docker for $0 per Month
A complete self-hosted automation stack combining n8n, Docker, and local LLMs that processes files, posts to blogs, and pushes to GitHub with zero ongoing API costs.
The Strategy
Commercial automation platforms like Zapier and Make charge per execution, and costs escalate quickly once workflows run at any meaningful volume. For developers comfortable with self-hosting, there is an alternative approach that eliminates recurring fees entirely: running the entire automation stack on local infrastructure. The tradeoff is setup complexity, but the result is unlimited executions at zero marginal cost.

This technical guide walks through building a complete self-hosted AI automation stack using open-source tools. The core workflow engine is n8n running in Docker, connected to local language models through LM Studio or Ollama. For tasks requiring more capable models, OpenRouter provides access to GPT-4 and other commercial APIs as an optional add-on rather than a requirement.

The automation pattern centers on a file-drop trigger. A markdown file placed into a designated folder triggers an n8n workflow that processes the content through MCP (a CLI-first AI prompt automation tool), runs it through either a local model or GPT-4 via OpenRouter, and then routes the cleaned output to downstream actions like posting to a blog, sending a Slack message, or pushing to a GitHub repository.

Supporting infrastructure includes Watchtower for automatic container updates, Portainer for a visual Docker management interface, and Cronitor for monitoring job execution health. The entire stack runs on a single server, with Docker Compose orchestrating all the containers. The author reports zero monthly costs unless external APIs like OpenRouter are used for specific tasks.
How It Works
Install Docker and Docker Compose on a server or local machine to provide the container runtime environment.
Deploy n8n as a Docker container to serve as the central workflow automation engine with a visual node editor.
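A minimal Compose sketch for the n8n service might look like the following. The `n8nio/n8n` image and port 5678 are n8n's upstream defaults; the volume paths and the `inbox` mount for the file-drop folder are assumptions for illustration, not taken from the original guide.

```yaml
# Minimal sketch — volume paths and the inbox mount are assumptions.
services:
  n8n:
    image: n8nio/n8n              # official n8n image
    restart: unless-stopped
    ports:
      - "5678:5678"               # n8n's default web UI port
    volumes:
      - ./n8n-data:/home/node/.n8n   # persist workflows and credentials
      - ./inbox:/data/inbox          # folder watched by the file-drop trigger
```

Starting it is then a single `docker compose up -d`, with the web editor reachable on port 5678.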
Set up LM Studio or Ollama as local LLM runners, enabling AI processing without external API dependencies or costs.
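Once Ollama is running, it exposes a local HTTP API on port 11434. A minimal Python sketch of calling its `/api/generate` endpoint with only the standard library — the model name is whatever you have pulled locally, and `generate` assumes the Ollama daemon is up:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(model: str, prompt: str) -> dict:
    """Build a request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama instance and return the reply."""
    body = json.dumps(build_payload(model, prompt)).encode()
    req = request.Request(OLLAMA_URL, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because this runs entirely on localhost, every call is free — which is what makes the $0-per-month figure possible.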
Optionally configure OpenRouter as a multi-model API wrapper for access to GPT-4 and other commercial models when local models are insufficient.
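OpenRouter exposes an OpenAI-compatible chat-completions endpoint, so a request can be assembled like any OpenAI-style call. A hedged sketch — the helper name is illustrative, the model slug in the comment is just an example, and the API key would come from your OpenRouter account:

```python
def openrouter_request(api_key: str, model: str, content: str) -> dict:
    """Assemble an OpenAI-style chat-completion request for OpenRouter."""
    return {
        "url": "https://openrouter.ai/api/v1/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",   # key from your OpenRouter account
            "Content-Type": "application/json",
        },
        "body": {
            "model": model,                         # e.g. "openai/gpt-4"
            "messages": [{"role": "user", "content": content}],
        },
    }
```

Keeping this behind a flag in the workflow means paid calls only happen when a task genuinely needs a frontier model.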
Install MCP (Mass Code Prompting) as a CLI-first tool for structured AI prompt automation that integrates with n8n workflow nodes.
Create file-drop trigger workflows in n8n: when a markdown file lands in a designated folder, n8n picks it up and begins processing.
Route the file content through MCP for AI processing using either local models or OpenRouter, generating cleaned output, code, or summaries.
Configure downstream action nodes in n8n to post results to a blog, send Slack notifications, or push commits to GitHub repositories.
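The downstream actions reduce to small, well-known calls: Slack incoming webhooks accept a minimal `{"text": ...}` JSON body, and a GitHub push is three git commands. A sketch under those assumptions — the function names are illustrative, and `push_to_github` assumes git credentials are already configured on the host:

```python
import json
import subprocess
from urllib import request

def slack_payload(text: str) -> dict:
    """Slack incoming webhooks accept a minimal {"text": ...} JSON body."""
    return {"text": text}

def notify_slack(webhook_url: str, text: str) -> None:
    """POST a message to a Slack incoming webhook URL."""
    body = json.dumps(slack_payload(text)).encode()
    req = request.Request(webhook_url, data=body,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req)

def push_to_github(repo_dir: str, message: str) -> None:
    """Commit everything in repo_dir and push — assumes credentials are set up."""
    subprocess.run(["git", "add", "-A"], cwd=repo_dir, check=True)
    subprocess.run(["git", "commit", "-m", message], cwd=repo_dir, check=True)
    subprocess.run(["git", "push"], cwd=repo_dir, check=True)
```

In the actual stack these live as n8n action nodes rather than hand-written scripts, but the calls underneath are the same.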
Deploy Watchtower for automatic Docker container updates, Portainer for visual container management, and Cronitor for job execution monitoring.
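A sketch of how the supporting services might sit alongside n8n in the same Compose file. The images (`containrrr/watchtower`, `portainer/portainer-ce`) and Watchtower's Docker-socket mount follow upstream defaults, but the exact layout is an assumption:

```yaml
# Sketch — service layout is an assumption; images and ports follow upstream defaults.
services:
  watchtower:
    image: containrrr/watchtower
    restart: unless-stopped
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock   # lets Watchtower update other containers
  portainer:
    image: portainer/portainer-ce
    restart: unless-stopped
    ports:
      - "9443:9443"                                  # Portainer web UI
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - portainer-data:/data
volumes:
  portainer-data:
```

Cronitor is a hosted service rather than a container, so it attaches to the stack via ping URLs called from within the workflows.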
Results
The complete stack runs at $0 per month when using only local models. External API costs through OpenRouter are optional and usage-based. The system provides unlimited workflow executions without the per-run pricing that commercial platforms charge. All components are open source and self-hosted on a single server using Docker Compose.
Our Take
We think this is the most practical self-hosting guide we have seen for developers who want to escape per-execution pricing on commercial automation platforms. The file-drop trigger pattern is elegant in its simplicity: no complex API integrations needed to get started, just drop a file and let the pipeline handle the rest. The inclusion of Watchtower, Portainer, and Cronitor shows operational maturity that most self-hosting tutorials skip entirely. The honest limitation is that this requires genuine Docker and DevOps comfort. This is not a weekend project for someone who has never touched a terminal. But for developers already running Docker in their workflow, adding n8n and local LLMs to the stack is a natural extension. Best suited for technical builders who run high-volume automations and want to eliminate the ongoing costs of Zapier or Make.