010. Proxying Ollama Computers with Caddy

In the early days of computing, access to powerful machines was a privilege. Universities and labs relied on mainframes shared across dozens of terminals, each user submitting tasks to a single, central system. The concept of “remote computing” was born not out of luxury, but necessity — nobody had the resources for their own mainframe, so you accessed what you needed through a network of connections.

Fast forward a few decades, and we see history echoing in our personal AI experiments. Tools like Ollama turn your Mac or PC into a powerful inference engine, yet much like the mainframes of the past, you may not want to dedicate it entirely to one task every day. Instead, you broker controlled access through small, clever gateways: this time with a Debian VPS, Tailscale, and Caddy. The idea is the same: make compute power available when needed, without opening everything to the world.

Why not expose Ollama directly?

Running Ollama locally is simple, but making it safely available over the internet isn't. If you simply bind Ollama to 0.0.0.0 and forward a port on your router, you will face:

  • Public attacks, with no real protection.
  • Certificates and HTTPS hassles.
  • Zero control over who can connect.

In other words, the same set of risks that have always plagued self-hosters when rushing to expose services in the clear.

Instead, we apply the same philosophy we’ve used in earlier posts: never expose your LAN directly. Always proxy through a trusted external node.

The Stack

  • Tailscale – VPN between the VPS and your Ollama machine.
  • Caddy Server – reverse proxy with automatic Let's Encrypt certificates.
  • Debian VPS – lightweight external node, serving as your internet-facing entrypoint.

Optional but helpful: define ACLs in Tailscale to ensure the VPS can only talk to your Ollama machine, nothing else.

Architecture Overview

Scenario: Public Proxy to Ollama

     [Client Browser]
             │
  https://ollama.yourdomain.com
             │
    ┌───────────────┐
    │ VPS + Caddy   │
    └───────────────┘
             │ (Tailscale)
  ┌─────────────────────────┐
  │  Mac/PC running Ollama  │
  └─────────────────────────┘

The VPS becomes your only entrypoint, and it forwards authenticated, encrypted requests back to your Ollama machine only when you decide to run it.

Step-by-Step Guide

1. Set up Tailscale on the VPS

Tailscale setup (see post 007):

curl -fsSL https://tailscale.com/install.sh | sh

sudo tailscale up

Add a label to your VPS in the Tailscale dashboard for easy management.

2. Set up Tailscale on your Mac/PC with Ollama

Install Tailscale from their official clients. Once authenticated:

tailscale up

Confirm you see your Mac listed in the Tailscale admin panel.
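
One gotcha worth flagging: Ollama binds to 127.0.0.1 by default, so the VPS won't be able to reach it over Tailscale until you tell it to listen on other interfaces. A configuration sketch, following Ollama's documented OLLAMA_HOST setting (adjust paths and service names for your install):

```shell
# Make Ollama listen on all interfaces (default is 127.0.0.1 only).

# macOS (Ollama.app): set the variable, then restart the app
launchctl setenv OLLAMA_HOST "0.0.0.0"

# Linux (systemd service): add the same variable to the unit
#   sudo systemctl edit ollama.service
#     [Service]
#     Environment="OLLAMA_HOST=0.0.0.0"
#   sudo systemctl restart ollama
```

Binding to 0.0.0.0 is exactly why the Tailscale ACLs in the next step matter: they ensure only the VPS, not the whole tailnet or LAN, can reach that port.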

3. Configure ACLs

Restrict the VPS to only be able to reach Ollama’s port (default 11434).

Example ACL:

{
  "acls": [
    {
      "action": "accept",
      "src": ["vps-tailnet-ip"],
      "dst": ["mac-ollama-ip:11434"]
    }
  ]
}


This ensures the VPS can only connect to Ollama.

4. Configure Caddy on the VPS

Install Caddy on Debian:

sudo apt install -y debian-keyring debian-archive-keyring apt-transport-https curl

curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/gpg.key' | sudo gpg --dearmor -o /usr/share/keyrings/caddy-stable-archive-keyring.gpg

curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/debian.deb.txt' | sudo tee /etc/apt/sources.list.d/caddy-stable.list

sudo apt update

sudo apt install caddy

Edit /etc/caddy/Caddyfile:

ollama.yourdomain.com {
    reverse_proxy 100.101.102.50:11434  # Tailscale IP of your Mac
}

Reload the Caddy service:

sudo systemctl reload caddy

Now, https://ollama.yourdomain.com proxies straight to your Mac, but only via the secure Tailscale link.
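
To sanity-check the chain end to end, you can query Ollama's /api/tags endpoint through the proxy. A minimal Python sketch; the domain is a placeholder, and the helper assumes the standard /api/tags response shape (a JSON object with a "models" list):

```python
import json

# Placeholder: substitute your own proxied domain.
BASE = "https://ollama.yourdomain.com"

def list_models(body: bytes) -> list[str]:
    """Pull model names out of an /api/tags JSON response body."""
    return [m["name"] for m in json.loads(body)["models"]]

# To fetch for real (once the proxy and Ollama are up):
#   import urllib.request
#   with urllib.request.urlopen(f"{BASE}/api/tags") as resp:
#       print(list_models(resp.read()))
```

If this returns your installed models, every hop (DNS, TLS, Caddy, Tailscale, Ollama) is working.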

Optional: Add Basic Auth

If you don’t want your Ollama endpoint to be wide open, add lightweight Basic Auth to your Caddy config:

ollama.yourdomain.com {
    basic_auth {
        user JDJhJDEwJGxqT...  # bcrypt password hash, not the plain password
    }
    reverse_proxy 100.101.102.50:11434
}

Generate the hash in a terminal:

caddy hash-password
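
From a client's side, those credentials travel in an Authorization header. A sketch in Python of what a caller would send; the endpoint, user, and password are placeholders, and the request body follows Ollama's /api/generate schema:

```python
import base64
import json

# Placeholders: substitute your own values.
ENDPOINT = "https://ollama.yourdomain.com/api/generate"
USER, PASSWORD = "user", "change-me"

def basic_auth_header(user: str, password: str) -> str:
    """Build the Authorization value that Caddy's basic_auth checks."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

def build_request(prompt: str, model: str = "llama3") -> tuple[bytes, dict]:
    """Prepare body and headers for a non-streaming /api/generate call."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    headers = {
        "Content-Type": "application/json",
        "Authorization": basic_auth_header(USER, PASSWORD),
    }
    return body, headers

# To send for real, pass ENDPOINT plus the body and headers to
# urllib.request.Request(...) and urlopen(...).
```

Note that Basic Auth is only safe here because Caddy terminates HTTPS: the credentials are base64-encoded, not encrypted, so never use them over plain HTTP.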

Edge Use Case

One thing worth highlighting about this use case: your endpoint is not only externally available, it is also dynamic. The setup may not be fully automated, but having a single public (or not-so-public) endpoint for Ollama, which you can route to any computer on your network, lets you swap hosts transparently to the applications that consume it.
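
One way to sketch that flexibility in Caddy itself is to list several tailnet upstreams and let Caddy fail over between them. The IPs below are placeholders; lb_policy and fail_duration are standard reverse_proxy load-balancing options:

```
ollama.yourdomain.com {
    reverse_proxy 100.101.102.50:11434 100.101.102.51:11434 {
        lb_policy first       # always prefer the first upstream
        fail_duration 30s     # mark a failed host down for 30s before retrying
    }
}
```

With this, turning off one Ollama machine and booting another needs no client-side changes at all.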

Conclusion

This pattern repeats since the dawn of computing: resources are scarce, networks are risky, and trust must be mediated. Just like mainframes in the 70s, or personal webservers in the 90s, today’s homelabs and AI machines work best when proxied, gated, and controlled.

With Debian, Tailscale, and Caddy, you can make your Ollama-powered machine(s) available on-demand, without leaving your LAN open to the chaos of the internet.

Simple. Secure. Flexible. Exactly the way #selfhosting should be.

Content published under: Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.