Hosting Dify + Ollama on a Mac Mini M4

  • Overview

    The architecture for hosting Dify + Ollama on a Mac Mini M4 uses Nginx Proxy Manager as the reverse proxy and No-IP for dynamic IP resolution. The steps:

    1. Install Docker & Docker Compose

    • Install Docker Desktop for macOS (Apple silicon/ARM build); it bundles Docker Compose
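
    If you prefer the command line, Homebrew can install it too (a quick sketch; the docker cask is Docker Desktop itself):

    brew install --cask docker    # Docker Desktop, which bundles Docker Compose
    open -a Docker                # launch it once so the daemon starts
    docker compose version        # confirm Compose is available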

    2. Configure No‑IP DDNS on Router

    1. Sign up at No‑IP.com and create a hostname (e.g., mydify.ddns.net)
    2. In the router's admin UI (typically under WAN → DDNS or Advanced → Dynamic DNS):
      • Select No‑IP as the provider
      • Enter your username/email, password or DDNS key, and the hostname
      • Save and apply; the router will then keep the public IP updated automatically
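
    To confirm the record is tracking your public IP, compare what the hostname resolves to against your actual WAN address (using the example hostname from above):

    dig +short mydify.ddns.net       # IP the DDNS hostname resolves to
    curl -s https://ifconfig.me      # your current public IP
    # the two should match once the router has pushed an update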

    3. Deploy NGINX Proxy Manager

    Create a folder (e.g., ~/nginx-proxy-manager) and add a docker-compose.yml:

    version: '3'
    services:
      nginx-proxy-manager:
        image: jc21/nginx-proxy-manager:latest
        restart: unless-stopped
        ports:
          - "80:80"   # HTTP traffic and Let's Encrypt HTTP-01 validation
          - "443:443" # HTTPS traffic
          - "81:81"   # admin UI
        volumes:
          - ./data:/data
          - ./letsencrypt:/etc/letsencrypt

    Then start it:

    cd ~/nginx-proxy-manager
    docker compose up -d
    • Open the admin UI at http://<mac-static-ip>:81 (default login admin@example.com / changeme; change it on first login)
    • On the router, forward ports 80 and 443 to the Mac so Let's Encrypt validation and external traffic can reach NPM
    • Request SSL certificates in the UI via Let's Encrypt
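
    A quick sanity check that the proxy is up before moving on (run from ~/nginx-proxy-manager):

    docker compose ps              # the container should show as running
    curl -I http://localhost:81    # expect an HTTP 200 from the admin UI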

    4. Install & Run Dify

    git clone https://github.com/langgenius/dify.git
    cd dify/docker
    cp .env.example .env
    # edit .env: set EXPOSE_NGINX_PORT=5001 so Dify's bundled nginx
    # doesn't clash with NPM, which already holds port 80
    docker compose up -d

    The Dify UI is now reachable on port 5001 (via the EXPOSE_NGINX_PORT setting above).
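
    To verify locally before wiring up the proxy (assuming the EXPOSE_NGINX_PORT change above):

    curl -I http://localhost:5001    # Dify's nginx should answer with a 2xx/3xx response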

    5. Create Proxy Host in NGINX Proxy Manager

    In the NPM web UI:

    • Domain Names:
      • mydify.ddns.net
    • Forward to http://host.docker.internal:5001 (Scheme: http)
    • Enable SSL with Let’s Encrypt
    • Force HTTP → HTTPS

    This makes the Dify instance securely reachable from the internet.
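
    Once the certificate is issued, an end-to-end check from any machine outside the LAN (hostname as configured above):

    curl -I https://mydify.ddns.net   # expect HTTP 200 with a valid Let's Encrypt cert
    curl -I http://mydify.ddns.net    # expect a 301 redirect to HTTPS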

    6. Set Up Ollama LLM

    brew install ollama
    ollama pull llama3.2:3b                  # matches the model configured in Dify below
    ollama pull deepseek-r1:8b
    OLLAMA_HOST=0.0.0.0:11434 ollama serve   # listen on all interfaces, not just localhost
    

    This makes Ollama listen on port 11434 on all network interfaces, so Docker containers can reach it via host.docker.internal.
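
    A quick check that the API is answering; Ollama's /api/tags endpoint lists every model you've pulled:

    curl http://localhost:11434/api/tags   # should include llama3.2:3b and deepseek-r1:8b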

    7. Connect Ollama to Dify

    In the Dify dashboard (Settings → Model Providers), add Ollama as a provider:

    • Model Name: llama3.2:3b
    • Base URL: http://host.docker.internal:11434
    • Type: Chat
    • Max Tokens: 4096

    Save it; Dify will detect and list llama3.2:3b under the available models.
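
    If the model doesn't appear, first confirm that a container can actually reach Ollama through Docker Desktop's host.docker.internal alias (a throwaway curl container keeps this check independent of what's installed in Dify's images):

    docker run --rm curlimages/curl -s http://host.docker.internal:11434/api/tags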

    8. Restart & Verify

    cd ~/nginx-proxy-manager && docker compose restart
    cd ~/dify/docker && docker compose restart
    

    Now go to https://mydify.ddns.net, log in, and start building AI workflows!