🚀 Ollama Query Interface

Build #5 • Version 5.0.0 • Cookie-Based Storage • 2025-08-25

🎯 Quick Select Server

📡 Configuration

💡 Tip: Having connection issues? Check the Help tab for a complete setup guide.

💬 Query

📝 Response

Time to First Token: -
Total Time: -
Tokens/sec: -
Ready to query your Ollama instances...

🖥️ Ollama Servers

Click to use as template:

Local Mac/Linux: http://localhost:11434
Local Windows: http://localhost:11434
LAN (IP): http://192.168.1.100:11434
LAN (hostname): http://hostname.local:11434
Cloud (HTTPS): https://ollama.example.com
Cloud (custom port): https://api.example.com:8080

💾 Backup & Restore

Export your settings to a file or import from a previous backup.

🔄 Auto-refresh

๐Ÿช Cookie Settings

All settings are stored in browser cookies (expires in 1 year).

โ“ Complete Setup Guide

🚨 Most Common Issue: "Connection Failed"

If you're seeing connection errors, you've hit the #1 issue users face. Here's exactly how to fix it:

The Problem: By default, Ollama only accepts connections from the same computer (localhost), and browsers block web pages from reaching it because of cross-origin security restrictions (CORS).
The Solution: Configure Ollama to accept connections from your browser and, optionally, from other devices on your network.
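
You can confirm that CORS is the culprit before changing anything. With Ollama running, this check prints the relevant header only when browsers are allowed; no output means you need the fixes below:

curl -I http://localhost:11434/api/tags -H "Origin: http://example.com" | grep -i access-control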

๐ŸŽ Fix for macOS

Quick Fix (temporary; lasts until Ollama restarts)

  1. Quit Ollama - Click the Ollama icon in the menu bar and select "Quit Ollama"
  2. Open Terminal and run these commands:
    launchctl setenv OLLAMA_ORIGINS "*"
    launchctl setenv OLLAMA_HOST "0.0.0.0"
  3. Restart Ollama - Open Ollama from Applications
  4. Test it works:
    curl http://localhost:11434/api/tags
    You should see a JSON response with your models.

Permanent Fix (Survives restarts)

  1. Create a launch configuration:
    cat > ~/Library/LaunchAgents/com.ollama.server.plist << 'EOF'
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <key>Label</key>
        <string>com.ollama.server</string>
        <key>ProgramArguments</key>
        <array>
            <string>/Applications/Ollama.app/Contents/Resources/ollama</string>
            <string>serve</string>
        </array>
        <key>EnvironmentVariables</key>
        <dict>
            <key>OLLAMA_HOST</key>
            <string>0.0.0.0</string>
            <key>OLLAMA_ORIGINS</key>
            <string>*</string>
        </dict>
        <key>RunAtLoad</key>
        <true/>
        <key>KeepAlive</key>
        <true/>
    </dict>
    </plist>
    EOF
  2. Stop current Ollama and load new configuration:
    pkill -f "Ollama"
    launchctl load -w ~/Library/LaunchAgents/com.ollama.server.plist
  3. Verify it's working:
    curl -I http://localhost:11434/api/tags -H "Origin: file://" | grep -i access-control
    You should see: Access-Control-Allow-Origin: *

🪟 Fix for Windows

Setting Environment Variables

  1. Open System Properties:
    • Press Win + X and select "System"
    • Click "Advanced system settings"
    • Click "Environment Variables"
  2. Add OLLAMA_ORIGINS variable:
    • Click "New" under System variables
    • Variable name: OLLAMA_ORIGINS
    • Variable value: *
    • Click OK
  3. Add OLLAMA_HOST variable (for network access):
    • Click "New" under System variables
    • Variable name: OLLAMA_HOST
    • Variable value: 0.0.0.0
    • Click OK
  4. Restart Ollama:
    # In PowerShell (as Administrator)
    Stop-Process -Name "ollama" -Force
    Start-Process "C:\Program Files\Ollama\ollama.exe"
  5. Open Windows Firewall (for network access):
    # In PowerShell (as Administrator)
    New-NetFirewallRule -DisplayName "Ollama" -Direction Inbound -Protocol TCP -LocalPort 11434 -Action Allow
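  6. Verify it's working (mirroring the checks in the other OS sections; curl.exe ships with Windows 10 and later):
    curl.exe http://localhost:11434/api/tags
    You should see a JSON response with your models.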

๐Ÿง Fix for Linux

Systemd Service Configuration

  1. Edit Ollama service:
    sudo systemctl edit ollama
  2. Add these lines:
    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"
    Environment="OLLAMA_ORIGINS=*"
  3. Restart the service:
    sudo systemctl daemon-reload
    sudo systemctl restart ollama
  4. Open firewall (if using ufw):
    sudo ufw allow 11434/tcp
  5. Verify it's working:
    curl http://localhost:11434/api/tags

๐Ÿณ Fix for Docker

Running Ollama in Docker

  1. Run with proper environment variables:
    docker run -d \
      -v ollama:/root/.ollama \
      -p 11434:11434 \
      -e OLLAMA_ORIGINS="*" \
      -e OLLAMA_HOST="0.0.0.0" \
      --name ollama \
      ollama/ollama
  2. Or use docker-compose.yml:
    version: '3.8'
    services:
      ollama:
        image: ollama/ollama
        ports:
          - "11434:11434"
        environment:
          - OLLAMA_ORIGINS=*
          - OLLAMA_HOST=0.0.0.0
        volumes:
          - ollama:/root/.ollama
    volumes:
      ollama:
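  3. Verify the container (a quick sanity check; the container name ollama matches the examples above):
    docker ps --filter name=ollama
    curl http://localhost:11434/api/tags
    docker exec -it ollama ollama pull llama2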

โ˜๏ธ Connecting to Cloud/Remote Servers

To connect to Ollama running on a cloud server or VPS:

Requirements:

  • ✅ Ollama installed on the remote server
  • ✅ OLLAMA_HOST set to 0.0.0.0 on the server
  • ✅ OLLAMA_ORIGINS configured to allow your domain
  • ✅ Port 11434 open in the server firewall
  • ✅ (Optional) HTTPS proxy for secure connections
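
Once those requirements are met, a quick check from your own machine should list the server's models (replace ollama.example.com with your server's address):

curl https://ollama.example.com/api/tags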

Nginx Reverse Proxy Example:

server {
    listen 443 ssl;
    server_name ollama.example.com;

    ssl_certificate /path/to/cert.pem;
    ssl_certificate_key /path/to/key.pem;

    location / {
        proxy_pass http://localhost:11434;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # WebSocket support for streaming
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";

        # Longer timeouts for model generation
        proxy_read_timeout 300s;
        proxy_send_timeout 300s;
    }
}
⚠️ Security Warning: When exposing Ollama to the internet, always use HTTPS and consider implementing authentication. Never use OLLAMA_ORIGINS=* in production; specify the exact allowed origins instead.
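
One minimal option is HTTP basic authentication in front of the proxy. This is a sketch, not a full setup: youruser is a placeholder account, htpasswd comes from the apache2-utils package on Debian/Ubuntu, and the two auth_basic directives belong inside the location block above. Note that your client must then send credentials with every request.

# Create a password file with one account
sudo htpasswd -c /etc/nginx/.htpasswd youruser

# Add inside the location block shown above:
#   auth_basic "Ollama";
#   auth_basic_user_file /etc/nginx/.htpasswd;

# Check the config and reload nginx
sudo nginx -t && sudo systemctl reload nginx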

🧪 Testing Your Setup

After configuring Ollama, test that everything works:

1. Test Local Connection:

curl http://localhost:11434/api/tags

2. Test CORS Headers:

curl -I http://localhost:11434/api/tags -H "Origin: http://example.com"

Look for: Access-Control-Allow-Origin: * in the response

3. Test Network Access (from another device):

curl http://YOUR_IP:11434/api/tags

4. Pull a Model (if needed):

ollama pull llama2
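
5. Test Generation (optional):

Once a model is pulled, a non-streaming generate call exercises the whole pipeline (llama2 stands in for whichever model you pulled):

curl http://localhost:11434/api/generate -d '{"model": "llama2", "prompt": "Say hello", "stream": false}'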

💡 Pro Tips

  • 🔒 Security: Use specific origins instead of "*" in production
  • 🚀 Performance: Keep models loaded with ollama run modelname, or via the API's keep_alive parameter (see the example after this list)
  • 📊 Monitoring: Check logs at /tmp/ollama.log (macOS/Linux)
  • 🔄 Updates: Keep Ollama updated for the latest features and fixes
  • 💾 Models: Download models while on a fast connection for a smoother experience
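
A sketch of the keep_alive approach mentioned above: a request that names a model but sends no prompt loads it into memory and keeps it resident for the given duration (30m and llama2 are stand-ins for your own values):

curl http://localhost:11434/api/generate -d '{"model": "llama2", "keep_alive": "30m"}'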

🔧 Troubleshooting

🔍 Diagnostic Tools

Common Error Messages & Solutions

  • "Failed to fetch" or "Network error":
    Ollama is not running. Start it with ollama serve or open the Ollama app.
  • "CORS policy blocked":
    Set OLLAMA_ORIGINS environment variable. See Help tab for OS-specific instructions.
  • "Connection refused":
    Ollama is not listening on the network. Set OLLAMA_HOST=0.0.0.0
  • "No models found":
    Pull a model first: ollama pull llama2
  • "Timeout" errors:
    Model is downloading or system is slow. Wait for model to fully download.
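
The checks behind these messages can be run in one pass. This is a minimal sketch that assumes a default local install; adjust HOST if yours differs:

#!/bin/sh
# Quick Ollama diagnostics covering the errors listed above.
HOST="http://localhost:11434"

# 1. Is Ollama running and reachable?
curl -s --max-time 3 "$HOST/api/tags" > /dev/null \
  && echo "OK: Ollama is reachable" \
  || echo "FAIL: not reachable (start Ollama or check OLLAMA_HOST)"

# 2. Are browser origins allowed (CORS)?
curl -sI "$HOST/api/tags" -H "Origin: http://example.com" \
  | grep -qi "access-control-allow-origin" \
  && echo "OK: CORS header present" \
  || echo "FAIL: set OLLAMA_ORIGINS"

# 3. Is at least one model installed?
curl -s "$HOST/api/tags" | grep -q '"name"' \
  && echo "OK: models found" \
  || echo "FAIL: no models (try: ollama pull llama2)"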

📍 Check Your Configuration

🚑 Emergency Fixes

📊 Query History

Time | Server | Model | Query | TTFT (ms) | Total (ms) | Tokens/s