How to Run Ollama and Connect to the Service API Through Internal Network or Internet

Jul 26, 2024

Kimi Lu

Motivation

Setting up Ollama to be accessible over a network can be tricky, but with the right configuration, you can seamlessly connect to the service API from both internal and external networks. In this blog, we'll guide you through the process of configuring your Ollama server to be accessible over the network, resolving common issues, and ensuring a smooth connection.

TLDR

To make Ollama accessible over the network, edit the Ollama service configuration to listen on all interfaces (0.0.0.0). Restart the service, test the connection using curl, and configure your firewall and router for external access. This setup enables powerful models like Llama 3.1 to be used locally and accessed from front-end clients (like Open-WebUI).

Background

Since the launch of Llama 3.1, the landscape of AI models has seen a significant shift. For the first time, an open-source/open-weight model is capable of reaching the performance levels of leading closed-source models such as GPT-4 Omni and Claude 3.5. This breakthrough has made high-performance AI more accessible to a wider audience, enabling developers to leverage powerful models without relying on proprietary solutions.

Ollama has emerged as one of the most user-friendly platforms for running these powerful models locally. It allows users to harness the capabilities of models like Llama 3.1 with ease. However, to fully utilize these models, especially in a networked environment, there are some common tricks and configurations needed to make your local machine accessible through the API service.

Step 1: Configuring Ollama on Different Operating Systems

Setting Environment Variables on macOS

If Ollama is run as a macOS application, environment variables should be set using launchctl:

  1. For each environment variable, call launchctl setenv:

    launchctl setenv OLLAMA_HOST "0.0.0.0"
  2. Restart the Ollama application.
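You can confirm the variable is set before or after the restart by reading it back with launchctl getenv (a standard launchctl subcommand):

launchctl getenv OLLAMA_HOST   # should print: 0.0.0.0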

Setting Environment Variables on Linux

If Ollama is run as a systemd service, environment variables should be set using systemctl:

  1. Edit the Ollama Service File: Open the Ollama service configuration file with the following command:

    sudo systemctl edit ollama.service
  2. Add the Environment Variable: In the editor, add the following lines under the [Service] section:

    [Service]
    
    Environment="OLLAMA_HOST=0.0.0.0"
    • Note #1: Sometimes 0.0.0.0 does not work due to your environment setup. Instead, you can try setting it to your machine's local IP address (such as 10.0.0.x) or hostname (such as xxx.local).

    • Note #2: Put these lines above the comment ### Lines below this comment will be discarded. The file should look something like this:

    ### Editing /etc/systemd/system/ollama.service.d/override.conf
    ### Anything between here and the comment below will become the new contents of the file
    
    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"
    
    ### Lines below this comment will be discarded
    
    ### /etc/systemd/system/ollama.service
    # [Unit]
    # Description=Ollama Service
    # After=network-online.target
    #
    # [Service]
    # ExecStart=/usr/local/bin/ollama serve
    # User=ollama
    # Group=ollama
    # Restart=always
    # RestartSec=3
    # Environment="PATH=/home/kimi/.nvm/versions/node/v20.5.0/bin:/home/kimi/.local/share/pnpm:/usr/local/sbin:/usr/local/bin:/usr/s>
    #
    # [Install]
    # WantedBy=default.target
  3. Restart the Service: After editing the file, reload the systemd daemon and restart the Ollama service:

    sudo systemctl daemon-reload
    sudo systemctl restart ollama
    
    
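As an optional sanity check (systemctl show is a standard systemd command), you can confirm the override is active; the output should include the variable you just added:

# Should print a line containing OLLAMA_HOST=0.0.0.0
systemctl show ollama --property=Environment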

Setting Environment Variables on Windows

On Windows, Ollama inherits your user and system environment variables:

  1. Quit Ollama by clicking on it in the taskbar.

  2. Start the Settings (Windows 11) or Control Panel (Windows 10) application and search for environment variables.

  3. Click on "Edit environment variables for your account".

  4. Edit or create a new user variable named OLLAMA_HOST and set its value to 0.0.0.0.

  5. Click OK/Apply to save.

  6. Start the Ollama application from the Windows Start menu.
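If you prefer the command line, the same user-level variable can be set from a Command Prompt using setx (a built-in Windows command); as with the GUI route, restart Ollama afterwards so it picks up the change:

setx OLLAMA_HOST "0.0.0.0"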

Step 2: Testing the Connection

To test if the Ollama server is accessible over the network, use a curl command from a client system.

Example curl Command

Replace 192.168.1.105 with the IP address of your server:

curl http://192.168.1.105:11434/api/generate -d '
{
  "model": "tinyllama",
  "prompt": "Why is the sky blue?",
  "stream": false,
  "options": {
    "num_thread": 8,
    "num_ctx": 2048
  }
}'
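Because "stream" is set to false, the server replies with a single JSON object rather than a stream of chunks. Abridged, and with illustrative values, a successful response looks roughly like this:

{
  "model": "tinyllama",
  "created_at": "2024-07-26T12:00:00.000000Z",
  "response": "The sky appears blue because of Rayleigh scattering...",
  "done": true
}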

Troubleshooting Common Issues

  1. 404 Not Found Error: If you receive a 404 error, it means the server is receiving the request but cannot process it. Check the service logs for more details:

    sudo journalctl -u ollama.service -f
  2. Connection Refused: If the connection is refused, ensure that the service is running and that the firewall rules allow traffic on port 11434.
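A quick way to tell these cases apart is to check which address the service is actually bound to (ss ships with iproute2 on most distributions):

# 127.0.0.1:11434 means the OLLAMA_HOST override did not take effect;
# 0.0.0.0:11434 (or *:11434) means Ollama is listening on all interfaces.
sudo ss -tlnp | grep 11434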

Step 3: Configuring External Access

To make Ollama accessible from the internet, you need to ensure that your network allows incoming traffic on port 11434. This typically involves configuring port forwarding on your router and updating firewall rules.

Configuring Port Forwarding

  1. Log in to your router's admin interface.

  2. Locate the Port Forwarding section.

  3. Add a new rule to forward traffic from port 11434 to the IP address of your Ollama server.

Updating Firewall Rules

On your server, ensure that the firewall allows traffic on port 11434:

sudo ufw allow 11434/tcp
sudo ufw reload
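You can then verify the rule is active; ufw status should list the port:

# Expected to show a line like: 11434/tcp  ALLOW  Anywhere
sudo ufw status | grep 11434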

Conclusion

By following these steps, you can configure your Ollama server to be accessible over the internal network and the internet on Linux, macOS, and Windows. Ensure that you test the connection thoroughly and troubleshoot any issues using the provided commands. With the correct setup, you can leverage the powerful capabilities of Ollama from any network location.

Feel free to share your experiences and any challenges you face in the comments below. Happy coding!
