Jul 26, 2024
Kimi Lu
Motivation
Setting up Ollama to be accessible over a network can be tricky, but with the right configuration, you can seamlessly connect to the service API from both internal and external networks. In this blog, we'll guide you through the process of configuring your Ollama server to be accessible over the network, resolving common issues, and ensuring a smooth connection.
TLDR
To make Ollama accessible over the network, edit the Ollama service configuration to listen on all interfaces (0.0.0.0). Restart the service, test the connection using curl, and configure your firewall and router for external access. This setup enables powerful models like Llama 3.1 to be used locally and accessed from front-end clients (like Open-WebUI).
Background
Since the launch of Llama 3.1, the landscape of AI models has seen a significant shift. For the first time, an open-source/open-weight model is capable of reaching the performance levels of leading closed-source models such as GPT-4 Omni and Claude 3.5. This breakthrough has made high-performance AI more accessible to a wider audience, enabling developers to leverage powerful models without relying on proprietary solutions.
Ollama has emerged as one of the most user-friendly platforms for running these powerful models locally. It allows users to harness the capabilities of models like Llama 3.1 with ease. However, to fully utilize these models, especially in a networked environment, there are some common tricks and configurations needed to make your local machine accessible through the API service.
Step 1: Configuring Ollama on Different Operating Systems
Setting Environment Variables on macOS
If Ollama is run as a macOS application, environment variables should be set using launchctl:
For each environment variable, call launchctl setenv.
Restart the Ollama application.
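For example, to make the Ollama app listen on all interfaces (0.0.0.0 is the value used throughout this guide):

```shell
# Make the Ollama macOS app listen on all interfaces,
# then quit and reopen the Ollama application for it to take effect.
launchctl setenv OLLAMA_HOST "0.0.0.0"
```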
Setting Environment Variables on Linux
If Ollama is run as a systemd service, environment variables should be set using systemctl:
Edit the Ollama Service File: Open the Ollama service configuration file with the following command:
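Assuming the service is named ollama.service (the name used by the official install script), systemctl edit opens an override file for it:

```shell
# Open an editable override file for the Ollama systemd service
sudo systemctl edit ollama.service
```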
Add the Environment Variable: In the editor, add the following lines under the [Service] section:
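A minimal override that binds Ollama to all interfaces looks like this (the [Service] header must be present in the override file):

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```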
Note #1: Sometimes, 0.0.0.0 does not work due to your environment setup. Instead, you can try setting it to your local IP address, like 10.0.0.x or xxx.local, etc.
Note #2: You should put this above the line ### Lines below this comment will be discarded. It should look something like this:
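With the override in place, the editor buffer should look roughly like this; systemd keeps only the lines above the discard marker:

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0"

### Lines below this comment will be discarded
```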
Restart the Service: After editing the file, reload the systemd daemon and restart the Ollama service:
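```shell
# Reload unit files so the override is picked up, then restart Ollama
sudo systemctl daemon-reload
sudo systemctl restart ollama
```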
Setting Environment Variables on Windows
On Windows, Ollama inherits your user and system environment variables:
Quit Ollama by clicking on it in the taskbar.
Start the Settings (Windows 11) or Control Panel (Windows 10) application and search for environment variables.
Click on "Edit environment variables for your account".
Edit or create a new variable for your user account named OLLAMA_HOST.
Click OK/Apply to save.
Start the Ollama application from the Windows Start menu.
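If you prefer the command line, the same user-level variable can be set from PowerShell with setx (quit and restart Ollama afterwards so it picks up the change):

```shell
# Persist OLLAMA_HOST for the current Windows user account
setx OLLAMA_HOST "0.0.0.0"
```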
Step 2: Testing the Connection
To test if the Ollama server is accessible over the network, use a curl command from a client system.
Example curl Command
Replace 192.168.1.105 with the IP address of your server:
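For example, to hit the generate endpoint (the model name llama3.1 is an assumption here; use whichever model you have pulled):

```shell
# Request a completion from the remote Ollama server
curl http://192.168.1.105:11434/api/generate \
  -d '{"model": "llama3.1", "prompt": "Why is the sky blue?", "stream": false}'
```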
Troubleshooting Common Issues
404 Not Found Error: If you receive a 404 error, it means the server is receiving the request but cannot process it. Check the service logs for more details:
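On a Linux host running Ollama under systemd, the service logs can be followed with journalctl:

```shell
# Follow the Ollama service logs in real time
journalctl -u ollama -f
```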
Connection Refused: If the connection is refused, ensure that the service is running and that the firewall rules allow traffic on port 11434.
Step 3: Configuring External Access
To make Ollama accessible from the internet, you need to ensure that your network allows incoming traffic on port 11434. This typically involves configuring port forwarding on your router and updating firewall rules.
Configuring Port Forwarding
Log in to your router's admin interface.
Locate the Port Forwarding section.
Add a new rule to forward traffic from port 11434 to the IP address of your Ollama server.
Updating Firewall Rules
On your server, ensure that the firewall allows traffic on port 11434:
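On Ubuntu-style systems with ufw, for example (firewalld users would use firewall-cmd instead):

```shell
# Allow inbound TCP traffic to Ollama's default port
sudo ufw allow 11434/tcp
```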
Conclusion
By following these steps, you can configure your Ollama server to be accessible over the internal network and the internet on Linux, macOS, and Windows. Ensure that you test the connection thoroughly and troubleshoot any issues using the provided commands. With the correct setup, you can leverage the powerful capabilities of Ollama from any network location.
Feel free to share your experiences and any challenges you face in the comments below. Happy coding!