Configuring Ollama for LAN Access on macOS: Complete Guide
With AI applications becoming increasingly popular, Ollama has become a favorite tool among developers for running local large language models. However, by default, Ollama only listens on the local loopback address and cannot be accessed by other devices on the LAN. This article will detail how to configure Ollama on macOS to support LAN access.
Problem Background
By default, the Ollama service binds only to `127.0.0.1:11434`, which means only the local machine can access it. If you want to use the Ollama service from other devices on the LAN (such as phones, tablets, or other computers), you need to configure Ollama to listen on all network interfaces.
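You can confirm the default behavior before changing anything. On a stock install, `lsof` shows the listener bound to loopback only (a quick sketch; exact output varies by version):

```bash
# Inspect the current listener; a default install binds to loopback only
lsof -i :11434
# Typical default output ends with something like:
#   TCP localhost:11434 (LISTEN)
```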
Core Solution
To make Ollama listen on all network interfaces, the key is to set the environment variable `OLLAMA_HOST=0.0.0.0:11434`. Depending on how you start Ollama, there are two main configuration approaches.
Method 1: GUI Application Startup Configuration
If you're used to starting the service by double-clicking the Ollama application icon, it's recommended to use `launchctl` to set the environment variable, since GUI applications do not read your shell configuration files.
Setup Steps
- Set the environment variable using `launchctl`:

```bash
launchctl setenv OLLAMA_HOST "0.0.0.0:11434"
```

- Restart the Ollama application (a terminal alternative is sketched after these steps):
  - Quit the Ollama application completely
  - Relaunch it from Launchpad or the Applications folder

- Verify the configuration:

```bash
# Check if the service is listening on all interfaces
lsof -i :11434
```

You should see output similar to:

```
COMMAND   PID   USER  FD  TYPE  DEVICE             SIZE/OFF  NODE  NAME
ollama    12345 user  8u  IPv4  0x1234567890abcdef 0t0       TCP   *:11434 (LISTEN)
```
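If you prefer the terminal, a small sketch like this restarts the app (assuming it is installed as Ollama.app in /Applications):

```bash
# Quit the running Ollama app gracefully, then relaunch it
osascript -e 'quit app "Ollama"'
sleep 2
open -a Ollama
```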
Making It Permanent
The `launchctl setenv` command applies only until you log out or reboot. To make the setting permanent:
- Create the LaunchAgents directory if it doesn't already exist:

```bash
mkdir -p ~/Library/LaunchAgents
```

- Create the plist file (note the absolute `/bin/sh` path, which launchd prefers):

```bash
cat > ~/Library/LaunchAgents/com.ollama.environment.plist << 'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.ollama.environment</string>
    <key>ProgramArguments</key>
    <array>
        <string>/bin/sh</string>
        <string>-c</string>
        <string>launchctl setenv OLLAMA_HOST 0.0.0.0:11434</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
</dict>
</plist>
EOF
```

- Load the launch agent:

```bash
launchctl load ~/Library/LaunchAgents/com.ollama.environment.plist
```

This agent re-runs `launchctl setenv` at each login, so the variable is already in place when you start the Ollama app.
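After the agent has run (at the next login, or once you load it manually), you can confirm the variable is visible in your launchd session:

```bash
# Confirm the variable is set in the launchd user context
launchctl getenv OLLAMA_HOST
# Expected output: 0.0.0.0:11434
```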
Method 2: Command Line Startup Configuration
If you prefer starting Ollama from the command line, you can set the environment variable directly.
Temporary Configuration
```bash
# Set the environment variable and start Ollama
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```
Permanent Configuration
Add the environment variable to your shell configuration file:
For Bash users:

```bash
echo 'export OLLAMA_HOST="0.0.0.0:11434"' >> ~/.bash_profile
source ~/.bash_profile
```

For Zsh users (the default on macOS Catalina and later):

```bash
echo 'export OLLAMA_HOST="0.0.0.0:11434"' >> ~/.zshrc
source ~/.zshrc
```

For Fish users:

```bash
echo 'set -gx OLLAMA_HOST "0.0.0.0:11434"' >> ~/.config/fish/config.fish
```

Note that these files only affect `ollama serve` sessions started from a shell; the GUI application does not read them, so use Method 1 for the app.
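Whichever shell you use, open a new terminal and confirm the variable is set before starting the server:

```bash
# Verify the variable in a fresh shell, then start the server
echo "$OLLAMA_HOST"   # should print 0.0.0.0:11434
ollama serve
```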
Verification and Testing
1. Check Service Status
```bash
# Check if Ollama is listening on all interfaces
netstat -an | grep 11434
```

Expected output:

```
tcp4       0      0  *.11434                *.*                    LISTEN
```
2. Test Local Access
```bash
curl http://localhost:11434/api/version
```
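If the service is up, the response is a small JSON object; the exact version number will differ on your machine:

```
{"version":"0.1.32"}
```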
3. Test LAN Access
From another device on the same network:
```bash
# Replace YOUR_MAC_IP with your Mac's IP address (see step 4 below)
curl http://YOUR_MAC_IP:11434/api/version
```
4. Find Your Mac’s IP Address
```bash
# Get IP address
ifconfig | grep "inet " | grep -v 127.0.0.1
```
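If you know your primary interface (usually `en0` on modern Macs), macOS's `ipconfig` can print just the address:

```bash
# Print the IPv4 address of a specific interface
ipconfig getifaddr en0
```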
Security Considerations
Firewall Configuration
The macOS firewall might block incoming connections. To allow access:
- Open System Settings → Network → Firewall (on older versions: System Preferences → Security & Privacy → Firewall)
- Click "Options" ("Firewall Options" on older versions)
- Add Ollama to the allowed applications list
Or configure via command line:
```bash
# Add a firewall rule for Ollama (adjust the binary path if your installation differs)
sudo /usr/libexec/ApplicationFirewall/socketfilterfw --add /Applications/Ollama.app/Contents/MacOS/ollama
sudo /usr/libexec/ApplicationFirewall/socketfilterfw --unblock /Applications/Ollama.app/Contents/MacOS/ollama
```
Network Security
Important Security Notes:
- Only enable LAN access on trusted networks
- Ollama has no built-in authentication; if you need access control, place an authenticating reverse proxy in front of it
- Monitor access logs regularly
- Disable LAN access when not needed
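For the last point, reverting Method 1 is a one-liner (then quit and relaunch the app):

```bash
# Revert to the default loopback-only binding
launchctl unsetenv OLLAMA_HOST
# Quit and relaunch the Ollama app for the change to take effect
```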
Troubleshooting
Common Issues and Solutions
Issue 1: Environment variable not taking effect
```bash
# Check current environment variables
env | grep OLLAMA
```
Solution: Ensure you’ve restarted the application after setting the environment variable.
Issue 2: Port already in use
```bash
# Check what's using port 11434
lsof -i :11434
```
Solution: Kill the conflicting process or use a different port.
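For example, assuming `lsof` reported PID 12345 (a hypothetical value), you could stop the conflicting process like this:

```bash
# Stop the process holding the port; replace 12345 with the PID from lsof
kill 12345
```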
Issue 3: Cannot access from other devices
- Check firewall settings
- Verify network connectivity
- Ensure devices are on the same subnet
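A quick reachability check from the client device helps narrow this down; `nc` ships with macOS and most Linux distributions:

```bash
# From the client device: test basic reachability, then the specific port
ping -c 3 YOUR_MAC_IP
nc -vz YOUR_MAC_IP 11434
```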
Advanced Configuration
Custom Port Configuration:

```bash
# Use a custom port
export OLLAMA_HOST="0.0.0.0:8080"
```

Binding to a Specific Interface:

```bash
# Bind to a single interface instead of all of them
export OLLAMA_HOST="192.168.1.100:11434"
```
Performance Optimization
Resource Monitoring
```bash
# Monitor Ollama resource usage
top -pid $(pgrep ollama)

# Check memory usage
ps aux | grep ollama
```
Network Performance
```bash
# Test network throughput (iperf3 is not preinstalled; install with: brew install iperf3)
iperf3 -s -p 11435                 # On the Mac
iperf3 -c YOUR_MAC_IP -p 11435     # On the client device
```
Integration Examples
Python Client Example
```python
import requests

# Configure the client for LAN access; replace YOUR_MAC_IP with your Mac's IP address
OLLAMA_BASE_URL = "http://YOUR_MAC_IP:11434"

def chat_with_ollama(message, model="llama2"):
    # Send a single, non-streaming chat request to the Ollama API
    response = requests.post(
        f"{OLLAMA_BASE_URL}/api/chat",
        json={
            "model": model,
            "messages": [{"role": "user", "content": message}],
            "stream": False,
        },
    )
    return response.json()

# Test the connection
result = chat_with_ollama("Hello, how are you?")
print(result)
```
JavaScript/Node.js Example
```javascript
const axios = require('axios');

// Replace YOUR_MAC_IP with your Mac's IP address
const OLLAMA_BASE_URL = 'http://YOUR_MAC_IP:11434';

async function chatWithOllama(message, model = 'llama2') {
  try {
    // Send a single, non-streaming chat request to the Ollama API
    const response = await axios.post(`${OLLAMA_BASE_URL}/api/chat`, {
      model: model,
      messages: [{ role: 'user', content: message }],
      stream: false
    });
    return response.data;
  } catch (error) {
    console.error('Error:', error.message);
    return null;
  }
}

// Test the connection
chatWithOllama('Hello, how are you?')
  .then(result => console.log(result));
```
Conclusion
Configuring Ollama for LAN access on macOS comes down to setting the `OLLAMA_HOST` environment variable to `0.0.0.0:11434`. Choose the appropriate method based on how you start Ollama:

- GUI startup: use `launchctl setenv`, with a launch agent for persistence
- Command line startup: set the environment variable in your shell configuration file
Remember to consider security implications and only enable LAN access on trusted networks. With proper configuration, you can easily share your local AI models across all devices in your network.
Have you successfully configured Ollama for LAN access? Share your experience and any additional tips in the comments!