tl;dr
Can someone give me step-by-step instructions (ELI5) on how to access the LLMs on my rig from my phone?
Jan seems the easiest, but I've also tried Ollama, LibreChat, etc.
…
I've taken steps to secure my data and now I'm going the self-hosting route. I don't care to become a savant with the technical aspects of this stuff, but even the basics are hard to grasp! I've been able to install an LLM provider on my rig (Ollama, LibreChat, Jan, all of them) and I can successfully get models running on them. BUT what I would LOVE to do is access the LLMs on my rig from my phone while I'm within proximity. I've read that I can do that via wifi or LAN or something like that, but I have had absolutely no luck. Jan seems the easiest because all you have to do is something with an API key, but I can't even figure that out.
Any help?
What OS is your server running? Do you have an Android phone or an iPhone?
In either case all you likely need to do is expose the port and then access your server by IP on that port with an appropriate client.
In Ollama you can expose the port to your local network by changing the bind address from 127.0.0.1 to 0.0.0.0
Regarding clients: on iOS you can use Enchanted or Apollo to connect to Ollama.
On Android there are likely comparable apps.
Server is my rig, which is running Windows. Phone is an iPhone.
Exposing the port is something I've tried to do in the past with no success! When you say "change the bind address", do I do that in Windows Defender Firewall, in the inbound rules section?
I believe you set env vars on Windows through System Properties -> Advanced -> Environment Variables.
I believe you just need to set the env var OLLAMA_HOST to 0.0.0.0:11434 and then restart Ollama. "Env var" means "environment variable", which is information that's available to all programs you run.
In Linux these are used to set path info for your package manager, shell preferences, a bunch of stuff.
In Windows it's the same; you just need to look up how to set env vars in Windows.
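For example, from an elevated Command Prompt on the Windows machine, something like this untested sketch should handle both the env var and the firewall question you raised (11434 is Ollama's default port, and the rule name "Ollama LAN" is just a label I made up):

    rem Make Ollama listen on all network interfaces instead of just localhost
    setx OLLAMA_HOST "0.0.0.0:11434"

    rem Allow inbound connections to that port through Windows Defender Firewall
    netsh advfirewall firewall add rule name="Ollama LAN" dir=in action=allow protocol=TCP localport=11434

After that, quit and restart Ollama so it picks up the new variable.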
https://github.com/ollama/ollama/issues/703
@[email protected]
When on your wifi, try navigating in your browser to your Windows computer's address with a colon and the port 11434 at the end. It would look something like this:
http://192.168.xx.xx:11434/
If it works, your browser will just load the text: Ollama is running
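If you'd rather test from a terminal, curl ships with recent Windows and can do the same check, plus hit the API directly. Rough sketch (swap your real address in for the xx.xx part; "llama3" is just a stand-in for whatever model you've actually pulled):

    rem Should print "Ollama is running" if the port is reachable
    curl http://192.168.xx.xx:11434/

    rem Ask the API for a completion directly
    curl http://192.168.xx.xx:11434/api/generate -d "{\"model\": \"llama3\", \"prompt\": \"Hello\", \"stream\": false}"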
From there you just need to figure out how you want to interact with it. I personally pair it with Open WebUI for the web interface.
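If you end up trying Open WebUI and have Docker available, the quick start is roughly a one-liner like this (a sketch from memory, not gospel; OLLAMA_BASE_URL just has to point at your exposed Ollama instance, and replace xx.xx as before):

    rem Run Open WebUI and point it at the Ollama server exposed earlier
    docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=http://192.168.xx.xx:11434 -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main

Then the web interface is at http://192.168.xx.xx:3000 from anything on your network, including the phone.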