Running Ollama in Android Linux Terminal
I had this random thought of running Ollama inside the new Android Linux terminal, so I tried running the small gemma3:1b model, and it ran successfully. I also recorded a quick video which you can watch here.
Now, let me document exactly what I did:
1. Install Ollama
I installed Ollama in the terminal by following their official guide for installing on Linux. Basically, I just ran the following command:
curl -fsSL https://ollama.com/install.sh | sh
On the first try, the install failed for unknown reasons, so I increased the disk size using the Disk Resize option under the gear icon in the terminal. After that, running the above install command worked, for whatever reason.
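Since the failure seemed disk-related, a quick free-space check before retrying the installer can confirm the resize actually took effect. A minimal sketch — the 2 GiB threshold is my own assumption, not something from the Ollama docs:

```shell
#!/bin/sh
# Hypothetical pre-install check: verify free space on / before piping
# the Ollama install script to sh. The 2 GiB threshold is an assumption.
avail_kb=$(df -Pk / | awk 'NR==2 {print $4}')
if [ "$avail_kb" -ge 2097152 ]; then
    echo "enough space: ${avail_kb} KiB free"
else
    echo "low space: ${avail_kb} KiB free -- try Disk Resize first"
fi
```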
I then ran the ollama command, and it showed me all the available options, as you can see in the screenshot here.
2. Running the gemma3:1b model
I looked for a lightweight model on the ollama.com/search page and found gemma3:1b to be a suitable one – it was 815MB in size, which was manageable. First, I ran the pull command and then the run command:
ollama pull gemma3:1b
ollama run gemma3:1b
I did not run the run command directly, because the terminal was crashing immediately after I ran it.
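For reference, the 815MB figure matters because the model weights have to fit in the device's memory along with some runtime overhead. A rough fit check — comparing the download size against MemAvailable is my own heuristic, not anything from Ollama:

```shell
#!/bin/sh
# Hypothetical fit check: compare a model's download size against
# available RAM from /proc/meminfo (Linux-specific). Actual runtime
# usage is higher than the download size, so this is only a rough guide.
model_mb=815   # gemma3:1b size reported on ollama.com
avail_mb=$(awk '/MemAvailable/ {print int($2/1024)}' /proc/meminfo)
if [ "$avail_mb" -gt "$model_mb" ]; then
    echo "model likely fits (${avail_mb} MB available)"
else
    echo "model may not fit (${avail_mb} MB available)"
fi
```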
It took multiple tries; I had to run the following commands and then close and reopen the app several times:
sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama
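One way to avoid launching ollama run before the service is actually up is to wait until Ollama's API port (11434 by default) accepts connections. A rough bash sketch — the retry count is arbitrary, and /dev/tcp is a bash-only feature:

```shell
#!/bin/bash
# Hypothetical readiness check: poll Ollama's default API port (11434)
# before starting an interactive `ollama run` session.
wait_for_ollama() {
    tries=${1:-10}
    for _ in $(seq "$tries"); do
        # /dev/tcp is a bash feature; the connect fails if nothing listens
        if (exec 3<>/dev/tcp/127.0.0.1/11434) 2>/dev/null; then
            echo up
            return 0
        fi
        sleep 1
    done
    echo down
}
wait_for_ollama 3
```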
And then it finally started running...
droid@localhost:~$ ollama run gemma3:1b
>>> hi
Hi there! How's your day going so far?
Is there anything you'd like to chat about or any help I can offer?
I said hi, to which it replied as you can see above, but the terminal crashed immediately after. I tried opening it again 2-3 times, but it didn't open. I had to erase all terminal data and reinstall Ollama, but then it crashed again after just a hi.
Clearly, the Android Linux terminal is not very stable at the moment, but it's a good start. I am very hopeful about this, and I'm sure it will only improve in the future.
Update:
I got to experiment some more, and now Ollama runs properly with models like qwen2.5:0.5b and qwen2.5-coder:0.5b, as they are significantly smaller than the 1b models I was trying to run earlier. I have also recorded a new video for this:
I asked a bunch of questions, and the terminal is not crashing this time.