Creating a Local LLM Application with Golang

Zhimin Wen
3 min read · Oct 22, 2024

Creating an LLM application is nowadays remarkably easy. Let's spin up a locally running LLM and build a command-line LLM utility with Golang.

Running LLM Model Locally

Download ollama and install it:

curl -LO https://ollama.com/download/ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz

Create a dedicated ollama user and group, and add yourself to the group:

sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)

Create the following systemd service file:

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/bash -c "/usr/bin/ollama serve"
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="OLLAMA_MODELS=/data/ollama/models"

[Install]
WantedBy=default.target

Notice that the models will be stored in a custom directory; make sure /data/ollama/models exists and is writable by the ollama user. Save the file as /etc/systemd/system/ollama.service, then reload systemd, and enable and start the service:

sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama

Now we have the ollama service running on the default port, 11434:

sudo netstat -tnap | grep 11434
tcp 0 0 127.0.0.1:11434 0.0.0.0:* LISTEN 11847/ollama
