Install Ollama for your platform.
Download the appropriate models.
ollama pull granite3.1-dense:8b-instruct-q4_1
ollama pull all-minilm:l6-v2
Start the server if it is not already running.
ollama serve
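Once the server is up, it exposes a REST API on port 11434. As a minimal sketch, this Python snippet builds a non-streaming request for Ollama's /api/generate endpoint using the granite model pulled above (the prompt text is just an example):

```python
import json
import os

# Ollama listens on localhost:11434 unless OLLAMA_HOST says otherwise.
host = os.environ.get("OLLAMA_HOST", "localhost:11434")
url = f"http://{host}/api/generate"

# Body for a single, non-streaming completion against the model pulled above.
payload = json.dumps({
    "model": "granite3.1-dense:8b-instruct-q4_1",
    "prompt": "Summarize what Ollama does in one sentence.",
    "stream": False,
})

print(url)
print(payload)
```

The same body can be POSTed with curl (`curl http://localhost:11434/api/generate -d '<payload>'`); the JSON response carries the completion in its `response` field.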
That’s it for local usage. If you want to run Ollama remotely, continue below.
Check out our configuration file for launching Ollama on SkyPilot: ollama_setup.yaml
sky serve up ollama_setup.yaml
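The ollama_setup.yaml file in the repository is the source of truth; purely for orientation, a SkyPilot service definition that runs Ollama generally looks something like this sketch (field names follow SkyPilot's task YAML; the accelerator choice is an assumption, not taken from the actual file):

```yaml
# Hypothetical sketch, NOT the repo's ollama_setup.yaml.
service:
  readiness_probe: /api/tags   # Ollama answers here once models are listed
  replicas: 1

resources:
  ports: 11434                 # Ollama's default port
  accelerators: L4:1           # assumption: one small GPU; adjust to your quota

run: |
  curl -fsSL https://ollama.com/install.sh | sh
  export OLLAMA_HOST=0.0.0.0:11434   # listen on all interfaces for remote clients
  ollama serve &
  sleep 5
  ollama pull granite3.1-dense:8b-instruct-q4_1
  wait
```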
Set up local environment variables so your client points at the remote server.
On Windows:
setx OLLAMA_HOST <sky-server-ip>:11434
On macOS and Linux:
export OLLAMA_HOST=<sky-server-ip>:11434
On the server itself, setting OLLAMA_HOST=0.0.0.0:11434 before `ollama serve` makes Ollama listen on all interfaces so remote clients can reach it.
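OLLAMA_HOST expects a bare host:port value with no scheme. As a small illustrative sketch (a hypothetical helper, not part of Ollama), this normalizes common mistakes such as a leading http:// or a missing port:

```python
def normalize_ollama_host(value: str, default_port: int = 11434) -> str:
    """Strip an accidental URL scheme and append the default port if missing.

    Hypothetical helper for illustration; Ollama itself expects a bare
    host:port string in OLLAMA_HOST.
    """
    value = value.strip()
    for scheme in ("http://", "https://"):
        if value.startswith(scheme):
            value = value[len(scheme):]
    if ":" not in value:
        value = f"{value}:{default_port}"
    return value

print(normalize_ollama_host("http://203.0.113.5"))   # → 203.0.113.5:11434
print(normalize_ollama_host("203.0.113.5:11434"))    # → 203.0.113.5:11434
```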
Note: If prompted for an API key and none was set up, just leave the input empty.
set llm ollama
tell me <enter prompt>