Ollama provides an easy way to host LLMs locally.

Setup guide

1. Environment setup

Start by downloading and installing Ollama from https://ollama.com, then pull the model you want to serve (see the commands below).
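On Linux, for example, you can install Ollama with its official install script (macOS and Windows use installers downloaded from the same site) and then pull the model referenced in the config below:

# Install Ollama via the official install script (Linux)
curl -fsSL https://ollama.com/install.sh | sh

# Pull the model you plan to reference in config.yml
ollama pull llama3.2

# Ollama's API listens on http://localhost:11434 by default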
2. Add model config

Add the model configuration to your config.yml file:

~/.config/oxy/config.yml
...
models:
  - name: llama3.2
    vendor: ollama
    model_ref: llama3.2:latest # or another model of your choice
    api_url: http://localhost:11434
    api_key: # your Ollama API key, if required (a default local install needs none)
...
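To confirm the api_url is reachable and the model has been pulled, you can query Ollama's tags endpoint (part of its standard REST API):

# List locally available models; llama3.2 should appear in the output
curl http://localhost:11434/api/tags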
3. Update agent's model config

Point the agent at the model defined above:

agents/agent.yml
model: llama3.2
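The model value must match the name field from config.yml (llama3.2), not the model_ref. Before running the agent, you can smoke-test the model directly through the Ollama CLI (a quick sanity check, not an oxy command):

# Chat with the model once to confirm it loads and responds
ollama run llama3.2 "Say hello in one sentence."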