Nix-Darwin Launch Agents
On macOS, a Launch Agent is a per-user background process, managed by launchd, that performs tasks or services on behalf of the user.
Having recently installed `ollama`, I’ve been playing around with various local models.
One annoyance about having installed `ollama` using Nix via nix-darwin is that I need to run `ollama serve` in a terminal session, or else I see something like this:
```sh
❯ ollama list
Error: could not connect to ollama app, is it running?
```
After some code searching, I discovered a method to create a Launch Agent plist for my user using nix-darwin. This allows `ollama serve` to run automatically in the background for my user. Here’s what it looks like:
```nix
{
  description = "DCMBP Darwin system flake";

  inputs = { ... };

  outputs = inputs@{ self, nix-darwin, nixpkgs, home-manager, ... }:
    let
      configuration = { pkgs, ... }: {
        environment.systemPackages = with pkgs; [ ollama ];

        ...

        launchd = {
          user = {
            agents = {
              ollama-serve = {
                command = "${pkgs.ollama}/bin/ollama serve";
                serviceConfig = {
                  KeepAlive = true;
                  RunAtLoad = true;
                  StandardOutPath = "/tmp/ollama_danielcorin.out.log";
                  StandardErrorPath = "/tmp/ollama_danielcorin.err.log";
                };
              };
            };
          };
        };
      };
    in
    {
      ...
    };
}
```
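After updating the flake, rebuilding the system is what actually writes and loads the agent. As a rough sketch (the `DCMBP` configuration name is an assumption here; use whatever name your flake exposes under `darwinConfigurations`):

```sh
# Rebuild and switch to the new generation; nix-darwin then generates the
# Launch Agent plist and loads it for the current user.
# "DCMBP" is assumed -- substitute your own darwinConfigurations name.
darwin-rebuild switch --flake .#DCMBP
```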
This configuration creates a plist file in `/Users/danielcorin/Library/LaunchAgents` that looks like this:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple Computer//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
	<key>KeepAlive</key>
	<true/>
	<key>Label</key>
	<string>org.nixos.ollama-serve</string>
	<key>ProgramArguments</key>
	<array>
		<string>/bin/sh</string>
		<string>-c</string>
		<string>exec /nix/store/nql9lrcn99m34icj20ydm5jjw33pcpcy-ollama-0.1.27/bin/ollama serve</string>
	</array>
	<key>RunAtLoad</key>
	<true/>
	<key>StandardErrorPath</key>
	<string>/tmp/ollama_danielcorin.err.log</string>
	<key>StandardOutPath</key>
	<string>/tmp/ollama_danielcorin.out.log</string>
</dict>
</plist>
```
Now, when I create a new shell session and run `ollama list`, it just works.
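If it doesn’t start right away, a couple of quick checks can confirm the agent is loaded and surface any errors. This is a sketch using the label and log paths from the generated plist above:

```sh
# Confirm launchd knows about the agent (label comes from the plist above)
launchctl list | grep ollama-serve

# Inspect the agent's state in the current user's GUI domain
launchctl print gui/$(id -u)/org.nixos.ollama-serve

# Watch the logs configured via StandardOutPath / StandardErrorPath
tail -f /tmp/ollama_danielcorin.out.log /tmp/ollama_danielcorin.err.log
```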
Here is the code diff where I added this in my nix config.