llama-cpp

1 entry

2024-08-30

Running Huggingface Models with Llama.cpp and ollama

One challenge I've continued to have is figuring out how to use the models on Huggingface. The Python snippets provided to "run" a model often seem to require a GPU, and installing the various Python dependencies always seems to run into some sort of issue. Today, I learned how...

llama-cpp gguf safetensors ollama
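
For context, here is a minimal sketch of the kind of workflow the post title refers to, assuming the `huggingface_hub` and `llama-cpp-python` packages are installed. The repo and file names are illustrative placeholders, and the post itself may rely on the llama.cpp and ollama command-line tools rather than the Python binding.

```python
# Sketch: fetch a quantized GGUF build of a model from the Hugging Face Hub
# and prompt it locally on CPU via llama-cpp-python.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download a single GGUF file (example repo and filename, assumed).
model_path = hf_hub_download(
    repo_id="TheBloke/Mistral-7B-Instruct-v0.2-GGUF",
    filename="mistral-7b-instruct-v0.2.Q4_K_M.gguf",
)

# Load the model; quantized GGUF weights run on CPU without a GPU.
llm = Llama(model_path=model_path, n_ctx=2048)

output = llm("Q: What is the GGUF format? A:", max_tokens=64, stop=["Q:"])
print(output["choices"][0]["text"])
```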
