# KoboldCpp Vulkan Flake

Run `nix run github:jim3692/koboldcpp-flake`

If `Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf` is not already present in `~/.cache/huggingface`, it downloads the model with `huggingface-cli`, then starts KoboldCpp in Vulkan mode.
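Conceptually, the wrapper the flake runs behaves roughly like the shell sketch below. The Hugging Face repository name (`bartowski/Meta-Llama-3.1-8B-Instruct-GGUF`) and the exact `koboldcpp` flags are assumptions for illustration, not taken from the flake itself.

```sh
#!/usr/bin/env bash
set -euo pipefail

# Assumed names for illustration; the flake may use a different repo or flags.
MODEL_FILE="Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf"
HF_REPO="bartowski/Meta-Llama-3.1-8B-Instruct-GGUF"
CACHE_DIR="$HOME/.cache/huggingface"

# Download the model only if it is not already in the Hugging Face cache.
if ! find "$CACHE_DIR" -name "$MODEL_FILE" 2>/dev/null | grep -q .; then
  huggingface-cli download "$HF_REPO" "$MODEL_FILE"
fi

# Resolve the cached file and start KoboldCpp with the Vulkan backend.
MODEL_PATH="$(find "$CACHE_DIR" -name "$MODEL_FILE" | head -n 1)"
koboldcpp --usevulkan --model "$MODEL_PATH"
```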