
codepilot.el

AI-powered code completion for Emacs

Codepilot.el provides AI-based code completion for Emacs, similar to copilot.el and GitHub Copilot, but using open-source language models.

Demo: demo.gif

Getting started

Example configuration

straight.el
(use-package codepilot
  :straight (:host github :repo "ludvb/codepilot.el" :files ("*.el"))
  :hook ((prog-mode . codepilot-mode)
         (text-mode . codepilot-mode))

  :bind (:map evil-insert-state-map
              ("M-j" . 'codepilot-accept-line)
              ("M-l" . 'codepilot-accept-char)
              ("M-w" . 'codepilot-accept-word)
              ("M-<return>" . 'codepilot-accept-all)
              ("M-J" . 'codepilot-complete-next)
              ("M-K" . 'codepilot-complete-previous))

  :config
  ;; Configure one backend. With both forms below evaluated as written,
  ;; the later Ollama settings take effect.
  ;; For llama.cpp backend:
  (require 'codepilot-llamacpp)
  (setq codepilot-backend 'llamacpp)
  (setq codepilot-llamacpp-address "http://localhost:8080")

  ;; For Ollama backend:
  (require 'codepilot-ollama)
  (setq codepilot-backend 'ollama)
  (setq codepilot-ollama-address "http://localhost:11434")
  ;; Optionally, start Ollama server from emacs:
  (require 'codepilot-ollama-server)
  (codepilot-ollama-start))
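Note that the bindings above live in evil-insert-state-map and therefore assume evil. If you do not use evil, the same commands can be bound in any keymap; a minimal sketch using user-reserved global bindings (the key choices here are just examples):

(global-set-key (kbd "C-c p l") #'codepilot-accept-line)
(global-set-key (kbd "C-c p c") #'codepilot-accept-char)
(global-set-key (kbd "C-c p w") #'codepilot-accept-word)
(global-set-key (kbd "C-c p RET") #'codepilot-accept-all)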
Doom Emacs
;; packages.el
(package! codepilot
  :recipe (:host github :repo "ludvb/codepilot.el" :files ("*.el")))
;; config.el
(use-package! codepilot
  :hook ((prog-mode . codepilot-mode)
         (text-mode . codepilot-mode))

  :bind (:map evil-insert-state-map
              ("M-j" . 'codepilot-accept-line)
              ("M-l" . 'codepilot-accept-char)
              ("M-w" . 'codepilot-accept-word)
              ("M-<return>" . 'codepilot-accept-all)
              ("M-J" . 'codepilot-complete-next)
              ("M-K" . 'codepilot-complete-previous))

  :config
  ;; Configure one backend. With both forms below evaluated as written,
  ;; the later Ollama settings take effect.
  ;; For llama.cpp backend:
  (require 'codepilot-llamacpp)
  (setq codepilot-backend 'llamacpp)
  (setq codepilot-llamacpp-address "http://localhost:8080")

  ;; For Ollama backend:
  (require 'codepilot-ollama)
  (setq codepilot-backend 'ollama)
  (setq codepilot-ollama-address "http://localhost:11434")
  ;; Optionally, start Ollama server from emacs:
  (require 'codepilot-ollama-server)
  (codepilot-ollama-start))

Backends

Codepilot.el requires one of the following backends, which must be installed separately:

llama.cpp

Requirements: a running llama.cpp server, reachable at the address set in codepilot-llamacpp-address (http://localhost:8080 by default).

The default prompt is designed for the DeepSeek Coder base models, which can be found on Hugging Face. If you are using another model, you may need to customize codepilot-prompt-fun and codepilot-postprocess-fun.
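For illustration, a model that expects StarCoder-style fill-in-the-middle tokens might be accommodated along the following lines. This is only a hedged sketch: the calling conventions of codepilot-prompt-fun (assumed here to receive the text before and after point) and codepilot-postprocess-fun (assumed to receive the raw completion string) are assumptions, so check the package source before relying on it.

;; Hypothetical sketch only: the argument lists below are assumptions,
;; not the documented interface of codepilot.el.
(require 'subr-x)

(setq codepilot-prompt-fun
      (lambda (prefix suffix)
        ;; Build a StarCoder-style fill-in-the-middle prompt.
        (concat "<fim_prefix>" prefix
                "<fim_suffix>" suffix
                "<fim_middle>")))

(setq codepilot-postprocess-fun
      (lambda (completion)
        ;; Drop an end-of-text marker the model may append.
        (string-remove-suffix "<|endoftext|>" completion)))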

Ollama

Requirements: a local Ollama installation, reachable at the address set in codepilot-ollama-address (http://localhost:11434 by default).

By default, the Ollama backend uses the deepseek-coder:6.7b-base model, which needs to be pulled before use by running

ollama pull deepseek-coder:6.7b-base

To use another model, customize codepilot-ollama-model; you may also need to adjust codepilot-prompt-fun and codepilot-postprocess-fun.
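For example, to try a smaller DeepSeek Coder variant (pull it first with ollama pull deepseek-coder:1.3b-base):

;; Example: point the Ollama backend at a smaller DeepSeek Coder variant.
(setq codepilot-ollama-model "deepseek-coder:1.3b-base")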

Contributing

Contributions are always welcome! Feel free to open an issue if you have found a bug or have a feature request, or submit a PR that implements a proposed change.
