So I tried the parameters `--model CodeLlama-7B --chat-model CodeGemma-7B`; Tabby downloaded the models and only afterwards checked whether the chat model has a `chat_template`.
It should instead perform that check before starting the download.
Writing to new file.
🎯 Downloaded https://huggingface.co/TheBloke/CodeLlama-7B-GGUF/resolve/main/codellama-7b.Q4_0.gguf to /home/mte90/.tabby/models/TabbyML/CodeLlama-7B/ggml/model-00001-of-00001.gguf.tmp
00:05:34 ▕████████████████████▏ 3.56 GiB/3.56 GiB 10.91 MiB/s ETA 0s. Writing to new file.
🎯 Transferring https://huggingface.co/bartowski/codegemma-7b-GGUF/resolve/main/codegemma-7b-Q4_K_M.gguf
🎯 Downloaded https://huggingface.co/bartowski/codegemma-7b-GGUF/resolve/main/codegemma-7b-Q4_K_M.gguf to /home/mte90/.tabby/models/TabbyML/CodeGemma-7B/ggml/model-00001-of-00001.gguf.tmp
Starting...The application panicked (crashed).
Message: Chat model requires specifying prompt template
Location: /__w/tabby/tabby/crates/llama-cpp-server/src/lib.rs:225
Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
Run with RUST_BACKTRACE=full to include source snippets.
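The requested fix is an ordering change: verify the template requirement up front, before any bytes are fetched. A minimal Rust sketch of that ordering follows; all names here (`ModelSpec`, `has_chat_template`, `prepare_chat_model`) are illustrative assumptions, not Tabby's actual API.

```rust
/// Hypothetical description of a chat model, known before downloading
/// (e.g. from registry metadata rather than the weights themselves).
pub struct ModelSpec {
    pub name: &'static str,
    pub has_chat_template: bool,
}

/// Validate first, download second. Returns an error before any
/// (potentially multi-GiB) transfer starts if the template is missing.
pub fn prepare_chat_model(spec: &ModelSpec) -> Result<String, String> {
    // Up-front check: this is the validation that currently panics
    // only after the download has completed.
    if !spec.has_chat_template {
        return Err(format!(
            "Chat model {} requires specifying prompt template",
            spec.name
        ));
    }
    // Only now would the download begin.
    Ok(format!("downloading {}", spec.name))
}
```

This assumes the registry metadata exposes whether a chat template exists before download, which may or may not match how Tabby's model registry is structured.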