I found the issue: the latest llama.cpp release binaries from https://github.com/ggerganov/llama.cpp have renamed the server binary from server to llama-server.
I tried creating a symlink (ln -s ~/llamanet/build/bin/llama-server ~/llamanet/build/bin/server), which got the process to spawn, but it segfaulted on the test payload.
I was able to get things working quickly by downloading a somewhat older build from about a month ago, e.g. https://github.com/ggerganov/llama.cpp/releases/tag/b3091
Hopefully @cocktailpeanut will update llamanet soon, but in the meantime this is a straightforward workaround.
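The symlink part of the workaround above can be sketched as a small shell snippet. The ~/llamanet/build/bin path is an assumption taken from the spawn log in this thread, so adjust it if your install lives elsewhere; and note that for the commenter this only got the server spawning (it then segfaulted), so pinning an older release like b3091 was the reliable fix here.

```shell
# Hedged sketch of the symlink workaround, not an official fix.
# Recent llama.cpp releases renamed the `server` binary to `llama-server`;
# this recreates the old name so llamanet can spawn it.
link_server() {
  bin_dir="$1"
  if [ -x "$bin_dir/llama-server" ] && [ ! -e "$bin_dir/server" ]; then
    ln -s "$bin_dir/llama-server" "$bin_dir/server"
  fi
}

# Path assumed from the ENOENT log below; change it if yours differs.
link_server "$HOME/llamanet/build/bin"
```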
Hi, I have tried all of the methods you give for running this, but I get the same error each time no matter what. Below is the first install method you gave, as an example.
I am using a fresh install of Ubuntu 22.04 LTS.
Installed for this project:
NPM [email protected]
Node.js v20.14.0
(I can reach the localhost URL, but if I click anything on that page I also get the error below.)
(Any curl request, etc., gives the same error.)
Any idea what I might be doing wrong? Thank you.
Host terminal::: (note: the models were already downloaded in a previous run, which crashed similarly to the output below)
LLAMANET_DEBUG=true npx llamanet@latest
no valid release
█ llamanet running at http://localhost:42424
[QUICKSTART] Try opening a new terminal and run the following command.
curl --request POST \
  --url http://127.0.0.1:42424/v1/chat/completions \
  --header "Content-Type: application/json" \
  --data '{
    "model": "https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-gguf/resolve/main/Phi-3-mini-4k-instruct-q4.gguf",
    "messages": [
      { "role": "system", "content": "You are a helpful assistant." },
      { "role": "user", "content": "Do aliens exist?" }
    ]
  }'
./server -c 2048 --embeddings -m /home/my/llamanet/models/huggingface/microsoft/Phi-3-mini-4k-instruct-gguf/Phi-3-mini-4k-instruct-q4.gguf --port 8000
node:events:497
throw er; // Unhandled 'error' event
^
Error: spawn ./server ENOENT
at ChildProcess._handle.onexit (node:internal/child_process:286:19)
at onErrorNT (node:internal/child_process:484:16)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21)
Emitted 'error' event on ChildProcess instance at:
at ChildProcess._handle.onexit (node:internal/child_process:292:12)
at onErrorNT (node:internal/child_process:484:16)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
errno: -2,
code: 'ENOENT',
syscall: 'spawn ./server',
path: './server',
spawnargs: [
'-c',
'2048',
'--embeddings',
'-m',
'/home/my/llamanet/models/huggingface/microsoft/Phi-3-mini-4k-instruct-gguf/Phi-3-mini-4k-instruct-q4.gguf',
'--port',
'8000'
]
}
Node.js v20.14.0
client terminal:::
curl --request POST \
  --url http://127.0.0.1:42424/v1/chat/completions \
  --header "Content-Type: application/json" \
  --data '{
    "model": "https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-gguf/resolve/main/Phi-3-mini-4k-instruct-q4.gguf",
    "messages": [
      { "role": "system", "content": "You are a helpful assistant." },
      { "role": "user", "content": "Do aliens exist?" }
    ]
  }'
curl: (52) Empty reply from server
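For what it's worth, curl's (52) "Empty reply from server" on the client side is consistent with the host log above: llamanet accepts the connection on port 42424, but the ./server binary it tries to spawn behind it does not exist. A quick hedged check (path assumed from the spawn log in this thread) to see which binary name your downloaded release actually shipped:

```shell
# On recent llama.cpp releases this typically lists `llama-server`
# and no `server`, matching the spawn ENOENT in the host log.
# Path assumed from the log in this thread; adjust as needed.
ls -l "$HOME/llamanet/build/bin" 2>/dev/null \
  || echo "no llamanet build dir at $HOME/llamanet/build/bin"
```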