
Using existing models on PC #553

Open
nauen opened this issue Aug 4, 2024 Discussed in #370 · 1 comment

nauen commented Aug 4, 2024

Discussed in #370

Originally posted by Greggar September 12, 2023
Hi. I have downloaded a few models in the past. Rather than download them again, is there a directory I can place them in for LOLLMS to access them?

Hey ParisNeo!

I used LOLLMS before and am creating a new setup right now.
I'm struggling to add models manually. (I was not able to find all models from Hugging Face in the search, e.g. https://huggingface.co/dagbs/dolphin-2.8-mistral-7b-v02-GGUF/tree/main.)

First I thought I could just git clone a Hugging Face repo, but I cannot successfully load the model. I'm not sure if the model is recognised correctly.

I'm getting a lot of weird behavior...
I have version 10.1 (warp drive).

I can copy old folders from my old lollms installation into the model folders and they work fine.
The same goes for anything I download through the webui (those models work).

```
p@mchn:/ai/lollms/personal_data/models$ ll
insgesamt 36
drwxrwxr-x 9 p p 4096 Aug 4 18:03 ./
drwxrwxr-x 18 p p 4096 Aug 4 14:34 ../
drwxrwxr-x 3 p p 4096 Aug 4 18:31 awq/
drwxrwxr-x 3 p p 4096 Aug 4 17:27 ggml/
drwxrwxr-x 6 p p 4096 Aug 4 18:19 gguf/
drwxrwxr-x 2 p p 4096 Aug 2 23:39 gptq/
drwxrwxr-x 2 p p 4096 Aug 4 16:49 hugging_face/
drwxrwxr-x 2 p p 4096 Aug 4 18:03 TGI/
drwxrwxr-x 5 p p 4096 Aug 4 17:10 transformers/
p@mchn:/ai/lollms/personal_data/models$
```

I have these folders. The old stuff was in awq.

I downloaded this repo:

```
p@mchn:/dolphin-2.8-mistral-7b-v02$ ll
insgesamt 21660288
drwxrwxr-x 2 p p 4096 Aug 4 16:40 ./
drwxr-x--- 30 p p 4096 Aug 4 18:02 ../
-rw-rw-r-- 1 p p 51 Aug 4 13:50 added_tokens.json
-rw-rw-r-- 1 p p 657 Aug 4 13:50 config.json
-rw-rw-r-- 1 p p 7695875168 Aug 4 16:40 dolphin-2.8-mistral-7b-v02.Q8_0.gguf
-rw-rw-r-- 1 p p 138509 Aug 4 13:50 eval_results.json
-rw-rw-r-- 1 p p 1545 Aug 4 13:50 eval.sh
-rw-rw-r-- 1 p p 137 Aug 4 13:50 generation_config.json
-rw-rw-r-- 1 p p 1519 Aug 4 13:50 .gitattributes
-rw-rw-r-- 1 p p 4943178720 Aug 4 14:05 model-00001-of-00003.safetensors
-rw-rw-r-- 1 p p 4999819336 Aug 4 14:05 model-00002-of-00003.safetensors
-rw-rw-r-- 1 p p 4540532728 Aug 4 14:11 model-00003-of-00003.safetensors
-rw-rw-r-- 1 p p 23950 Aug 4 14:05 model.safetensors.index.json
-rw-rw-r-- 1 p p 7650 Aug 4 14:05 README.md
-rw-rw-r-- 1 p p 443 Aug 4 14:05 special_tokens_map.json
-rw-rw-r-- 1 p p 1675 Aug 4 14:05 tokenizer_config.json
-rw-rw-r-- 1 p p 493443 Aug 4 14:05 tokenizer.model
p@mchn:/dolphin-2.8-mistral-7b-v02$
```

I then placed it in awq or hugging_face.
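For what it's worth, given the folder layout above, a GGUF file might be expected directly under models/gguf/ rather than inside a cloned-repo folder under awq/ or hugging_face/ (this is an assumption; the cloned repo mixes a GGUF file with safetensors shards, which could confuse model detection). A minimal sketch of copying only the .gguf file, using throwaway temp paths so it can be run anywhere; substitute your real repo and models directories:

```shell
# Throwaway stand-ins; in practice SRC would be /dolphin-2.8-mistral-7b-v02
# and MODELS would be /ai/lollms/personal_data/models.
SRC=$(mktemp -d)
MODELS=$(mktemp -d)
mkdir -p "$MODELS/gguf"
touch "$SRC/dolphin-2.8-mistral-7b-v02.Q8_0.gguf"   # simulate the downloaded file

# Copy only the quantized GGUF file into the gguf folder,
# leaving the safetensors shards behind.
cp "$SRC"/*.gguf "$MODELS/gguf/"
ls "$MODELS/gguf"
# prints: dolphin-2.8-mistral-7b-v02.Q8_0.gguf
```

After that, restarting lollms should make the file show up under the gguf binding (again, assuming that is how the binding scans its folder).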

If I restart lollms, there is a new card for the custom model. If I click it, it says it is building the model,
but it never finishes and I cannot accept the settings.

Then I restart lollms, but I get errors when trying to select it or do anything else with it...

What is the difference between downloaded repos and models downloaded through the webui?
I tried installing via "Download from web", but it does not start.
I tried referencing a local file... It creates the reference, but then also cannot load the model.
I feel really stupid and don't understand what's going on xD.

(Three screenshots attached, taken 2024-08-04 at 18:07, 18:07, and 17:30.)
ParisNeo (Owner) commented:

To use pre-existing models, you need to use "add reference" and put in the local path to the model.
