AMD GPU Linux #427
-
Try the GPT4All binding.
-
Hi, and thanks for sharing this. The audio starts talking as soon as the WebUI receives a sentence, before the backend has finished generating the full answer. I wonder if the wheel you installed has modifications that break streaming! In the LoLLMs ExLlamaV2 binding I activate stream mode, which allows me to read the outputs one by one and then display them. Can you share the ROCm wheel address? I can add it to the installer so that other people can use it from the start.
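As a rough illustration of what that stream mode amounts to, here is a sketch based on ExLlamaV2's own streaming examples rather than LoLLMs code; the model path and sampler settings are placeholders, and method names can differ slightly between ExLlamaV2 versions:

```python
# Sketch of token-by-token streaming with ExLlamaV2 (placeholder model path
# and settings; exact API details may vary between versions).
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2StreamingGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/your/exl2-model"  # placeholder
config.prepare()

model = ExLlamaV2(config)
model.load()
cache = ExLlamaV2Cache(model)
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2StreamingGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8

input_ids = tokenizer.encode("Hello, my name is")
generator.begin_stream(input_ids, settings)

# Each stream() call returns the next decoded chunk, so the caller can show
# (or speak) text as it arrives instead of waiting for the whole generation.
while True:
    chunk, eos, _ = generator.stream()
    print(chunk, end="", flush=True)
    if eos:
        break
```

The relevant point is that each chunk is available immediately, which is what lets the WebUI hand sentences to the audio as they arrive rather than only after generation finishes.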
-
Hi, I have taken two screen recordings to show what I mean, since I'm not the best at explaining things! In the lollms_1 video you will see that it takes some time to run before anything is output; in lollms_2 you will see what happens when I stop the generation and it prints the output. I have included my terminal windows so that you can see the token generation etc. The ExLlamaV2 wheel I downloaded is from the following link: https://github.com/turboderp/exllamav2/releases/download/v0.0.7/exllamav2-0.0.7+rocm5.6-cp310-cp310-linux_x86_64.whl Thank you
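As a small aside, not part of the original reply: a quick way to confirm that this wheel is the one actually loaded by the WebUI's Python 3.10 environment (assuming it was pip-installed into that same environment) might look like this:

```python
# Sanity check that the exllamav2 ROCm wheel is installed in the interpreter
# the WebUI runs (assumes it was pip-installed into that environment).
import sys
from importlib.metadata import version

print(sys.version)           # should be 3.10.x to match the cp310 wheel tag
print(version("exllamav2"))  # should report something like 0.0.7+rocm5.6
```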
-
I'm having difficulty understanding whether it is possible to install lollms-webui on Linux with an AMD GPU and ROCm. I am aware that certain bindings work with AMD GPUs; however, I am unclear on how to make it all work together: the settings screen does not detect my AMD GPU, and when I install the bindings they all install NVIDIA-specific packages. Can anyone help me? Thank you.
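One hedged suggestion for narrowing this down (assuming a ROCm build of PyTorch is installed in the environment that the torch-based bindings use): on ROCm wheels the AMD GPU is exposed through the regular torch.cuda API, so a quick check from that environment looks like this:

```python
# Quick check that the ROCm build of PyTorch can see the AMD GPU.
# On ROCm wheels, HIP devices are exposed through the usual torch.cuda API.
import torch

print(torch.__version__)          # a ROCm build looks like "2.x.x+rocm5.6"
print(torch.version.hip)          # None on CUDA/CPU builds, a HIP version string on ROCm
print(torch.cuda.is_available())  # True if the AMD GPU is usable
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```

If this prints False or a CUDA-only torch version, the bindings are likely pulling in NVIDIA-specific wheels rather than their ROCm counterparts.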