Issues: containers/ramalama
#491 Document or include additional dependencies - huggingface cli and tqdm (opened Nov 25, 2024 by jarcher)
#489 Ramalama seems misspelled, and I end up mistyping it because of that (opened Nov 25, 2024 by stefwalter)
#488 'ramalama ps' returns an exception on macOS when no container-based LLMs are running (opened Nov 24, 2024 by planetf1)
#482 [packit] Propose downstream failed for release v0.2.0 (opened Nov 22, 2024 by packit-as-a-service bot)
#458 Ramalama container needs updating on quay.io to use the new llama-simple-chat (opened Nov 15, 2024 by bmahabirbu)
#184 Add podman serve --generate compose MODEL, which would generate a docker-compose file for running an AI Model Service (good first issue; opened Sep 24, 2024 by rhatdan). A hypothetical sketch of such a compose file follows this list.
#27 Find a way to automatically build and push x86_64 and aarch64 images (opened Aug 1, 2024 by ericcurtin)
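
A minimal sketch of what the compose file proposed in #184 might look like. This is an illustrative assumption, not actual ramalama output: the service name, image tag, model path, serve port, and the use of llama.cpp's llama-server as the backend command are all placeholders.

    # docker-compose.yml (hypothetical output of the proposed generator)
    services:
      ai-model-service:
        image: quay.io/ramalama/ramalama:latest       # assumed image; see issue #458
        command: llama-server --model /models/MODEL --port 8080
        volumes:
          - ~/.local/share/ramalama/models:/models:ro  # assumed host model store
        ports:
          - "8080:8080"                                # assumed serve port

Running podman-compose up (or docker compose up) against such a file would then expose the model's HTTP endpoint on the mapped port.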
ProTip! Find all open issues with in-progress development work using the linked:pr search qualifier.
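
For example, entering the following in the repository's issue search box restricts results to open issues that have a linked pull request (linked:pr is a standard GitHub search qualifier):

    is:issue is:open linked:pr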