Issues: microsoft/onnxruntime-genai
#1089 · awq example runs into error with llama 3.2 3b due to embedding layer · opened Nov 22, 2024 by tranlm

#1081 · Add how to build from source for Android (and iOS) [platform:mobile] · opened Nov 20, 2024 by Agredo

#1068 · Python Phi3Vision Sample Error when divided into multiple Functions · opened Nov 17, 2024 by nmoeller

#1067 · The onnxruntime-genai.dll (0.5.1.0) crashes when the windows C++ application is closed. [platform:windows] · opened Nov 16, 2024 by luomaojiang2016

#1054 · DirectML Execution Provider Error: Unable to load D3D12Core.dll [ep:DML] · opened Nov 9, 2024 by sjpritchard

#982 · Benchmark script crashes when trying to run Llama 3.2 1b DML model [ep:DML] · opened Oct 14, 2024 by natke

#954 · phi3.5 genai converted model output garbage results with input length around 3000 and 8000. · opened Oct 3, 2024 by yufang67