
Is it possible to use LLM inference frameworks such as tgi, vllm, or lightllm as the inference backend instead of ollama? #545

Open
Wq-dd opened this issue Oct 14, 2024 · 0 comments


Wq-dd commented Oct 14, 2024

Please Describe The Problem To Be Solved
The project currently uses ollama as its inference backend. It would be useful to also support dedicated LLM serving frameworks such as tgi, vllm, or lightllm as the backend, since these are designed for higher-throughput serving. In scope: letting users point the existing chat/completion workloads at one of these backends. Out of scope: changing which models the project supports.

(Optional): Suggest A Solution
One possible approach is to decouple the backend behind an OpenAI-compatible client interface, as in the sketch after this section. vllm and tgi both ship servers that speak the OpenAI chat-completions protocol, so a single integration point could cover them; lightllm exposes its own HTTP API and would need a thin adapter or a compatible proxy in front of it.

Tradeoffs and caveats:

  • An OpenAI-compatible abstraction keeps the integration small, but backend-specific features (for example ollama's model pulling and management) would stay behind the existing code path.
  • Streaming, tokenization, and sampling-parameter support differ slightly across backends and would need testing per framework.
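Below is a minimal sketch of what such an integration could look like, assuming the project can route chat requests through any OpenAI-compatible endpoint. The base_url, model name, and api_key value are illustrative, not taken from this project: `vllm serve <model>` starts an OpenAI-compatible server on port 8000 by default, and tgi exposes a compatible chat endpoint via its Messages API.

```python
# Sketch: talking to a local vllm server through the OpenAI-compatible API.
# Assumptions (not from this repo): server at localhost:8000, model name below.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # vllm's OpenAI-compatible server
    api_key="EMPTY",  # vllm accepts any key unless started with --api-key
)

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",  # example model; use whatever was served
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

Since ollama itself also exposes an OpenAI-compatible endpoint under /v1, the backend choice could reduce to a configurable base URL and model name rather than separate per-backend clients.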
