
Json mode #287

Open
b10902118 opened this issue Nov 16, 2024 · 1 comment

Comments

@b10902118

Currently, the JSON response for keyword extraction is obtained by:

result = await use_model_func(kw_prompt)
json_text = locate_json_string_body_from_string(result)
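For context, this kind of manual extraction typically amounts to regex-scanning the raw completion for a brace-delimited span. A minimal sketch (not the project's actual implementation) that also shows why the approach is fragile:

```python
import json
import re

def locate_json_sketch(text: str):
    """Illustrative re-implementation of manual JSON extraction:
    grab the first {...} span from free-form model output."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        return None
    try:
        return json.loads(match.group(0))
    except json.JSONDecodeError:
        # Brittle: stray braces or trailing prose break the parse.
        return None

# Works when the model wraps clean JSON in chatter...
locate_json_sketch('Sure! Here are the keywords:\n{"keywords": ["graph", "rag"]}')
# ...but a stray closing brace later in the output defeats the greedy regex.
locate_json_sketch('{"a": 1} done}')
```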

But most language models today already support returning responses in JSON format, and manual parsing is clumsy and can be unstable:
https://github.com/ollama/ollama-python/blob/dc38fe467585037e02a4bc27a100cf6cb10202ff/ollama/_client.py#L277
https://github.com/openai/openai-python?tab=readme-ov-file#nested-params
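For illustration, a sketch of what requesting JSON directly could look like with an OpenAI-style client. The key names in the stubbed response are placeholders, and a real integration would pass the option through whichever client the project is configured with (e.g. `response_format` for OpenAI, `format="json"` for Ollama); the stub below exists only so the sketch runs without network access:

```python
import json
from types import SimpleNamespace

def extract_keywords_json_mode(client, model: str, prompt: str) -> dict:
    """Ask the model for JSON output directly (OpenAI-style JSON mode)
    instead of parsing it out of free-form text."""
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},
    )
    return json.loads(resp.choices[0].message.content)

# Stand-in for an OpenAI-style client, only for demonstration.
class StubClient:
    class chat:
        class completions:
            @staticmethod
            def create(model, messages, response_format):
                body = '{"high_level_keywords": ["retrieval"], "low_level_keywords": ["keyword extraction"]}'
                return SimpleNamespace(
                    choices=[SimpleNamespace(message=SimpleNamespace(content=body))]
                )

keywords = extract_keywords_json_mode(StubClient(), "some-model", "Extract keywords as JSON: ...")
```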

If this is desired, I can create a pull request. Thanks.

@LarFii (Collaborator) commented Nov 19, 2024

Thank you very much! We took this approach because some models do not support JSON responses directly, so we opted for a method that, while clumsy, is more universal. However, we would greatly welcome a pull request to add support for models that do provide JSON responses directly.
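One hedged sketch of how such a pull request could keep the universal fallback while using native JSON mode where available. The `supports_json_mode` flag and the pass-through keyword argument are assumptions for illustration, not existing project options:

```python
import asyncio
import json
import re

def _fallback_parse(text: str):
    """Universal path: scrape the first {...} span out of free text."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    return json.loads(match.group(0)) if match else None

async def extract_keywords(use_model_func, kw_prompt: str,
                           supports_json_mode: bool = False):
    if supports_json_mode:
        # Hypothetical pass-through: the model wrapper would forward this
        # as response_format (OpenAI) or format="json" (Ollama).
        result = await use_model_func(
            kw_prompt, response_format={"type": "json_object"}
        )
        return json.loads(result)
    result = await use_model_func(kw_prompt)
    return _fallback_parse(result)

# Demo with a stub async model function standing in for use_model_func.
async def stub_model(prompt: str, **kwargs):
    return '{"keywords": ["json mode"]}'

parsed = asyncio.run(
    extract_keywords(stub_model, "Extract keywords", supports_json_mode=True)
)
```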
