
ollama._types.ResponseError #305

Open
heyuqi1970 opened this issue Nov 19, 2024 · 1 comment
@heyuqi1970
Running the ollama demo on Windows 10, with ollama==0.4.2 and python==3.9.

Error output:

INFO:lightrag:[Entity Extraction]...
INFO:httpx:HTTP Request: POST http://127.0.0.1:11434/api/chat "HTTP/1.1 200 OK"
[the same "200 OK" request line repeated 34 more times]
INFO:httpx:HTTP Request: POST http://127.0.0.1:11434/api/chat "HTTP/1.1 503 Service Unavailable"
INFO:lightrag:Writing graph with 0 nodes, 0 edges
Traceback (most recent call last):
  File "C:\Users\heyuqi\dev\rag\LightRAG\examples\lightrag_ollama_demo.py", line 35, in <module>
    rag.insert(f.read())
  File "C:\Users\heyuqi\dev\rag\LightRAG\lightrag\lightrag.py", line 227, in insert
    return loop.run_until_complete(self.ainsert(string_or_strings))
  File "C:\Users\heyuqi\AppData\Local\Programs\Python\Python39\lib\asyncio\base_events.py", line 647, in run_until_complete
    return future.result()
  File "C:\Users\heyuqi\dev\rag\LightRAG\lightrag\lightrag.py", line 276, in ainsert
    maybe_new_kg = await extract_entities(
  File "C:\Users\heyuqi\dev\rag\LightRAG\lightrag\operate.py", line 333, in extract_entities
    results = await asyncio.gather(
  File "C:\Users\heyuqi\dev\rag\LightRAG\lightrag\operate.py", line 276, in _process_single_content
    glean_result = await use_llm_func(continue_prompt, history_messages=history)
  File "C:\Users\heyuqi\dev\rag\LightRAG\lightrag\utils.py", line 89, in wait_func
    result = await func(*args, **kwargs)
  File "C:\Users\heyuqi\dev\rag\LightRAG\lightrag\llm.py", line 523, in ollama_model_complete
    return await ollama_model_if_cache(
  File "C:\Users\heyuqi\dev\rag\LightRAG\lightrag\llm.py", line 319, in ollama_model_if_cache
    response = await ollama_client.chat(model=model, messages=messages, **kwargs)
  File "C:\Users\heyuqi\dev\rag\venv\lib\site-packages\ollama\_client.py", line 654, in chat
    return await self._request_stream(
  File "C:\Users\heyuqi\dev\rag\venv\lib\site-packages\ollama\_client.py", line 518, in _request_stream
    response = await self._request(*args, **kwargs)
  File "C:\Users\heyuqi\dev\rag\venv\lib\site-packages\ollama\_client.py", line 488, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError
@penghd
penghd commented Nov 19, 2024

This happens when no relationships can be extracted — try qwen2.5:14b.
If that still doesn't work, set the context size to 12800.
And if you don't have enough VRAM, just buy a 4090 😊
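For reference, the suggested context size can be passed per request via the Python client's `options` argument (e.g. `options={"num_ctx": 12800}` on `chat()`), or baked into a model variant with a Modelfile — a sketch, where `qwen2.5-12800` is a name chosen here, not from the thread:

```
FROM qwen2.5:14b
PARAMETER num_ctx 12800
```

Saved as `Modelfile`, this is registered with `ollama create qwen2.5-12800 -f Modelfile`, after which the new model name can be used in place of the original.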
