Describe the bug
I have my own local model served with vLLM, so it uses the OpenAI API format. When I tried it with SmartScraperGraph, I got the error: `TypeError: Completions.create() got an unexpected keyword argument 'model_tokens'`
To Reproduce
```python
import json

from scrapegraphai.graphs import SmartScraperGraph

# Define the configuration for the scraping pipeline
graph_config = {
    "llm": {
        "api_key": "sk-abc",
        "model": "openai/Qwen2.5-7B",
        "base_url": "http://127.0.0.1:8000/v1",
        "model_token": 32768
    },
    "verbose": True,
    "headless": False,
}

# Create the SmartScraperGraph instance
smart_scraper_graph = SmartScraperGraph(
    prompt="Find some information about what does the company do, the name and a contact email.",
    source="https://scrapegraphai.com/",
    config=graph_config
)

# Run the pipeline
result = smart_scraper_graph.run()
print(json.dumps(result, indent=4))
```
Here is the error: `TypeError: Completions.create() got an unexpected keyword argument 'model_tokens'`
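For reference, the same TypeError can be reproduced by calling the OpenAI SDK directly with an extra `model_tokens` keyword, which is my guess at what happens internally (the `chat.completions.create()` signature has no such parameter, so Python rejects the call before any request is sent). This is only a sketch of the suspected cause, not a confirmed trace through the library; the client values are just copied from my config:

```python
# Hedged sketch of the suspected cause: an unsupported keyword argument being
# forwarded to chat.completions.create(). Assumption about ScrapeGraphAI's
# internals, not verified against its source.
from openai import OpenAI

client = OpenAI(api_key="sk-abc", base_url="http://127.0.0.1:8000/v1")

try:
    client.chat.completions.create(
        model="openai/Qwen2.5-7B",
        messages=[{"role": "user", "content": "ping"}],
        model_tokens=32768,  # not a real parameter of the OpenAI SDK
    )
except TypeError as exc:
    # Expected to print something like:
    # Completions.create() got an unexpected keyword argument 'model_tokens'
    print(exc)
```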
Expected behavior
The result is returned as expected.