_extract_past_from_model_output() got an unexpected keyword argument 'standardize_cache_format' #160

Open
smartadpole opened this issue Aug 28, 2024 · 0 comments

@smartadpole

Starting to load the model configuration
No sentence-transformers model found with name GanymedeNil/text2vec-base-chinese. Creating a new one with MEAN pooling.
/media/hao/work/LIB/python/webLLM/lib/python3.8/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
No sentence-transformers model found with name GanymedeNil/text2vec-base-chinese. Creating a new one with MEAN pooling.
Loading checkpoint shards: 100%|██████████| 7/7 [00:08<00:00,  1.28s/it]
Model configuration loaded successfully
Backend TkAgg is interactive backend. Turning interactive mode on.
Error while loading the model: _extract_past_from_model_output() got an unexpected keyword argument 'standardize_cache_format'
Running on local URL:  http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
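
For context: recent transformers releases (around v4.44, which matches the v4.45 deprecation warning in the log above) removed the `standardize_cache_format` keyword from the private helper `GenerationMixin._extract_past_from_model_output()`. Models that ship older remote/custom modeling code and still pass that keyword from their `_update_model_kwargs_for_generation` override then fail with exactly this TypeError. That is an assumption about the root cause here, not a confirmed diagnosis. Below is a minimal sketch to check which situation applies; the workaround suggestions printed at the end (pinning transformers, or updating the model's custom code) are likewise assumptions, not a verified fix for this project.

```python
# Hedged diagnostic sketch: does the installed transformers still accept
# `standardize_cache_format` in GenerationMixin._extract_past_from_model_output?
# Assumption: the failing call originates in the model's remote/custom
# modeling code, not in this repository itself.
import inspect

import transformers
from transformers.generation.utils import GenerationMixin

print("transformers version:", transformers.__version__)

fn = getattr(GenerationMixin, "_extract_past_from_model_output", None)
if fn is None:
    # The private helper was removed entirely in this transformers version.
    print("_extract_past_from_model_output no longer exists in this version.")
elif "standardize_cache_format" in inspect.signature(fn).parameters:
    # The keyword is still accepted, so the error must come from elsewhere.
    print("This transformers version still accepts standardize_cache_format.")
else:
    # The keyword was removed: either pin an older transformers
    # (e.g. pip install "transformers<4.44") or update the model's
    # remote/custom modeling code to match the new signature.
    print("standardize_cache_format was removed in this transformers version.")
```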