Using custom inference script and models from Hub #102

Open

Tarun02 opened this issue Sep 21, 2023 · 1 comment

Comments

Tarun02 commented Sep 21, 2023

Hello,

Can we use a custom inference script without first downloading the model to S3? The linked documentation says that we need to download the model artifacts and push them to S3 before using a custom inference script.

Depending on the model size, this adds considerable overhead.
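For context, when no custom script is involved, a Hub model can be deployed straight from the Hugging Face Hub via the `HF_MODEL_ID` / `HF_TASK` environment variables, with no S3 upload. A minimal sketch with the SageMaker Python SDK follows; the role ARN, model id, instance type, and framework versions are placeholders, not values from this issue:

```python
# Sketch: deploy a Hub model directly, without repackaging artifacts to S3.
# Role ARN, model id, and versions below are assumptions for illustration.
from sagemaker.huggingface import HuggingFaceModel

hub_env = {
    "HF_MODEL_ID": "sentence-transformers/all-MiniLM-L6-v2",  # assumed example model
    "HF_TASK": "feature-extraction",
}

model = HuggingFaceModel(
    env=hub_env,
    role="arn:aws:iam::111122223333:role/SageMakerExecutionRole",  # placeholder
    transformers_version="4.26",  # assumed; use versions supported in your region
    pytorch_version="1.13",
    py_version="py39",
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
)

print(predictor.predict({"inputs": "A sentence to embed."}))
```

The question in this issue is whether the same Hub-only path can be kept once a custom inference script is needed, instead of repackaging the model into a `model.tar.gz` on S3.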

babeal commented Oct 13, 2023

+1. If I have to download the weights and repackage them, that largely defeats the purpose of this library. The "feature-extraction" task for embedding models returns the last hidden state, which is missing a pooling step, so the only option is to repackage everything. I can't even add a code/inference.py file to the Hugging Face repository, because FILE_LIST_NAMES prevents it from being downloaded.
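For reference, the missing step described above is typically supplied in a custom code/inference.py. A rough sketch is below, using the standard SageMaker `model_fn`/`predict_fn` hook names; the mean-pooling logic and input handling are illustrative assumptions, not the toolkit's built-in behavior:

```python
# code/inference.py - sketch of a custom handler that adds pooling on top of
# raw feature extraction. Pooling choice and input format are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer


def model_fn(model_dir):
    # model_dir is the directory with the unpacked model artifacts in the container
    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model = AutoModel.from_pretrained(model_dir)
    model.eval()
    return model, tokenizer


def predict_fn(data, model_and_tokenizer):
    model, tokenizer = model_and_tokenizer
    texts = data["inputs"] if isinstance(data, dict) else data

    encoded = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**encoded)

    # Mean-pool the last hidden state over non-padding tokens to produce one
    # embedding per input (the step missing from plain feature-extraction output).
    mask = encoded["attention_mask"].unsqueeze(-1).float()
    summed = (outputs.last_hidden_state * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)
    embeddings = summed / counts

    return {"embeddings": embeddings.tolist()}
```

The sticking point raised in this issue is that such a file currently has to be shipped inside a repackaged model.tar.gz on S3, since the toolkit will not pull code/inference.py from the Hub repository itself.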
