
How to do inference with trained model? #113

Open
013292 opened this issue Feb 15, 2024 · 2 comments

Comments


013292 commented Feb 15, 2024

First of all, thank you for sharing the source code.
Following the instructions of glue-factory, we've got the trained model saved as checkpoint_best.tar, which includes the parameters of both SuperPoint and LightGlue. However, the example code in this repo loads the parameters of the extractor and the matcher individually.
Is there any suggestion for closing the gap between these two repos?
Is there a script for loading the trained model for inference?

Thank you :-)

Phil26AT (Collaborator) commented Apr 8, 2024

Hi @013292, great that you got your model running with glue-factory! Indeed, this small step is missing. What you need to do is open the tar (its content is a dict) and extract the "matcher" item. These weights should be directly reusable with LightGlue. You can load the tar with the utils in glue-factory: https://github.com/cvg/glue-factory/blob/1f56839db2242929960d70f85bfac6c19ef2821c/gluefactory/utils/experiments.py#L65-L91
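
For reference, here is a minimal sketch of that step (not from the thread). It assumes the tar stores the model state dict under a "model" key, with parameters prefixed "matcher." and "extractor."; your checkpoint layout may differ depending on the training config, so inspect the dict and adjust the key names if needed.

```python
# Minimal sketch: extract the matcher weights from a glue-factory checkpoint
# and load them into LightGlue from this repo. The "model" key and the
# "matcher." prefix are assumptions about the checkpoint layout; inspect the
# dict and adjust if your training config stores them differently.
import torch
from lightglue import LightGlue

ckpt = torch.load("checkpoint_best.tar", map_location="cpu")
state = ckpt["model"] if "model" in ckpt else ckpt  # assumed layout of the tar

# keep only the matcher weights and strip the "matcher." prefix
matcher_state = {
    k[len("matcher."):]: v for k, v in state.items() if k.startswith("matcher.")
}

matcher = LightGlue(features="superpoint").eval()
matcher.load_state_dict(matcher_state, strict=False)
```

The extractor weights can be split off the same way by filtering on an "extractor." prefix, if you also fine-tuned SuperPoint.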

@aidongmandexiaowowo

Hello, I would also like to use my trained model for inference, but I still don't know how to use the experiments.py file in glue-factory to export the matcher and extractor weights. Looking forward to your reply.
