How much memory is needed to run the example? #66
Comments
Same issue here.
The STTR builds a full-resolution feature map so that it can sparsely sample the feature at different strides, as discussed in Sec. 3.5. Unfortunately, this, together with the attention mechanism, requires a large amount of memory (a rough estimate is sketched after this comment). To avoid this, there are two ways: use the lighter STTR-light model, or downsample/crop the input images.
Let me know if you have more issues!
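As a rough illustration of why full-resolution attention is so memory-hungry, here is a back-of-envelope estimate in Python. It assumes the attention maps are formed along each image row (one W x W matrix per row, stored in fp32); the exact shapes inside STTR may differ, so treat the numbers as order-of-magnitude only.

```python
# Back-of-envelope estimate of attention-map memory (illustrative only).
# Assumption: attention is formed along each image row, i.e. one W x W
# matrix per row in fp32 -- not necessarily the exact STTR layout.

def attention_map_bytes(height, width, stride=1, bytes_per_elem=4):
    """Approximate memory of row-wise attention maps at a given sampling stride."""
    h, w = height // stride, width // stride
    return h * w * w * bytes_per_elem

full = attention_map_bytes(1080, 1920)            # full resolution
half = attention_map_bytes(1080, 1920, stride=2)  # sampled at stride 2

print(f"full res : {full / 1e9:.1f} GB")   # ~15.9 GB for a single map
print(f"stride 2 : {half / 1e9:.1f} GB")   # ~2.0 GB, roughly an 8x reduction
```

Under this assumption, halving the resolution shrinks the attention maps by roughly 8x, which is consistent with the "more than 4 times" savings mentioned further down the thread.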
Thanks for your kind reply. I now have quite large pictures, such as 1920*1080. Even if I use the STTR-light model, I still get an OOM error on a GTX 2080 Ti. Any suggestions, such as resizing the pictures to a smaller size?
Hi @lfxx, yes, downsampling will definitely help. A 2x downsampling will cut the memory consumption by more than 4 times.
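In case it helps, here is a minimal sketch (in PyTorch) of downsampling a stereo pair before inference and mapping the predicted disparity back to the original resolution. The `model(left, right)` call and its `[B, H, W]` disparity output are assumptions made for illustration; the actual STTR inference interface may differ.

```python
import torch
import torch.nn.functional as F

def infer_downsampled(model, left, right, factor=2):
    """Run stereo inference at 1/factor resolution, then rescale the disparity."""
    # Downsample both views; attention memory drops sharply with resolution.
    left_s = F.interpolate(left, scale_factor=1.0 / factor,
                           mode='bilinear', align_corners=False)
    right_s = F.interpolate(right, scale_factor=1.0 / factor,
                            mode='bilinear', align_corners=False)

    with torch.no_grad():                 # inference only, no gradients
        disp_s = model(left_s, right_s)   # assumed to return [B, H/f, W/f] disparity

    # Upsample the disparity back to the original size.
    disp = F.interpolate(disp_s.unsqueeze(1), size=left.shape[-2:],
                         mode='bilinear', align_corners=False).squeeze(1)
    # Disparity is measured in pixels along the width, so its values also
    # need to be scaled back up by the downsampling factor.
    return disp * factor
```

If a 2x downsampling still runs out of memory, a larger factor (or cropping the images) reduces the footprint further.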
I ran the inference example on a 2080 Ti, but ran out of memory.