
CUDA out of memory #83

Open
suriya030 opened this issue Apr 16, 2023 · 2 comments

Comments

@suriya030

While running the pretrained model code on Colab or on a local device, I get a "CUDA out of memory" error.

@suriya030
Author

I even tried increasing the attention stride to 4. At most I can run inference on Google Colab for 2 images without completely filling the GPU.
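As a rough sanity check of why resolution dominates here, one can estimate the size of a single attention map. The model below is an illustrative assumption (row-wise self-attention producing one H × (W/s) × (W/s) float32 map, as in stereo transformers generally), not a measurement of this repo:

```python
# Back-of-envelope estimate of one attention map's memory for row-wise
# self-attention. All constants are illustrative assumptions, not
# measurements of this repository's implementation.

def attention_map_bytes(height, width, stride, bytes_per_float=4):
    """One height x (width/stride) x (width/stride) float32 map."""
    w = width // stride
    return height * w * w * bytes_per_float

# KITTI-sized input (~376 x 1242) vs. a 1080p image, both at stride 3:
kitti = attention_map_bytes(376, 1242, stride=3)
hires = attention_map_bytes(1080, 1920, stride=3)
print(f"KITTI:    {kitti / 1e9:.2f} GB per map")
print(f"High-res: {hires / 1e9:.2f} GB per map")
```

Under these assumptions the 1080p map is several times larger than the KITTI one, since memory grows quadratically in width and linearly in height, which is consistent with OOM on non-KITTI images.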

@mli0603
Owner

mli0603 commented Apr 18, 2023

Hi @suriya030

The memory consumption depends on various factors. Most likely your input images are too high resolution. I assume you are not using the default KITTI images.

To run on high res images, you can

  • Increase the attention stride, as you already did. Higher strides may help further.
  • Use the lightweight implementation in the branch here
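Since high input resolution is the likely cause, another common workaround is to downsample the images before inference and rescale the predicted disparity afterwards. A minimal PyTorch sketch, assuming the model is called as `model(left, right)` on NCHW tensors (a placeholder for this repo's actual API) and that disparity values scale linearly with image width:

```python
import torch
import torch.nn.functional as F

def infer_downsampled(model, left, right, scale=0.5):
    """Run stereo inference at reduced resolution to save GPU memory.

    `model(left, right)` is a placeholder for the repo's actual call;
    the returned disparity is upsampled back to the input size and its
    magnitude corrected, since disparity scales with image width.
    """
    with torch.no_grad():  # skip autograd buffers during inference
        small_l = F.interpolate(left, scale_factor=scale,
                                mode="bilinear", align_corners=False)
        small_r = F.interpolate(right, scale_factor=scale,
                                mode="bilinear", align_corners=False)
        disp = model(small_l, small_r)
        # Restore full resolution and rescale disparity values.
        disp = F.interpolate(disp, size=left.shape[-2:],
                             mode="bilinear", align_corners=False) / scale
    return disp
```

Halving each spatial dimension roughly quarters activation memory (and cuts attention-map memory even more, given its quadratic growth in width), at the cost of some fine-detail accuracy.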
