Inference takes one and a half day to run on a single movie #99

Open
fatihdinc opened this issue Jun 6, 2023 · 0 comments

Comments

@fatihdinc

Hi!

I have been using DeepInterpolation on our large-scale calcium imaging movies, but even with a state-of-the-art GPU (an RTX 3090 Ti), the inference step takes around 1.5 days on a 512x512x50,000 movie. I suspect that inference is not using the GPU by default. Which setting should I change so that it uses the GPU, and can data loading be parallelized with multiprocessing? By contrast, training takes only 1-2 hours with good convergence on the first 2,000 frames, which seems to be sufficient for these movies.
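
For reference, here is a minimal sketch of how I would check whether TensorFlow (which DeepInterpolation runs on) can see the GPU at all; it uses only standard `tf.config` / `tf.test` calls, nothing DeepInterpolation-specific:

```python
# Minimal check: confirm TensorFlow (DeepInterpolation's backend) can see the GPU.
# An empty device list means inference is silently falling back to the CPU,
# which would explain the ~1.5 day runtime.
import tensorflow as tf

print("Built with CUDA:", tf.test.is_built_with_cuda())
gpus = tf.config.list_physical_devices("GPU")
print("Visible GPUs:", gpus)

if not gpus:
    raise SystemExit(
        "No GPU visible to TensorFlow -- check that the installed "
        "CUDA/cuDNN versions match the TensorFlow release."
    )
```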

Also, is there a demo for inference on 1p movies? I was able to find a training script, but not an inference demo. Thank you!
