
How to use multiple GPUs? #72

Open
phanxlee opened this issue Dec 28, 2024 · 1 comment

Comments

@phanxlee

When I run multiple jobs, only one GPU card is used.

python3 tools/infer_e2e.py --img_path=/home/cx/test.png

How can I make it use multiple GPU cards?

[root@localhost OpenOCR]# nvidia-smi
Sat Dec 28 21:24:36 2024
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.216.03             Driver Version: 535.216.03   CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  Tesla V100S-PCIE-32GB          Off | 00000000:3B:00.0 Off |                    0 |
| N/A   42C    P0              59W / 250W |   4014MiB / 32768MiB |     15%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
|   1  Tesla V100S-PCIE-32GB          Off | 00000000:AF:00.0 Off |                    0 |
| N/A   29C    P0              25W / 250W |      4MiB / 32768MiB |      0%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                             |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|    0   N/A  N/A    213858      C   python3                                     668MiB |
|    0   N/A  N/A    213869      C   python3                                     668MiB |
|    0   N/A  N/A    213986      C   python3                                     668MiB |
|    0   N/A  N/A    214050      C   python3                                     668MiB |
|    0   N/A  N/A    214114      C   python3                                     668MiB |
|    0   N/A  N/A    214178      C   python3                                     668MiB |
+---------------------------------------------------------------------------------------+

@Topdu
Owner

Topdu commented Jan 2, 2025

The command you're using currently supports only single-card inference. If you have a large number of text images to recognize, you can manually split them into several batches and assign each batch to a different card for inference, which gives you multi-card processing.
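For example, here is a minimal sketch of that manual split (not a built-in OpenOCR feature): it copies the images into one chunk per card and launches a separate tools/infer_e2e.py process per GPU, pinned with CUDA_VISIBLE_DEVICES. The directory paths and the NUM_GPUS value are placeholders, and it assumes --img_path accepts a directory of images; if the script only takes a single file, loop over the files inside each worker instead.

# Hedged sketch: split images into per-GPU chunks and run one
# tools/infer_e2e.py worker per card via CUDA_VISIBLE_DEVICES.
import os
import shutil
import subprocess
from pathlib import Path

SRC_DIR = Path('/home/cx/images')   # placeholder: directory with input images
WORK_DIR = Path('/home/cx/chunks')  # placeholder: scratch directory for per-GPU chunks
NUM_GPUS = 2                        # two V100S cards in the nvidia-smi output above

images = sorted(p for p in SRC_DIR.iterdir()
                if p.suffix.lower() in {'.png', '.jpg', '.jpeg'})

procs = []
for gpu_id in range(NUM_GPUS):
    chunk_dir = WORK_DIR / f'gpu{gpu_id}'
    chunk_dir.mkdir(parents=True, exist_ok=True)
    # Round-robin assignment: image i goes to GPU (i % NUM_GPUS).
    for img in images[gpu_id::NUM_GPUS]:
        shutil.copy(img, chunk_dir / img.name)

    env = os.environ.copy()
    env['CUDA_VISIBLE_DEVICES'] = str(gpu_id)  # pin this worker to one card
    procs.append(subprocess.Popen(
        ['python3', 'tools/infer_e2e.py', f'--img_path={chunk_dir}'],
        env=env))

for p in procs:  # wait for all per-GPU workers to finish
    p.wait()

Run this way, nvidia-smi should show python3 workers spread across both cards instead of all of them sitting on GPU 0.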
