The command you're using currently supports single-card processing only. If you have a large number of text images to recognize, you can split them into several batches manually and assign each batch to a different card, running one inference process per card to get multi-card processing.
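A minimal sketch of that manual split, assuming one `infer_e2e.py` process per image: the `split_batches` helper and the image paths are illustrative, and each batch is pinned to its card with `CUDA_VISIBLE_DEVICES` so the processes no longer pile onto GPU 0.

```python
import os
import subprocess
from threading import Thread

def split_batches(paths, num_gpus):
    """Round-robin the image list into one batch per GPU."""
    return [paths[i::num_gpus] for i in range(num_gpus)]

def run_batch(gpu_id, batch):
    """Run every job in this batch sequentially, pinned to one card."""
    env = dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu_id))
    for img in batch:
        subprocess.run(
            ["python3", "tools/infer_e2e.py", f"--img_path={img}"],
            env=env, check=True)

if __name__ == "__main__":
    # Hypothetical image list -- replace with your own files.
    images = [f"/home/cx/test_{i}.png" for i in range(8)]
    threads = [Thread(target=run_batch, args=(gpu, batch))
               for gpu, batch in enumerate(split_batches(images, num_gpus=2))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

Each process only sees the card named in its `CUDA_VISIBLE_DEVICES`, so no change to `infer_e2e.py` itself is needed.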
When I run multiple jobs, only one GPU card is used.
python3 tools/infer_e2e.py --img_path=/home/cx/test.png
How to make it use multiple GPU cards?
[root@localhost OpenOCR]# nvidia-smi
Sat Dec 28 21:24:36 2024
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.216.03             Driver Version: 535.216.03   CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  Tesla V100S-PCIE-32GB        Off   | 00000000:3B:00.0 Off |                    0 |
| N/A   42C    P0             59W / 250W  |  4014MiB / 32768MiB  |     15%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
|   1  Tesla V100S-PCIE-32GB        Off   | 00000000:AF:00.0 Off |                    0 |
| N/A   29C    P0             25W / 250W  |     4MiB / 32768MiB  |      0%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|    0   N/A  N/A    213858      C   python3                                     668MiB |
|    0   N/A  N/A    213869      C   python3                                     668MiB |
|    0   N/A  N/A    213986      C   python3                                     668MiB |
|    0   N/A  N/A    214050      C   python3                                     668MiB |
|    0   N/A  N/A    214114      C   python3                                     668MiB |
|    0   N/A  N/A    214178      C   python3                                     668MiB |
+---------------------------------------------------------------------------------------+