This project implements a virtual zoom system controlled by hand gestures. Using OpenCV and cvzone, it tracks both hands in real time and dynamically zooms an image in or out based on the distance between them.
- Hand gesture-based zooming
- Real-time hand tracking using OpenCV and cvzone
- Smooth zooming effect with distance measurement
- Works with any webcam
```bash
git clone https://github.com/yourusername/AI-Virtual-Zoom-Gesture.git
cd AI-Virtual-Zoom-Gesture
```
Ensure you have Python installed, then run:
```bash
pip install opencv-python cvzone numpy
```
Run the script:
```bash
python AIVirtualZoomGesture.py
```
- Start the script → Webcam turns on.
- Show both hands with index & middle fingers up → Zoom starts.
- Move hands closer → Zooms in.
- Move hands apart → Zooms out.
- Release hands → Resets the zoom.
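For reference, the two-hand gesture check can be expressed with cvzone's `fingersUp()` helper, which returns one flag per finger in the order thumb, index, middle, ring, pinky. The snippet below is a minimal illustration, not code taken from `AIVirtualZoomGesture.py`:

```python
import cv2
from cvzone.HandTrackingModule import HandDetector

cap = cv2.VideoCapture(0)
detector = HandDetector(detectionCon=0.8, maxHands=2)

while True:
    success, img = cap.read()
    if not success:
        break
    hands, img = detector.findHands(img)
    if len(hands) == 2:
        # [0, 1, 1, 0, 0] means only the index & middle fingers are raised
        if all(detector.fingersUp(h) == [0, 1, 1, 0, 0] for h in hands):
            cv2.putText(img, "Zoom gesture active", (20, 50),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("Gesture check", img)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
```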
- Hand tracking: Detects hands using `cvzone.HandTrackingModule`.
- Distance measurement: Calculates the distance between hand centers.
- Dynamic image resizing: Resizes the image based on hand distance.
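The three steps can be combined into a loop like the one below. This is a minimal sketch rather than the project's actual code: names such as `overlay`, `startDist`, and `scale` are illustrative, and details such as the zoom direction and overlay placement may differ in `AIVirtualZoomGesture.py`.

```python
import cv2
from cvzone.HandTrackingModule import HandDetector

cap = cv2.VideoCapture(0)
detector = HandDetector(detectionCon=0.8, maxHands=2)

overlay = cv2.imread("pikatchu.jpeg")   # image that gets zoomed
startDist = None                        # hand distance when the gesture begins
scale = 0
cx, cy = 320, 240                       # default anchor for the overlay

while True:
    success, img = cap.read()
    if not success:
        break
    hands, img = detector.findHands(img)

    if len(hands) == 2:
        # Gesture: index & middle fingers up on both hands
        if detector.fingersUp(hands[0]) == [0, 1, 1, 0, 0] and \
           detector.fingersUp(hands[1]) == [0, 1, 1, 0, 0]:
            # Distance between the two hand centers
            length, info, img = detector.findDistance(
                hands[0]["center"], hands[1]["center"], img)
            if startDist is None:
                startDist = length      # remember the distance at gesture start
            # Hands moving closer -> larger scale, matching the usage notes above
            scale = int((startDist - length) // 2)
            cx, cy = info[4:]           # midpoint between the two hand centers
    else:
        startDist = None
        scale = 0                       # releasing the hands resets the zoom

    # Resize the overlay by the current scale (kept even so it splits cleanly)
    h, w = overlay.shape[:2]
    newH = max(2, ((h + scale) // 2) * 2)
    newW = max(2, ((w + scale) // 2) * 2)
    resized = cv2.resize(overlay, (newW, newH))

    # Paste the resized overlay centred on the gesture midpoint; skip frames
    # where it would extend beyond the webcam image
    try:
        img[cy - newH // 2: cy + newH // 2, cx - newW // 2: cx + newW // 2] = resized
    except ValueError:
        pass

    cv2.imshow("AI Virtual Zoom", img)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
```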
| Issue | Solution |
|---|---|
| `ModuleNotFoundError: No module named 'cvzone'` | Run `pip install cvzone` |
| Webcam not detected | Check `cv2.VideoCapture(0)`; try index `1` if using an external webcam |
| Image not appearing | Ensure `pikatchu.jpeg` is in the correct directory |
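For the webcam issue, a small helper like the one below (purely illustrative; `open_first_camera` is not part of the project) can probe the first few device indices and use whichever opens:

```python
import cv2

def open_first_camera(max_index=3):
    """Return a VideoCapture for the first device index that opens."""
    for idx in range(max_index):
        cap = cv2.VideoCapture(idx)
        if cap.isOpened():
            print(f"Using camera index {idx}")
            return cap
        cap.release()
    raise RuntimeError("No webcam found")

cap = open_first_camera()
```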
- Add hand tracking smoothing for better accuracy.
- Implement single-hand zooming (e.g., thumb & index finger), as sketched below.
- Display zoom percentage for user feedback.
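A rough sketch of how the single-hand idea and the zoom-percentage readout might look, using the thumb tip (landmark 4) and index fingertip (landmark 8). The mapping from pinch length to a percentage is only a placeholder, not the project's behaviour:

```python
import cv2
from cvzone.HandTrackingModule import HandDetector

cap = cv2.VideoCapture(0)
detector = HandDetector(detectionCon=0.8, maxHands=1)

while True:
    success, img = cap.read()
    if not success:
        break
    hands, img = detector.findHands(img)
    if hands:
        lmList = hands[0]["lmList"]
        # Landmark 4 = thumb tip, landmark 8 = index fingertip
        length, info, img = detector.findDistance(lmList[4][0:2], lmList[8][0:2], img)
        zoom_percent = int(length)   # placeholder mapping; smooth and clamp in practice
        cv2.putText(img, f"Zoom: {zoom_percent}%", (20, 50),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("Single-hand zoom sketch", img)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
```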
This project is licensed under the MIT License.
🚀 Developed by Sayantika Laskar