# AI_USM_Keyboards

An advanced virtual keyboard system that uses AI for hand gesture detection and interaction, developed with Python, OpenCV, Mediapipe, and other supporting libraries. Created in collaboration with smbtk-ops.
AI USM Keyboards is an innovative solution that leverages artificial intelligence to recognize hand gestures and translate them into input for a virtual keyboard interface. By combining computer vision and machine learning techniques, the project offers a touchless typing experience that benefits a wide range of users. It is particularly useful for individuals with physical limitations, reducing dependency on traditional input devices, and has potential applications in medicine, education, assistive technology, and remote work, where it provides a hygienic alternative to shared input devices in public settings and healthcare facilities.

## Features
- Touchless Typing: Enhances accessibility for users with limited mobility and offers a more hygienic alternative to physical keyboards.
- Real-Time Hand Gesture Recognition: Utilizes computer vision models for fast, responsive input processing.
- Assistive Technology: Empowers individuals with disabilities by allowing interaction through natural hand movements.
- Healthcare Applications: Reduces the need for contact with input devices in sterile or sensitive medical environments.
- Flexible Integration: Can be extended or adapted for various use cases such as virtual reality, telemedicine, and public kiosks.
## Technologies Used

- Python: Core programming language for logic and development.
- OpenCV: Used for computer vision operations such as detecting and processing video input.
- Mediapipe: Powers the hand-tracking and gesture recognition functionality (see the sketch after this list).
- Pygame: Facilitates user interface development and audio interactions.
- Pynput: Sends system-level keyboard events for the keys mapped from gestures.
- Pillow: Manages image rendering for text and visual feedback.
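As a rough, non-authoritative sketch of how these libraries typically fit together (not the project's actual code), the snippet below uses OpenCV to read webcam frames and Mediapipe to detect and draw hand landmarks in real time:

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

# Capture frames from the default webcam and track a single hand.
cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Mediapipe expects RGB input; OpenCV delivers BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("Hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
cap.release()
cv2.destroyAllWindows()
```

A gesture-driven keyboard builds on output like this by interpreting the 21 landmark positions Mediapipe reports per hand, for example to detect a fingertip hovering over a virtual key.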
## Installation

To set up this project on your local machine, follow these steps:
1. Clone the Repository:

        git clone https://github.com/Lykman/AI_USM_Keyboards.git
        cd AI_USM_Keyboards

2. Create and Activate a Virtual Environment:

   On Linux/macOS:

        python -m venv venv
        source venv/bin/activate

   On Windows:

        python -m venv venv
        venv\Scripts\activate

3. Install Dependencies (an example requirements.txt is shown below):

        pip install -r requirements.txt
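The repository's actual requirements.txt is not reproduced here; based on the libraries listed above, it would typically contain entries along these lines (package names only, left unpinned for illustration):

```text
opencv-python
mediapipe
pygame
pynput
Pillow
```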
## Usage

To run the main script:

    python main.py
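main.py itself is not reproduced here. As a hedged illustration of the kind of logic such a script might implement, the sketch below maps a normalized index-fingertip position (Mediapipe hand landmark 8) onto a hypothetical key grid and sends the chosen character with Pynput; the layout, key sizes, and helper names are illustrative assumptions, not the project's actual implementation:

```python
from pynput.keyboard import Controller

# Hypothetical key layout and geometry; the real project's layout may differ.
KEY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEY_W, KEY_H = 0.1, 0.2  # key size in normalized (0..1) frame coordinates

keyboard = Controller()


def key_at(x_norm, y_norm):
    """Map a normalized fingertip position (as reported by Mediapipe) to a key."""
    row = int(y_norm // KEY_H)
    col = int(x_norm // KEY_W)
    if 0 <= row < len(KEY_ROWS) and 0 <= col < len(KEY_ROWS[row]):
        return KEY_ROWS[row][col]
    return None


def press(key):
    """Send the chosen character to the focused window as a real keystroke."""
    keyboard.press(key)
    keyboard.release(key)


# Example: index fingertip (landmark 8) reported near the top-left of the frame.
key = key_at(0.12, 0.05)
if key:
    press(key)  # types 'w'
```

In a live system, a keypress is usually gated by an explicit gesture such as a pinch or a short dwell time, so that a key does not fire on every frame in which the fingertip hovers over it.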