This is an accessibility project built by a team of five for people who want to learn sign language. It's implemented in Swift using several Apple frameworks, such as:
- CreateML
- CoreML
- Vision
- AVFoundation
- SwiftUI
We trained a machine learning model to recognise hand gestures and movements, so the app can check whether the user is correctly performing the sign they are trying to learn.
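
The typical pipeline for this kind of check is to extract hand landmarks with Vision and feed them to a Core ML classifier trained with Create ML. Below is a minimal sketch of that flow; the `SignClassifier` model class, its `poses` input, and the `label` output are hypothetical placeholders for the project's actual trained model.

```swift
import Vision
import CoreML

// Minimal sketch: detect a hand pose in a camera frame with Vision and
// classify it with a Create ML hand-pose model.
// `SignClassifier` is a placeholder name for the project's trained model.
func classifySign(in pixelBuffer: CVPixelBuffer) throws -> String? {
    // 1. Ask Vision for hand landmarks (21 joints per hand).
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .up,
                                        options: [:])
    try handler.perform([request])

    guard let observation = request.results?.first else { return nil }

    // 2. Convert the landmarks to the MLMultiArray layout that
    //    Create ML hand-pose classifiers expect.
    let keypoints = try observation.keypointsMultiArray()

    // 3. Run the Core ML model (class assumed to be generated from
    //    a SignClassifier.mlmodel file) and return the predicted sign.
    let model = try SignClassifier(configuration: MLModelConfiguration())
    let prediction = try model.prediction(poses: keypoints)
    return prediction.label
}
```

In a live session, the same frames would typically come from an AVFoundation capture session, with each `CVPixelBuffer` passed to this function and the predicted label compared against the sign the user is practising.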