# ThePupasRepo

This is an accessibility project, made by a team of five, for people who want to learn sign language. It's implemented in Swift using several Apple frameworks, such as:

- CreateML
- CoreML
- Vision
- AVFoundation
- SwiftUI

Here we trained a machine learning model to recognise hand gestures and movements, making it possible to check whether the user is performing the sign they are trying to learn.
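
In a pipeline like this, Vision's hand-pose detection typically yields per-joint points and confidences, which must be flattened into a fixed-order feature vector before being fed to the trained classifier. The sketch below illustrates that preprocessing step in plain Swift; the `HandJoint` type, joint names, and thresholds are illustrative assumptions, not the project's actual code.

```swift
import Foundation

// Illustrative stand-in for a detected hand joint (in Vision this would
// come from a VNHumanHandPoseObservation's recognized points).
struct HandJoint {
    let name: String
    let x: Double
    let y: Double
    let confidence: Double
}

// Flatten detected joints into a fixed-order feature vector for a gesture
// classifier. Joints that are missing or below the confidence threshold
// are zero-filled so the model always sees the same input shape.
func featureVector(from joints: [HandJoint],
                   order: [String],
                   minConfidence: Double = 0.3) -> [Double] {
    let byName = Dictionary(uniqueKeysWithValues: joints.map { ($0.name, $0) })
    var features: [Double] = []
    for name in order {
        if let joint = byName[name], joint.confidence >= minConfidence {
            features.append(joint.x)
            features.append(joint.y)
        } else {
            features.append(0)
            features.append(0)
        }
    }
    return features
}
```

For example, with `order = ["wrist", "thumbTip"]` and only a confident wrist detection, the function returns `[x, y, 0, 0]`, keeping the vector length stable for the model regardless of how many joints were actually seen.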