Have you ever wished you could understand sign language?
Real-time translation software that detects the user's hand and predicts which ASL letter they are holding up. It uses CNNs trained mostly on this dataset. Run `python main.py` for a demo.
Implemented:
- CNN on raw ASL data from an online source, using TensorFlow and PyTorch
- CNN on raw self-collected webcam data, using TensorFlow
- CNN on hand-landmark coordinates obtained by processing the raw online data with MediaPipe
- Model fine-tuning (the online-source model was fine-tuned on the self-collected webcam data)
- Computer vision with OpenCV and the MediaPipe hand-detection library
- Basic spell checking with the SpellChecker library
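The coordinate-based model above consumes MediaPipe hand landmarks rather than raw pixels. A minimal sketch of the feature-preparation step, assuming the common convention of making the 21 landmarks wrist-relative and scale-normalized before flattening (the exact preprocessing used here is an assumption):

```python
import numpy as np

def normalize_landmarks(landmarks):
    """Turn 21 MediaPipe (x, y, z) hand landmarks into a 63-value feature
    vector: translate so the wrist (landmark 0) is the origin, scale by the
    largest absolute coordinate, then flatten. (Sketch; the project's actual
    preprocessing may differ.)"""
    pts = np.asarray(landmarks, dtype=np.float32)  # shape (21, 3)
    pts = pts - pts[0]                             # wrist-relative coordinates
    scale = float(np.abs(pts).max()) or 1.0        # avoid division by zero
    return (pts / scale).ravel()                   # shape (63,)
```

In the live pipeline these landmarks would come from MediaPipe's `Hands.process()` on each webcam frame; normalizing them this way makes the features invariant to where the hand sits in the frame and how close it is to the camera.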
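A rough sketch of what the coordinate CNN and the fine-tuning step could look like in TensorFlow/Keras. The layer sizes, the 1-D convolutions over the 21 landmarks, and the freeze-all-but-the-head strategy are assumptions for illustration, not the project's actual architecture:

```python
import tensorflow as tf

def build_coordinate_cnn(num_classes=26):
    """A small 1-D CNN over 21 (x, y, z) hand landmarks (illustrative sizes)."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=(21, 3)),
        tf.keras.layers.Conv1D(32, 3, activation="relu"),
        tf.keras.layers.Conv1D(64, 3, activation="relu"),
        tf.keras.layers.GlobalMaxPooling1D(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

def fine_tune(model, learning_rate=1e-4, trainable_from=-2):
    """Freeze all but the last layers and recompile with a low learning rate,
    as one would before continuing training on self-collected data."""
    for layer in model.layers[:trainable_from]:
        layer.trainable = False
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Fine-tuning with a frozen backbone and a small learning rate is the standard way to adapt a model trained on a large online dataset to a much smaller self-collected webcam set without overfitting.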
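The spell-check stage can be sketched as follows: predicted letters accumulate into a buffer, and each completed word is corrected with the `spellchecker` package's `SpellChecker`. The word-boundary convention (a space in the letter stream) and the fallback when the package is absent are assumptions:

```python
# Fall back to no-op correction if the pyspellchecker package is unavailable.
try:
    from spellchecker import SpellChecker
    _spell = SpellChecker()

    def correct(word):
        # correction() can return None when no candidate is found
        return _spell.correction(word) or word
except ImportError:
    def correct(word):
        return word

def correct_sentence(letters):
    """Join a stream of predicted letters into words (space = word boundary)
    and spell-check each word. Sketch of the post-processing stage."""
    words = "".join(letters).split(" ")
    return " ".join(correct(w) for w in words)
```

Usage: `correct_sentence(list("hello wrold"))` would pass each word through the checker, letting the dictionary clean up occasional misclassified letters in the CNN's output.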
Demo video: