WWDC25 Swift Student Challenge - Deaf Children's Sign Language Education App
Signy Play is a project that helps deaf children who are encountering the alphabet for the first time learn American Sign Language (ASL) through games.

Period: 2025.01.31 - 2025.02.23
Development Environment: Xcode 16 App Playground
Development Equipment: iPad Pro (12.9-inch, 11-inch), iPad mini
SwiftUI - Build the app's user interface and game experiences.
CoreML - Train on a variety of hand images and export the result as a Core ML model for on-device classification.
Vision - Identify hands in camera frames and extract hand shape information.
AVFoundation - Access the device's camera and display a live preview.
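To show how AVFoundation can feed front-camera frames into the recognition pipeline, here is a minimal sketch. The class name and delegate structure are illustrative, not the app's actual code; the AVFoundation types and calls follow the framework's API.

```swift
import AVFoundation

// Hypothetical sketch: stream front-camera frames so each one can be
// handed to Vision for hand analysis.
final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() {
        session.sessionPreset = .high

        // Pick the front-facing wide-angle camera.
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Deliver raw frames to this object on a background queue.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each camera frame arrives here; pass its pixel buffer to Vision.
    }
}
```

A live preview can then be shown by attaching an `AVCaptureVideoPreviewLayer` initialized with the same session.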
Users can choose a quiz theme (alphabet or words) and select the number of questions.
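A quiz-setup screen like the one described could be sketched in SwiftUI as follows; the view, state names, and ranges are hypothetical, not taken from the project.

```swift
import SwiftUI

// Hypothetical sketch of the quiz setup screen:
// pick a theme and a question count before starting.
struct QuizSetupView: View {
    @State private var theme = "Alphabet"
    @State private var questionCount = 5

    var body: some View {
        Form {
            // Theme selection: alphabet or word cards.
            Picker("Theme", selection: $theme) {
                ForEach(["Alphabet", "Words"], id: \.self) { Text($0) }
            }
            // Number of questions for this round.
            Stepper("Questions: \(questionCount)", value: $questionCount, in: 3...10)
        }
    }
}
```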

Users practice sign language with the front-facing camera, which recognizes their hands and detects the hand shapes they make.
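Hand detection like this is typically done with Vision's hand-pose request. The sketch below shows one way a single camera frame could be analyzed; the function is illustrative, while the Vision types and calls follow the framework's API.

```swift
import Vision

// Hypothetical sketch: run a hand-pose request on one camera frame and
// collect the joint positions that describe the hand shape.
func detectHand(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1   // the app only needs one signing hand

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
    try? handler.perform([request])

    guard let observation = request.results?.first,
          let points = try? observation.recognizedPoints(.all) else { return }

    // Keep only confidently detected joints; their normalized locations
    // can be fed to a classifier to decide which letter is being signed.
    let confident = points.values.filter { $0.confidence > 0.3 }
    print("Detected \(confident.count) hand joints")
}
```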

I used data from Kaggle (https://www.kaggle.com/grassknoted/asl-alphabet) to train a convolutional neural network (CNN).
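One common route from a labeled image dataset like this to a Core ML model is Create ML's image classifier. The sketch below is an assumption about the pipeline, not necessarily the author's (the README mentions a CNN, and Create ML uses transfer learning internally); paths are illustrative.

```swift
import CreateML
import Foundation

// Hypothetical sketch (macOS/Create ML): train an image classifier on the
// Kaggle ASL alphabet dataset, whose images are organized in one folder
// per letter, then export the result as a Core ML model.
let dataURL = URL(fileURLWithPath: "asl_alphabet_train")
let trainingData = MLImageClassifier.DataSource.labeledDirectories(at: dataURL)

// Each subdirectory name (A, B, C, ...) becomes a class label.
let classifier = try MLImageClassifier(trainingData: trainingData)

// Write the trained model so it can be dropped into the Xcode project.
try classifier.write(to: URL(fileURLWithPath: "ASLClassifier.mlmodel"))
```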
The quiz question data was collected through Google searches and designed as cards in Figma.