Project Introduction
Project Name: AI Sign Language Game
  • Disciplines and OU in collaboration: ENG & HLS & STEM Centre

Key Features:
An interactive game for learning basic sign language, designed to enhance understanding between the hearing and hearing-impaired communities.

Project Description:

Jointly developed by students from the Health and Life Sciences and Engineering Disciplines, the project aims to create an interactive game that encourages more people to learn basic sign language, thereby enhancing mutual understanding between the hearing and hearing-impaired communities.

Sign language is the natural medium of communication for the deaf community. There are around 155,000 people with hearing impairment in Hong Kong (2.2% of the total population), yet according to The Hong Kong Council of Social Service, only 56 qualified interpreters appear on the "Hong Kong Sign Language Interpreter Names List". To raise social awareness of the importance of sign language, the students developed the AI Sign Language Game, facilitated by the VTC STEM Education Centre, in the hope that everyone can learn some basic sign language phrases for communicating with the hearing impaired.

In this interactive game, the interface first guides the player through a video demonstration, so that the player becomes familiar with specific words and phrases before starting. The Artificial Intelligence system uses real-time computer vision to extract human body, hand, and facial keypoints (130 keypoints in total) from single images; these are compared against pre-trained models for image classification and accuracy checking. The sign language recognition framework applies machine learning, using a Neural Network and a Support Vector Machine to classify the body, hand, and facial keypoints extracted from the training images.
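The recognition pipeline described above can be sketched as follows. This is a minimal, hedged illustration, not the project's actual code: it assumes keypoints have already been extracted by a pose estimator and flattened into (x, y) feature vectors, and it uses synthetic stand-in data with a hypothetical three-sign vocabulary so the example runs on its own. The Support Vector Machine stage is shown using scikit-learn.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

RNG = np.random.default_rng(0)
N_KEYPOINTS = 130             # body + hand + face keypoints, as described above
N_FEATURES = N_KEYPOINTS * 2  # an (x, y) coordinate per keypoint
SIGNS = ["hello", "thank_you", "goodbye"]  # hypothetical sign vocabulary

def make_samples(n_per_class):
    """Synthetic stand-in for extracted keypoints: each sign clusters
    around its own mean pose, mimicking frames from a pose estimator."""
    X, y = [], []
    for label in range(len(SIGNS)):
        centre = RNG.normal(size=N_FEATURES)
        X.append(centre + 0.1 * RNG.normal(size=(n_per_class, N_FEATURES)))
        y.extend([label] * n_per_class)
    return np.vstack(X), np.array(y)

X, y = make_samples(40)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# SVM classifier over the flattened keypoint vectors
clf = SVC(kernel="rbf")
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
predicted_sign = SIGNS[clf.predict(X_test[:1])[0]]
```

In the real game, each incoming camera frame would replace a row of the synthetic data, and the predicted label would be checked against the sign the player was asked to perform.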
