Indian_Sign_Language is a translation system designed to bridge communication gaps by converting Indian Sign Language (ISL) gestures into text and speech in real time. The project uses computer vision and machine learning models to recognize ISL gestures and translate them into natural-language text, making communication more accessible for the hearing- and speech-impaired community.
- Real-time Gesture Recognition: Uses computer vision to capture and interpret ISL gestures.
- Text and Speech Output: Translates recognized gestures into text and synthesizes them into speech.
- Multi-Language Support: Offers translation into multiple languages, enhancing accessibility.
- User-Friendly Interface: Designed to be intuitive and accessible.
- Clone the repository:
  git clone https://github.com/grimreapermanasvi/Indian_Sign_Language.git
  cd Indian_Sign_Language
- Install the required dependencies:
  pip install -r requirements.txt
- Ensure that your environment supports the libraries required for computer vision and NLP.
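One way to check the environment before launching is to probe for the expected modules. This is an illustrative sketch: the module names below (e.g. `cv2` for OpenCV) are assumptions, and the authoritative dependency list is `requirements.txt`.

```python
import importlib.util

def check_dependencies(modules):
    """Return a dict mapping each module name to whether it is importable."""
    return {name: importlib.util.find_spec(name) is not None for name in modules}

# Hypothetical dependency names; substitute the entries from requirements.txt.
status = check_dependencies(["cv2", "numpy"])
for name, found in status.items():
    print(f"{name}: {'found' if found else 'missing'}")
```

Running this before `app.py` surfaces missing packages with a clear message instead of a mid-startup ImportError.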
Run the application with:
python app.py
Place your hand in the frame, and the system will capture and interpret the ISL gestures, displaying the translated text and generating corresponding speech output.
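Per-frame predictions from a gesture model tend to flicker as the hand moves. A common stabilization step in pipelines like this one (shown here as a generic sketch, not necessarily what `app.py` implements) is a majority vote over a short sliding window of recent labels:

```python
from collections import deque

def smooth_predictions(labels, window=5):
    """Stabilize per-frame gesture labels with a sliding-window majority vote."""
    recent = deque(maxlen=window)  # holds the last `window` raw predictions
    smoothed = []
    for label in labels:
        recent.append(label)
        # Emit the most frequent label in the current window.
        smoothed.append(max(set(recent), key=list(recent).count))
    return smoothed

# A single spurious "B" frame is voted away by the surrounding "A" frames:
print(smooth_predictions(["A", "A", "B", "A", "A"]))  # → ['A', 'A', 'A', 'A', 'A']
```

The window size trades responsiveness for stability: larger windows suppress more noise but delay recognition of a genuinely new gesture.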
Contributions are welcome! Please fork the repository and submit a pull request with any enhancements or bug fixes.
This project is licensed under the MIT License. See the LICENSE file for details.
Special thanks to organizations and contributors promoting accessible communication for individuals with hearing and speech impairments.