The Live-Emotion-Color-Detector is an application that detects human emotions in real time from a webcam feed. Built on the Residual Masking Network (RMN) model trained on the FER-2013 dataset, it classifies facial expressions into distinct emotions and highlights each one with a unique color code. The application serves as a useful tool for augmenting interactive experiences and engaging users with intuitive emotion recognition.
Image: Face images courtesy of the AICE
- Real-time Emotion Detection: Quickly identifies and classifies emotions from facial expressions as they occur.
- Color-Coded Feedback: Assigns unique colors to different emotions for intuitive and immediate understanding.
- Flexible Application: Suited to diverse contexts, including interactive installations, educational tools, and customer service experiences.
The Live-Emotion-Color-Detector uses a distinct color code to visually represent each detected emotion in real time, allowing immediate and intuitive recognition of the emotional state on display. The color coding used by the application is as follows:
- Angry: Red, RGB(255, 0, 0)
- Disgust: Dark Green, RGB(0, 128, 0)
- Fear: Magenta, RGB(255, 0, 255)
- Happy: Pink, RGB(255, 20, 147)
- Sad: Blue, RGB(0, 0, 255)
- Surprise: Cyan, RGB(0, 255, 255)
- Neutral: White, RGB(255, 255, 255)
Each color has been carefully selected to represent the essence of its corresponding emotion, enhancing the user's ability to quickly understand and react to the detected emotional states.
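In code, OpenCV expects colors in BGR channel order, so the RGB values above appear with their first and last channels swapped (for example, red is written `(0, 0, 255)`). A minimal sketch of what such a mapping might look like follows; the dictionary name and label strings are illustrative, not necessarily those used in app.py:

```python
# Emotion-to-color map in OpenCV's BGR channel order.
# Names and keys are illustrative; app.py may organize this differently.
EMOTION_COLORS_BGR = {
    "angry":    (0, 0, 255),      # red:        RGB(255, 0, 0)
    "disgust":  (0, 128, 0),      # dark green: same in RGB and BGR
    "fear":     (255, 0, 255),    # magenta:    same in RGB and BGR
    "happy":    (147, 20, 255),   # pink:       RGB(255, 20, 147)
    "sad":      (255, 0, 0),      # blue:       RGB(0, 0, 255)
    "surprise": (255, 255, 0),    # cyan:       RGB(0, 255, 255)
    "neutral":  (255, 255, 255),  # white:      same in RGB and BGR
}
```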
Ensure you have the following installed:
- Python 3.8 or later
- OpenCV
- TensorFlow
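A quick sanity check like the one below can confirm the prerequisites are in place. This is a minimal sketch that assumes the usual `opencv-python` and `tensorflow` package names:

```python
# Verify the environment meets the prerequisites listed above.
import sys

assert sys.version_info >= (3, 8), "Python 3.8 or later is required"

import cv2              # provided by the opencv-python package
import tensorflow as tf

print("Python:", sys.version.split()[0])
print("OpenCV:", cv2.__version__)
print("TensorFlow:", tf.__version__)
```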
- Clone the repository to your local machine:
git clone https://github.com/Op27/Live-Emotion-Color-Detector.git
- Install the required dependencies:
pip install -r requirements.txt
- Run the application: navigate to the project directory and start it with:
python app.py
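At its core, an application like this runs a capture-detect-draw loop. The sketch below shows one way to wire it up, assuming the `rmn` Python package (from the RMN GitHub repository) supplies the pretrained model via its `RMN` class and `detect_emotion_for_single_frame` method; app.py may be organized differently:

```python
# A sketch of a live capture-detect-draw loop; app.py may differ in detail.
import cv2
from rmn import RMN

# Emotion-to-color map in BGR order (see the color coding section above).
EMOTION_COLORS_BGR = {
    "angry": (0, 0, 255), "disgust": (0, 128, 0), "fear": (255, 0, 255),
    "happy": (147, 20, 255), "sad": (255, 0, 0), "surprise": (255, 255, 0),
    "neutral": (255, 255, 255),
}

detector = RMN()               # downloads pretrained weights on first use
cap = cv2.VideoCapture(0)      # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Each detection carries a bounding box and a predicted emotion label.
    for face in detector.detect_emotion_for_single_frame(frame):
        label = face["emo_label"]
        color = EMOTION_COLORS_BGR.get(label, (255, 255, 255))
        cv2.rectangle(frame, (face["xmin"], face["ymin"]),
                      (face["xmax"], face["ymax"]), color, 2)
        cv2.putText(frame, label, (face["xmin"], face["ymin"] - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, color, 2)
    cv2.imshow("Live-Emotion-Color-Detector", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```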
The Live-Emotion-Color-Detector leverages the Residual Masking Network (RMN), a state-of-the-art facial expression recognition model. The RMN model, the centerpiece of this application, held the top reported accuracy on the FER-2013 benchmark at 76.82% as of 21 February 2024, making it a leading choice for emotion detection.
Facial expression recognition with RMN analyzes faces from a video feed in real time and classifies them into distinct emotions. The model stands out for how it handles the spatial hierarchies between facial parts to recognize emotions accurately. For more technical insights, visit the RMN GitHub page.
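The `rmn` package also exposes the model directly, which is handy for testing on a still image before running the live feed. A minimal example, where the input filename is hypothetical:

```python
# Classify emotions in a single image with the pretrained RMN model.
import cv2
from rmn import RMN

m = RMN()                                   # loads pretrained weights
image = cv2.imread("face.jpg")              # hypothetical test image
results = m.detect_emotion_for_single_frame(image)
for r in results:
    print(r["emo_label"], round(r["emo_proba"], 3))
```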
This project is licensed under the MIT License - see the LICENSE file for details.