[TCSS 2024] MAE pre-training models (ViT and ConvNeXt) using AffectNet images for static facial expression recognition (SFER).
Hands-on computer vision projects exploring face recognition, object tracking, digit classification, and gesture detection using Python and OpenCV.
This project is a real-time facial emotion recognition system built with OpenCV, Mediapipe, and DeepFace. It captures video from a webcam, detects facial landmarks, and analyzes emotions on the fly using deep learning models.
🌿 An interactive AI-powered plant that reacts to your facial emotions. Built with Transformers, PyTorch, and Streamlit: capture your mood, and the plant responds with joy, calmness, or sadness!
Automating E-Government services using AI: sentiment analysis, digit recognition & facial detection.