Facial Emotion Recognition using OpenCV and Deepface

This project implements real-time facial emotion detection using the deepface library and OpenCV. It captures video from the webcam, detects faces, and predicts the emotion associated with each face. The emotion labels are displayed on the frames in real time. This is probably the shortest code needed to implement real-time emotion monitoring.

  • Give this repository a ⭐ if you liked it, since it took me time to understand and implement this.
  • Made with ❤️ by Manish Tiwari

Dependencies

  • deepface: A deep learning facial analysis library that provides pre-trained models for facial emotion detection. It relies on TensorFlow for the underlying deep learning operations.
  • OpenCV: An open-source computer vision library used for image and video processing.
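
A quick way to confirm the environment before running anything is to import both libraries and print their installed versions. This is a minimal check, assuming the packages were installed with pip as described under Usage below; the importlib.metadata lookup is just one convenient way to read the deepface version.

```python
from importlib.metadata import version

import cv2

# Report the installed versions of the two main dependencies.
print("OpenCV version:", cv2.__version__)
print("deepface version:", version("deepface"))

# Importing DeepFace also pulls in the TensorFlow backend, so a clean import
# here means the deep learning stack is wired up correctly.
from deepface import DeepFace

print("DeepFace imported successfully")
```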

Usage

Initial steps:

  • Clone this repository. Run: git clone https://github.com/manish-9245/Facial-Emotion-Recognition-using-OpenCV-and-Deepface.git
  • Run: cd Facial-Emotion-Recognition-using-OpenCV-and-Deepface
  1. Install the required dependencies:

    • You can use pip install -r requirements.txt
    • Or you can install dependencies individually:
      • pip install deepface
      • pip install tf_keras
      • pip install opencv-python
  2. Download the Haar cascade XML file for face detection (a quick check that the file loads is sketched after this list).

  3. Run the code:

    • Execute the Python script.
    • The webcam will open, and real-time facial emotion detection will start.
    • Emotion labels will be displayed on the frames around detected faces.
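
Before running the main script, it can help to confirm that the Haar cascade actually loads, since an invalid path makes detectMultiScale() silently return no faces. This is a minimal sketch, assuming the standard haarcascade_frontalface_default.xml bundled with opencv-python under cv2.data.haarcascades; point cascade_path at your own copy if you downloaded the XML manually.

```python
import cv2

# Assumed path: the frontal-face cascade bundled with opencv-python.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_cascade = cv2.CascadeClassifier(cascade_path)

if face_cascade.empty():
    print("Failed to load Haar cascade from:", cascade_path)
else:
    print("Haar cascade loaded from:", cascade_path)
```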

Approach

  1. Import the necessary libraries: cv2 for video capture and image processing, and deepface for the emotion detection model.

  2. Load the Haar cascade classifier XML file for face detection using cv2.CascadeClassifier().

  3. Start capturing video from the default webcam using cv2.VideoCapture().

  4. Enter a continuous loop to process each frame of the captured video.

  5. Convert each frame to grayscale using cv2.cvtColor().

  6. Detect faces in the grayscale frame using face_cascade.detectMultiScale().

  7. For each detected face, extract the face ROI (Region of Interest).

  8. Preprocess the face image for emotion detection using the deepface library's built-in preprocessing function.

  9. Make predictions for the emotions using the pre-trained emotion detection model provided by the deepface library.

  10. Retrieve the index of the predicted emotion and map it to the corresponding emotion label.

  11. Draw a rectangle around the detected face and label it with the predicted emotion using cv2.rectangle() and cv2.putText().

  12. Display the resulting frame with the labeled emotion using cv2.imshow().

  13. If the 'q' key is pressed, exit the loop.

  14. Release the video capture and close all windows using cap.release() and cv2.destroyAllWindows().
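
Putting the steps above together, here is a minimal sketch of the pipeline. It is not necessarily identical to the script in this repository: it assumes the bundled haarcascade_frontalface_default.xml, passes enforce_detection=False to DeepFace.analyze() so the already-cropped face ROI does not raise an error if deepface's own detector finds nothing, and expects analyze() to return a list of result dictionaries with a dominant_emotion key, as recent deepface releases do.

```python
import cv2
from deepface import DeepFace

# Steps 1-3: load the face detector and open the default webcam.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
cap = cv2.VideoCapture(0)

# Step 4: process frames until 'q' is pressed.
while True:
    ret, frame = cap.read()
    if not ret:
        break

    # Steps 5-6: grayscale conversion and face detection.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        # Step 7: extract the face region of interest.
        face_roi = frame[y:y + h, x:x + w]

        # Steps 8-10: deepface preprocesses the ROI internally and predicts
        # the emotion; dominant_emotion is the top-scoring label.
        result = DeepFace.analyze(face_roi, actions=["emotion"], enforce_detection=False)
        emotion = result[0]["dominant_emotion"]

        # Step 11: draw the bounding box and the emotion label.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
        cv2.putText(frame, emotion, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 0, 255), 2)

    # Steps 12-13: show the frame and exit on 'q'.
    cv2.imshow("Real-time Emotion Detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

# Step 14: clean up.
cap.release()
cv2.destroyAllWindows()
```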
