
Facial Expression Recognizer using FER – Using Deep Neural Net

In our daily lives, we knowingly or unknowingly make different facial expressions. These movements convey our emotional state.

We can judge the mood and mental state of another person from their facial expression. In the 1970s, Ekman and Friesen defined `six` basic emotions.



These expressions do not change across cultures; they are universal. The six facial expressions are anger, disgust, fear, happiness, sadness, and surprise.

In this article, I’ll share how to build a Facial Expression Recognizer using the `FER` library in Python.



FER Library: 

The Facial Expression Recognition (FER) library is developed by Justin Shenk. The library requires OpenCV>=3.2 and TensorFlow>=1.7.0 to be installed on the system. Faces are detected using OpenCV’s Haar Cascade classifier. For more information and the source code of the FER library, you can visit FER’s GitHub page here.

 Setting up our code!

For this article, you can use an online code editor such as Repl.it, or your favorite local code editor. The FER library can be installed through pip:

`$ pip install fer`

1. Edit your new `main.py` file with the following code:




# OpenCV for reading and annotating images
import cv2

# FER class from the fer library
from fer import FER

# matplotlib for displaying the final result
import matplotlib.pyplot as plt
import matplotlib.image as mpimg

The code above imports OpenCV (`cv2`) for reading and annotating images, the `FER` class from the fer library, and matplotlib’s `pyplot` and `image` modules for displaying the result.

 2. Next, we’ll predict the emotions in a still image by giving the detector an input image:




# Input image
input_image = cv2.imread("smile.jpg")
emotion_detector = FER()
# Print the detected face box and emotion scores
print(emotion_detector.detect_emotions(input_image))


Input Image

Try to work out what the code above does before reading on. Here is the explanation: `cv2.imread()` loads the input image, `FER()` constructs the detector, and `detect_emotions()` returns the bounding box of each detected face along with a score for each emotion.

Below is a sample output:




[{'box': [277, 90, 48, 63], 'emotions': {
  'angry': 0.02, 'disgust': 0.0, 'fear': 0.05,
  'happy': 0.16, 'neutral': 0.09, 'sad': 0.27, 'surprise': 0.41}}]

What is going on under the hood?

The `detector` is the main part of our code. Let’s see what the detector does:

A detector is an object that holds the `FER()` model. It returns a list of dictionaries of bounding box notations, one per detected face: it observes where the face is situated and scores each of the seven emotion labels (the six basic emotions plus neutral) with a decimal value from `0` to `1`. Under the hood, FER runs a Keras model built as a Convolutional Neural Network, or CNN, whose weights are stored in an `HDF5` file. By default, FER detects faces using OpenCV’s Haar Cascade classifier. Alternatively, we can use the more advanced Multi-task Cascaded Convolutional Networks, in short `MTCNN`, which replaces the Haar Cascade for face detection while the emotion classifier stays the same. To enable it, just pass `mtcnn=True` in the FER parentheses, like `detector = FER(mtcnn=True)`. From here on we will use the MTCNN detector for more accurate results.
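
To make the switch concrete, here is a minimal sketch of both constructions, using the same sample image as the rest of the article:

import cv2
from fer import FER

image = cv2.imread("smile.jpg")

# Default: Haar Cascade face detection plus the bundled Keras CNN classifier
haar_detector = FER()

# MTCNN face detection; the emotion classifier stays the same
mtcnn_detector = FER(mtcnn=True)

print(mtcnn_detector.detect_emotions(image))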

Data:

Data is often called the new oil of the twenty-first century. There are different things we can do with the data generated above, which we will discuss in the ideas section at the end, so stay tuned. For now we will store the detected face and its emotion scores, then draw a yellow bounding box around the detected face with the scores below it. Let’s see how we can do this.




import cv2
from fer import FER
import matplotlib.pyplot as plt
import matplotlib.image as mpimg

input_image = cv2.imread("smile.jpg")
emotion_detector = FER(mtcnn=True)  # use the MTCNN face detector

This is what we have so far.

Next, we will declare a new variable `result` and store the detector’s output in it.




# Save output in result variable
result = emotion_detector.detect_emotions(input_image)
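
The structure of `result` mirrors the sample output shown earlier: a list with one dictionary per detected face. As an illustration (the variable names here are just examples), the dominant emotion of the first face can be pulled out with plain dictionary operations:

# One entry per detected face, each with a "box" ([x, y, w, h])
# and an "emotions" mapping of probability scores
first_face = result[0]
dominant_emotion = max(first_face["emotions"], key=first_face["emotions"].get)
print(dominant_emotion, first_face["emotions"][dominant_emotion])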

Now it is time to highlight the face with a yellow bounding box:




bounding_box = result[0]["box"]
emotions = result[0]["emotions"]

# Draw the box from its (x, y) corner using its width and height
cv2.rectangle(
    input_image,
    (bounding_box[0], bounding_box[1]),
    (bounding_box[0] + bounding_box[2], bounding_box[1] + bounding_box[3]),
    (0, 155, 255),
    2,
)

An explanation of the above code: `result[0]["box"]` holds the face rectangle as `[x, y, width, height]`, and `result[0]["emotions"]` holds the emotion scores. `cv2.rectangle()` then draws the box from its top-left corner `(x, y)` to its bottom-right corner `(x + width, y + height)`, in the color `(0, 155, 255)` (BGR) with a line thickness of 2.

Add Score to Bounding Box

Now we will add the scores below the bounding box using the following code:




# Top emotion for the detected face
emotion_name, score = emotion_detector.top_emotion(input_image)

# Write every emotion and its score under the bounding box
for index, (emotion_name, score) in enumerate(emotions.items()):
    # Gray out negligible scores, use blue (BGR) for the rest
    color = (211, 211, 211) if score < 0.01 else (255, 0, 0)
    emotion_score = "{}: {:.2f}".format(emotion_name, score)

    cv2.putText(
        input_image,
        emotion_score,
        (bounding_box[0], bounding_box[1] + bounding_box[3] + 30 + index * 15),
        cv2.FONT_HERSHEY_SIMPLEX,
        0.5,
        color,
        1,
        cv2.LINE_AA,
    )

# Save the result to a new image file
cv2.imwrite("emotion.jpg", input_image)

The above code works as follows: `top_emotion()` returns the single highest-scoring emotion for the image. The loop then walks over all the emotion scores and uses `cv2.putText()` to write each `emotion: score` pair just below the bounding box, stepping down 15 pixels per line. Finally, `cv2.imwrite()` saves the annotated image as `emotion.jpg`.

Display Output Image




# Read image file using matplotlib's image module
result_image = mpimg.imread('emotion.jpg')
imgplot = plt.imshow(result_image)
# Display Output Image
plt.show()

We are at the final step of our article, where we display the output image using matplotlib. `mpimg.imread('emotion.jpg')` reads the image file given in parentheses, `plt.imshow(result_image)` stores the plot in `imgplot`, and finally `plt.show()` displays our output image.

Output Image
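
As a side note, matplotlib can also show the annotated image directly, without the round trip through `emotion.jpg`. OpenCV keeps images in BGR channel order while matplotlib expects RGB, so a conversion is needed first; a minimal sketch:

import cv2
import matplotlib.pyplot as plt

# Convert from OpenCV's BGR channel order to matplotlib's RGB
rgb_image = cv2.cvtColor(input_image, cv2.COLOR_BGR2RGB)
plt.imshow(rgb_image)
plt.axis("off")
plt.show()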

Our Facial Expression Recognizer is just a starting point for new and innovative FER systems. Now you can build something new with what you have learned so far. Here are some ideas you can try:

  1. We have built an FER for still images. You can try FER for video analysis of the facial emotions present in a video.
  2. Creating a real-time Facial Expression Recognizer using a live camera would be a very exciting thing to try (see the sketch after this list).
  3. You can create recommendation systems based on human emotion by extracting and manipulating data from the `result` variable, such as recommending fun books, videos, or GIFs to people with sad emotions, or motivational books and videos to people in fear or anger.
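
For instance, idea 2 can be prototyped in a few lines by running the detector on every webcam frame. Below is a minimal sketch, assuming the default webcam sits at index 0 and that `top_emotion()` returns `(None, None)` when no face is found; expect a low frame rate, since every frame is fully analyzed:

import cv2
from fer import FER

detector = FER()  # pass mtcnn=True for the slower, more accurate detector

cap = cv2.VideoCapture(0)  # assumption: the default webcam is at index 0
while True:
    ret, frame = cap.read()
    if not ret:
        break

    # Highest-scoring emotion for the current frame
    emotion, score = detector.top_emotion(frame)
    if emotion is not None:
        cv2.putText(frame, "{}: {:.2f}".format(emotion, score), (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 155, 255), 2)

    cv2.imshow("FER live", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()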
