Facial expression detection


How does the SDK calculate the facial expression metrics used to estimate the likelihood of an emotion?
For example, how does setDetectSmile(bool activate) work when it is set to true?


Here is a general overview:

In a nutshell, the Affectiva SDK uses deep learning models to first locate the faces in an image; then, for each detected face, it analyzes the facial expressions present and derives emotion likelihoods from them.
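
To make the two-stage pipeline concrete, here is a minimal illustrative sketch. All class and method names below are hypothetical, not the real Affectiva API; the sketch only mirrors the flow described above (locate faces, then score expressions per face) and shows how a toggle like setDetectSmile(true) would enable one expression channel whose score feeds into an emotion likelihood.

```python
# Hypothetical sketch: NOT the real Affectiva SDK. It models the described
# pipeline: (1) locate faces, (2) run enabled expression classifiers on each
# face, (3) derive emotion likelihoods from the expression scores.

from dataclasses import dataclass, field


@dataclass
class Face:
    """A detected face region (stand-in for a real face-detector output)."""
    x: int
    y: int
    width: int
    height: int


@dataclass
class Detector:
    # Which expression channels are currently enabled (cf. setDetectSmile).
    enabled: set = field(default_factory=set)

    def set_detect_smile(self, activate: bool) -> None:
        # Hypothetical analogue of setDetectSmile(bool activate):
        # when true, the smile classifier runs on every detected face.
        if activate:
            self.enabled.add("smile")
        else:
            self.enabled.discard("smile")

    def locate_faces(self, frame) -> list:
        # Stand-in for the SDK's deep-learning face locator.
        return [Face(10, 10, 64, 64)]

    def score_expressions(self, face: Face) -> dict:
        # Stand-in for per-face expression classifiers; a real model would
        # produce a confidence score (0-100) per enabled expression.
        fake_model_output = {"smile": 87.5, "brow_raise": 12.0}
        return {k: v for k, v in fake_model_output.items() if k in self.enabled}

    def process(self, frame) -> list:
        results = []
        for face in self.locate_faces(frame):
            expressions = self.score_expressions(face)
            # Emotion likelihoods are derived from expression scores;
            # here joy is (simplistically) driven by the smile score.
            emotions = {"joy": expressions.get("smile", 0.0)}
            results.append({"expressions": expressions, "emotions": emotions})
        return results


detector = Detector()
detector.set_detect_smile(True)   # like setDetectSmile(true)
out = detector.process(frame=None)
print(out[0]["emotions"]["joy"])  # smile score drives joy: 87.5
```

With the smile channel disabled, the classifier simply never runs, so no smile score is produced and the joy likelihood falls back to zero; this is the effect of passing false to the toggle.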