Short explanations of terms found in the visage|SDK documentation and in face tracking in general.
Fundamental actions of individual muscles or groups of muscles, estimated and returned by the visage|SDK (e.g. lower lip drop, right outer brow raise). Often applied to a 3D object’s Morph targets.
Estimation of a person’s age in a frame or a continuous stream of frames.
Estimation of intensities of human emotions from a predefined set in a frame or a continuous stream of frames.
Localization of pupil positions in a continuous stream of frames.
Detection of face bounding boxes in a frame.
A mask-like augmented-reality effect that adds virtual objects to a person's face.
Process of determining a person's identity by matching their face descriptor against a database of labeled face descriptors.
An obstructed view of the face where only parts of the face are visible. Obstructions include hands, glasses, masks, beards, etc.
Extracting and matching face descriptors from a frame.
Unique identifier of the human face, usually represented as an array of values.
Tracking/localization of feature points in a continuous stream of frames.
Process of verifying whether two face descriptors match, i.e. belong to the same person.
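Face verification can be sketched as a similarity comparison between two descriptor arrays. The cosine-similarity measure and the threshold below are illustrative assumptions, not the visage|SDK API:

```python
import numpy as np

def match_descriptors(desc_a, desc_b, threshold=0.6):
    """Return True if two face descriptors likely belong to the same person.

    Compares descriptors with cosine similarity; the threshold value is an
    illustrative assumption and would be tuned per deployment in practice.
    """
    a = np.asarray(desc_a, dtype=float)
    b = np.asarray(desc_b, dtype=float)
    similarity = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return bool(similarity >= threshold)
```

In a real system the descriptors would come from the SDK's face-recognition module, and the threshold would trade off false accepts against false rejects.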
Salient points on the human face.
Binary estimation of a person’s gender in a frame or a continuous stream of frames.
Estimation of head pose in a continuous stream of frames.
A variation of the base mesh. Morph targets are typically used for facial animation. The base mesh defines the neutral expression and the morph targets define expressions such as "smile", "frown", "eyes closed".
Animating a 3D object using morph targets. When applied to a human face, for example, the head is first modeled with a neutral expression. A "target deformation" is then created for every other expression.
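The blending described above is a linear combination of per-vertex offsets from the neutral mesh. A minimal sketch, assuming meshes are arrays of vertex positions (the function and data layout are illustrative, not an SDK API):

```python
import numpy as np

def blend_shapes(base_mesh, morph_targets, weights):
    """Blend a neutral base mesh with weighted morph targets.

    base_mesh:     (N, 3) array of vertex positions (neutral expression)
    morph_targets: dict mapping target name -> (N, 3) vertex positions
    weights:       dict mapping target name -> blend weight in [0, 1]
    """
    base = np.asarray(base_mesh, dtype=float)
    result = base.copy()
    for name, target in morph_targets.items():
        w = weights.get(name, 0.0)
        # Add the weighted per-vertex offset of this target from the base.
        result += w * (np.asarray(target, dtype=float) - base)
    return result
```

Action unit intensities estimated by the tracker are often used directly as the blend weights, which is how tracked expressions drive a 3D character.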
Detection of multiple face bounding boxes in a single frame.
Tracking/localization of feature points of multiple faces in a continuous stream of frames.
A camera-based system pointed at the driver’s face which provides a real-time evaluation of the presence and the state of the driver.
Monitoring of all passengers in the vehicle to better understand their state and condition.
Monitoring the level of drowsiness in the driver (by tracking eye closure, blinking, yawning, etc.).
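One common drowsiness cue derived from eye closure is PERCLOS, the fraction of frames in which the eyes are mostly closed. A minimal sketch, assuming per-frame eye-closure estimates in [0, 1] (the threshold is an illustrative assumption, not an SDK parameter):

```python
def perclos(eye_closure_values, closed_threshold=0.8):
    """Fraction of frames in which the eyes are mostly closed (PERCLOS).

    eye_closure_values: per-frame eye-closure estimates in [0, 1],
    where 1.0 means fully closed. The 0.8 threshold is illustrative.
    """
    if not eye_closure_values:
        return 0.0
    closed = sum(1 for v in eye_closure_values if v >= closed_threshold)
    return closed / len(eye_closure_values)
```

A rising PERCLOS over a sliding window of recent frames would typically trigger an escalating driver alert.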
Verifying the driver's identity, e.g. by using face recognition, usually in order to provide access to specific car functions.
The function of the car that allows it to take control of at least one significant car function from the driver when necessary.