
FaceReader Software

FaceReader is the premier professional software for automatic analysis of facial expressions

Real-time expression data synchronized with physiology data and/or eye tracking

More reasons to smile!

Part #: FR-PROJECT, FR-ACTIONUNIT, FR-SOFTWARE

Easily gather emotion data with other biometrics for a complete subject profile

Emotion data provides crucial insights that allow researchers to explain complex behaviors in greater depth. FaceReader is ideal for collecting this data and helps you better understand human-human, human-machine, and human-product interactions. FaceReader emotion reading software locks onto a subject’s face and analyzes facial movements to classify the subject’s response. The software was “trained” with more than 10,000 manually annotated images, and accurate modeling of the face is achieved by describing 500 key points. FaceReader provides validated objectivity in observations. Classifications include happy, sad, scared, disgusted, surprised, angry, contempt, and neutral. Add Action Units to measure three common affective attitudes: boredom, interest, and confusion. FaceReader also provides gaze direction, head orientation, and person characteristics, such as gender and age.

Easily integrate with physiological parameters

For more comprehensive analysis, add licensing to integrate biometrics from an MP160 Research System or eye tracking data. When you start AcqKnowledge to begin recording, FaceReader data will be automatically synchronized and recorded in the same graph file. Record synchronized ECG, fEMG, EDA, etc. Monitor data in the FaceReader display for real-time feedback.
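As an illustration only (not part of the FaceReader or AcqKnowledge API), the sketch below shows one way to line up a FaceReader expression log with a physiology export by timestamp after recording; the column names and values are hypothetical.

```python
# Illustration only (assumed column names, not a BIOPAC/Noldus API):
# align FaceReader expression frames with physiology samples by timestamp.
import pandas as pd

expressions = pd.DataFrame({               # e.g. a FaceReader detailed log
    "Time": [0.0, 0.2, 0.4, 0.6],
    "Happy": [0.10, 0.35, 0.62, 0.58],
})
physiology = pd.DataFrame({                # e.g. an AcqKnowledge text export
    "Time": [0.00, 0.05, 0.10, 0.15, 0.20, 0.25],
    "EDA": [4.1, 4.2, 4.2, 4.3, 4.5, 4.6],
})

# Match each physiology sample to the nearest preceding expression frame
merged = pd.merge_asof(physiology.sort_values("Time"),
                       expressions.sort_values("Time"),
                       on="Time", direction="backward")
print(merged)
```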

Add Capabilities to Expand Your Research

Module add-ons unlock research potential.

  • Action Unit Module adds automatic analysis of 20 Action Units to measure affective attitudes
  • Project Analysis Module adds streamlined analysis & reporting tools

Seamless integration

Easily integrate and synchronize physiological data or eye tracking through AcqKnowledge

Complete facial expression analysis

Delivers objectivity in observations

Accurate modeling of the face, described by 500 key points. Action Units available.

Useful for a variety of applications

Psychology, educational research, consumer behavior research, usability studies, neuromarketing, etc.

FaceReader software license
View Spec PDF


Details

FaceReader Software

Emotions affect everyone in daily life, and play a key role in non-verbal communication. They are also essential to understanding human behavior. FaceReader easily and reliably delivers participant emotion analysis. Facial expressions can be visualized as bar graphs, in a pie chart, and as a continuous signal. A gauge display summarizes the negativity or positivity of the emotion (valence). The timeline provides a detailed visual representation of the data. A separate reporting window displays a pie chart with percentages, a smiley, and a traffic light, indicating whether a person’s mood is positive, neutral, or negative. All visualizations are available in real-time and may be viewed afterward.
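As a rough illustration of the valence idea (an assumption for clarity, not FaceReader’s documented formula), the sketch below scores valence as the positive expression intensity minus the strongest negative intensity; the expression names and values are made up.

```python
# Illustrative valence summary (an assumption, not FaceReader's documented
# formula): positive intensity minus the strongest negative intensity.
NEGATIVE = ("sad", "angry", "scared", "disgusted")

def valence(intensities: dict) -> float:
    """Return a score in [-1, 1] from per-expression intensities in [0, 1]."""
    return intensities.get("happy", 0.0) - max(intensities.get(e, 0.0) for e in NEGATIVE)

print(valence({"happy": 0.62, "sad": 0.10, "angry": 0.05}))  # 0.52
```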

FaceReader runs on Windows only; the current release, 7.1, is supported on Windows 7 (64-bit, SP1) or Windows 10 (64-bit).

Integrating with BIOPAC

Choose a license add-on for AcqKnowledge 5 to integrate FaceReader 7.0 (FR-INTERFACE) or the complete MP160 System with FaceReader Integration License (MP160WSW-FR).

Action Unit Module

AU 9. Nose Wrinkler – AU 6. Cheek Raiser – AU 14. Dimpler

Action Units are the actions of individual muscles or groups of muscles. This add-on module allows for automatic analysis of a selection of 20 commonly used Action Units (such as raising of the cheeks, wrinkling of the nose, dimpling, and lip tightening) to measure affective attitudes (such as interest, boredom, and confusion) and to monitor the combined activation or non-activation of specific Action Units. For instance, the emotional state of confusion is correlated with either Action Unit 4 (Brow Lowerer) or Action Unit 7 (Eyelid Tightener).

When an Action Unit is active, its intensity is displayed in five categories: Trace (A), Slight (B), Pronounced (C), Severe (D), or Max (E). Output is presented on this scale in different colors and can be exported for further analysis in Excel, The Observer XT, or another program of your choice.
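For post-processing an exported Action Unit log, a minimal sketch such as the one below can map the A–E intensity labels onto an ordinal 1–5 scale; the column names and layout are hypothetical, not Noldus’s actual export format.

```python
# Illustration only (assumed column names, not Noldus's export format):
# map A-E Action Unit intensity labels onto an ordinal 1-5 scale.
import pandas as pd

AU_SCALE = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}   # Trace ... Max

log = pd.DataFrame({
    "Time": [0.0, 0.2, 0.4],
    "AU04": ["A", "C", "E"],   # Brow Lowerer (hypothetical column)
    "AU07": ["B", "B", "D"],   # Eyelid Tightener (hypothetical column)
})

numeric = log.set_index("Time").apply(lambda col: col.map(AU_SCALE))
print(numeric)
```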

Action Units are responsible for facial expressions.

Detecting boredom, confusion, and interest

FaceReader 7.1 introduced the analysis of three commonly occurring affective attitudes: interest, boredom, and confusion. Unlike regular facial expressions, these affective attitudes are computed over a window of time (typically 2-5 seconds) rather than per frame. Some of these affective attitudes also take into account additional facial cues, such as nodding or head shaking, which are likewise computed internally over the analysis history.
These affective analyses are available on an experimental basis.
Confusion is one of the affective attitudes that is now available on an experimental basis.
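The windowed computation can be pictured with a simple rolling average, as in the sketch below; this is an assumption for illustration, not Noldus’s actual algorithm, and the per-frame cue values are made up.

```python
# Illustration only (made-up cue values, not Noldus's algorithm): aggregate a
# per-frame score over a sliding window instead of using single frames.
import pandas as pd

frames = pd.DataFrame({
    "Time": [i * 0.1 for i in range(50)],        # 10 fps, 5 s of video
    "cue": [0.1] * 20 + [0.7] * 30,              # hypothetical per-frame cue
})

# 3-second window = 30 frames at 10 fps
windowed = frames.set_index("Time")["cue"].rolling(window=30, min_periods=1).mean()
print(windowed.tail())
```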

Project Analysis Module

The Project Analysis Module can be used for advanced analysis and reporting. With this module, you quickly gain insight into the effects of different stimuli. With the new version of FaceReader, you can now also use images as stimuli.

Version 7 allows you to compare responses to different video stimuli in one view, offering faster insights into the effects of stimuli.

Independent variables

Selections of participants can easily be made automatically or manually, for example by selecting all female test participants and comparing their responses to different commercials. You can also add independent variables such as educational level or previous knowledge level, which allows you to make groups (e.g., female participants with a high education level) and compare results between the groups you have created.
Numerical analysis in FaceReader – compare groups.

Analysis

The Project Analysis Module offers a number of different analyses and reports:

  • Temporal Analysis – this type of analysis is based on a group response to a single stimulus. The average response of the group can be viewed side by side with the stimulus video, the participant videos, and line charts of the absolute and relative average expression, arousal, and valence. The results can also be presented as a pie chart. When the stimulus includes audio, the audio is also audible when reviewing your results.
  • Numerical Analysis – averages and standard deviations of the responses to stimuli and event markers can be viewed in tables, bar charts, and box plots (see the sketch below). All graphs can be saved as a picture for presentation purposes.
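As a minimal sketch of the kind of summary the Numerical Analysis view reports, the snippet below computes per-stimulus means and standard deviations with pandas; the data and stimulus names are made up.

```python
# Illustration only (made-up data): per-stimulus means and standard deviations,
# the kind of summary the Numerical Analysis view reports.
import pandas as pd

responses = pd.DataFrame({
    "participant": ["p1", "p1", "p2", "p2", "p3", "p3"],
    "stimulus":    ["ad_A", "ad_B"] * 3,
    "happy":       [0.41, 0.12, 0.55, 0.20, 0.38, 0.09],
})

summary = responses.groupby("stimulus")["happy"].agg(["mean", "std"])
print(summary)
```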

Use a t-test to analyze which facial expression intensities differ significantly between two participant groups.
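A comparable comparison can be sketched with SciPy’s independent-samples t-test; the per-participant intensities and group labels below are hypothetical.

```python
# Illustration only (hypothetical data): independent-samples t-test comparing
# "happy" intensity between two participant groups.
from scipy import stats

group_a = [0.42, 0.55, 0.38, 0.61, 0.47]   # mean happy intensity per participant
group_b = [0.31, 0.29, 0.44, 0.35, 0.40]

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```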

Visual presentation of your data

The Project Analysis Module can create multiple visual presentations of your data, including:

  • An individual line graph shows the intensity of the recorded emotions and can be displayed in sync with the video of the test participant and the stimulus, giving you a complete overview.
  • When working with multiple test participants, a summary line graph can be displayed and synchronized with the stimulus, providing a solid impression of the overall responses.
  • A pie chart shows a summary of the results of multiple test participants based on your own (marker) selection. Choose a pie chart or combined line graphs, whichever provides you with the best overview.
  • Box plots per emotion of the results of all test participants per stimulus are a real visual aid in determining which emotion prevailed for which stimulus (see the sketch after this list).
  • All graphs can be saved as a picture for presentation purposes. Output (log files) can be exported to a program of your choice, such as Excel or The Observer XT.
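For a rough idea of the per-emotion box plots described above, the sketch below draws one expression per stimulus with matplotlib; the values and stimulus names are made up.

```python
# Illustration only (made-up data): box plots of one expression per stimulus.
import matplotlib.pyplot as plt

happy_by_stimulus = {
    "ad_A": [0.41, 0.55, 0.38, 0.61, 0.47],
    "ad_B": [0.12, 0.20, 0.09, 0.25, 0.18],
}

data = list(happy_by_stimulus.values())
plt.boxplot(data)
plt.xticks(range(1, len(data) + 1), list(happy_by_stimulus.keys()))
plt.ylabel("Mean 'happy' intensity per participant")
plt.savefig("happy_boxplot.png")   # graphs can be saved as pictures
```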

Videos

FaceReader software modules do not include the FaceReader Integration License—add the FR-INTERFACE license for AcqKnowledge 5 to integrate FaceReader.

FaceReader Classifications Demo

FaceReader - Contempt classifier

FaceReader Affective Attitudes

Noldus The Observer XT Import/Export in AcqKnowledge

Integrating BIOPAC Research Systems | Data Acquisition & Analysis

