FaceReader Desktop

Leading the field of Affective Computing

The world’s first tool capable of automatically analyzing facial expressions

Our history. Your results.

FaceReader is the world’s first tool capable of automatically analyzing facial expressions, providing users with an objective assessment of a person’s emotion.

FaceReader 1.0 was released in 2007. Since then, a new FaceReader release has been brought to market on an annual basis, with FaceReader 9.1, released in 2023, being the current version. With every purchase of FaceReader you receive a complete software package with full customer support offered by VicarVision’s partner, Noldus.

Trusted by 1000+ clients

FaceReader is used worldwide at more than 1000 universities, research institutes, and companies across various verticals, including consumer behavior research, usability studies, psychology, educational research and market research. Are you next?
Classifications

FaceReader unlocks a whole new world of facial classifications

Facial expressions

Happy, Sad, Angry, Surprised, Scared, Disgusted, Contempt and Neutral.

Facial states

The state of key parts of the participant’s face, such as the eyes and mouth (for example, open or closed).

Characteristics

Gender, Age and the presence of Glasses, a Beard and a Moustache.

Valence

A measure of the attitude of the participant (positive vs negative).

Arousal

A measure of the activity of the participant (active vs inactive).

Global gaze

A global gaze direction (left, forward or right) helps to determine attention.

Head pose

Accurate head pose can be determined from the 3D face model.

Head positioning

Using the 3D head model, we can report the exact positioning of the head with respect to the camera.

Custom expressions

You can create your own custom expressions using the built-in designer.
Modules

Add state of the art techniques to reveal next-level insights

Get more out of FaceReader. Our add-on modules deepen the insights FaceReader provides to meet (and exceed) all of your research needs, all in one place.
Action Unit classification adds valuable information to facial expressions recorded by FaceReader. This add-on module allows for the automatic analysis of a selection of 20 common Action Units, such as raising of cheeks, wrinkling of nose, dimpling, and lip tightening.

Design your own algorithms for analysis of workload, fake versus genuine smiles, pain, embarrassment, and more. You can do this by combining variables such as basic facial expressions, Action Units, and heart rate or heart rate variability. This way, you can create your own custom expressions.

Patented technology

With this module, based on patented technology, you can analyze heart rate and heart rate variability of the test participant without additional hardware, using the FaceReader camera.

The photoplethysmography (PPG) technique measures the small changes in color caused by changes in blood volume under the skin’s surface. Determining the subject’s heart rate can be particularly useful as an additional indicator of arousal for subjects or situations where there is little variation in facial expressions.
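As a rough sketch of the general remote-PPG idea (not VicarVision’s patented implementation), the Python snippet below estimates a pulse rate from the average green-channel value of a skin region over time; the `green_means` input and `fps` frame rate are hypothetical and not part of FaceReader’s API.

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate heart rate (BPM) from a mean green-channel signal.

    green_means: 1D array of the average green value of a skin region,
    one sample per video frame (hypothetical input, not FaceReader's API).
    """
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()                 # remove DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to a plausible heart-rate band: 0.7-4 Hz (42-240 BPM)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                         # Hz -> beats per minute
```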
For the analysis of behavior that is related to eating and drinking, the consumption behavior module is available on an experimental basis in FaceReader 9 and 9.1. This add-on module allows you to analyze the following behaviors:
  • Chewing and intake behavior (biting or sipping)
  • Number of chews and intake events
  • Interaction with food
  • Emotional response to specific food products
Dashboard output

Transform data into stunning visuals

Rather than presenting raw data alone, FaceReader offers a wide range of visualization options that make the data easily accessible (and appealing) for researchers.

Continuous expression intensities

FaceReader outputs the six basic expressions (Happy, Sad, Angry, Surprised, Scared, and Disgusted) plus an extra Neutral state as continuous intensity values between zero and one.

New in FaceReader is the addition of Contempt as the 7th expression.
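As a minimal illustration of what such per-frame output can look like (the dictionary format below is hypothetical, not FaceReader’s actual export format):

```python
EXPRESSIONS = ["Happy", "Sad", "Angry", "Surprised",
               "Scared", "Disgusted", "Contempt", "Neutral"]

# Hypothetical single-frame output: one intensity in [0, 1] per expression.
frame = {"Happy": 0.82, "Sad": 0.03, "Angry": 0.01, "Surprised": 0.10,
         "Scared": 0.02, "Disgusted": 0.01, "Contempt": 0.04, "Neutral": 0.12}

# The dominant expression is simply the one with the highest intensity.
dominant = max(frame, key=frame.get)
print(dominant)  # -> "Happy"
```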


Action unit detection

The six basic emotions are only a fraction of the possible facial expressions. A widely used method for describing the activation of the individual facial muscles is the Facial Action Coding System (Ekman 2002).

FaceReader can detect the 20 most common AUs.
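For reference, here is a small Python mapping of the Action Units named in the module description to their standard FACS labels; the exact set of 20 AUs FaceReader detects is defined by the software itself.

```python
# A few FACS Action Units mentioned above, keyed by their standard AU
# numbers (per Ekman's FACS; the full set of 20 is FaceReader-specific).
ACTION_UNITS = {
    6:  "Cheek Raiser",        # raising of cheeks
    9:  "Nose Wrinkler",       # wrinkling of nose
    14: "Dimpler",             # dimpling
    23: "Lip Tightener",       # lip tightening
}

def describe(active_aus):
    """Turn a list of active AU numbers into readable labels."""
    return [f"AU{n} {ACTION_UNITS.get(n, 'unknown')}" for n in active_aus]

print(describe([6, 9]))  # ['AU6 Cheek Raiser', 'AU9 Nose Wrinkler']
```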


Circumplex model of affect

The circumplex model of affect describes the distribution of emotions in a 2D circular space, with arousal and valence dimensions.


Circumplex models (Russell 1980) are commonly used to assess liking in marketing, consumer science, and psychology.
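As a simple illustration, the sketch below places a (valence, arousal) pair into one of the four quadrants of the circumplex; the assumption that both values are scaled to [-1, 1] and the quadrant labels themselves are illustrative, not FaceReader conventions.

```python
def circumplex_quadrant(valence, arousal):
    """Place a (valence, arousal) pair, each assumed to lie in [-1, 1],
    into one of the circumplex model's four quadrants."""
    if valence >= 0:
        return "pleasant-active" if arousal >= 0 else "pleasant-calm"
    return "unpleasant-active" if arousal >= 0 else "unpleasant-calm"

print(circumplex_quadrant(0.6, -0.3))  # -> "pleasant-calm"
```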


Expression summary

A summary of the expressions during a single analysis can be viewed in an easily understandable pie chart, showing overall responses.

Different subparts of the analysis can be selected to view the summary of the expressions.
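One way to compute such a summary, sketched in Python under the assumption that each frame is a dictionary of intensities as shown earlier:

```python
from collections import Counter

def expression_summary(frames):
    """Fraction of frames in which each expression was dominant.

    frames: list of {expression: intensity} dictionaries (hypothetical
    format, as in the per-frame example above)."""
    counts = Counter(max(f, key=f.get) for f in frames)
    total = sum(counts.values())
    return {expr: n / total for expr, n in counts.items()}

# e.g. expression_summary(frames) -> {"Happy": 0.6, "Neutral": 0.4}
```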

Heart rate

The current heart rate, along with heart rate variability (both RMSSD and SDNN).
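RMSSD and SDNN are standard heart rate variability statistics computed from the intervals between successive beats. A minimal Python sketch, assuming RR intervals in milliseconds as input:

```python
import numpy as np

def sdnn(rr_ms):
    """SDNN: standard deviation of the RR intervals (ms)."""
    return float(np.std(np.asarray(rr_ms, dtype=float), ddof=1))

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR-interval differences (ms)."""
    diffs = np.diff(np.asarray(rr_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

rr = [812, 797, 805, 830, 818]   # example RR intervals in milliseconds
print(round(sdnn(rr), 1), round(rmssd(rr), 1))
```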

Facial states

FaceReader can automatically classify the state of some key parts of the participant’s face.
Custom expressions

Advanced modeling of your own expression definitions

Make your own

You can design your own algorithms for analysis of workload, pain, embarrassment, and much more by combining variables such as facial expressions, Action Units, and heart rate.

Input Blocks

All the classifications are available as input blocks: Facial Expressions, Valence & Arousal, Action Units, Head Orientation, Head Position, Gaze Angles, and Heart Rate.

Processor Blocks

Processor blocks allow you to process the input blocks. They provide mathematical operations such as Maximum, Minimum, Sum, and Average, as well as logical operations like AND, OR, and If...Else statements.

You can do temporal averaging and other operations too.
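As an illustration of how input and processor blocks can combine, the Python sketch below models a “genuine” (Duchenne-style) smile as a logical AND of a Happy input and an AU6 Cheek Raiser input, plus a temporal-averaging helper; the threshold values and names are illustrative, not FaceReader defaults.

```python
def genuine_smile(happy, au6_cheek_raiser, threshold=0.5):
    """Illustrative custom expression: combine the Happy input and the
    AU6 Cheek Raiser input with a logical AND, mirroring the input and
    processor block structure described above. Thresholds are
    illustrative, not FaceReader defaults."""
    return (happy > threshold) and (au6_cheek_raiser > threshold)

def moving_average(values, window=5):
    """Temporal averaging, one of the processor operations mentioned."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```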
Project analysis

Get results of your participants combined

All your participants in one project

In FaceReader you can (re)create your complete experiment, adding all your participants to one single project. The Project Analysis Module allows for analysis of the responses of groups of participants to your stimuli.

Participants can be grouped based on independent variables, like age, gender or any manually entered variable.
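Conceptually, grouping amounts to bucketing per-participant results by an independent variable and summarizing each bucket. A minimal Python sketch with hypothetical data (FaceReader’s actual export format will differ):

```python
from collections import defaultdict

# Hypothetical per-participant results, not FaceReader's export format.
participants = [
    {"id": 1, "gender": "female", "age": 31, "mean_valence": 0.42},
    {"id": 2, "gender": "male",   "age": 45, "mean_valence": -0.10},
    {"id": 3, "gender": "female", "age": 27, "mean_valence": 0.25},
]

groups = defaultdict(list)
for p in participants:
    groups[p["gender"]].append(p["mean_valence"])   # group by one variable

for gender, values in groups.items():
    print(gender, sum(values) / len(values))        # mean valence per group
```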

Select & Filter

Fine-grained data selection and participant filtering allow you to get insights for subgroups within your data.

Compare

Quickly see the difference between subgroups or between stimuli.
Take a look

Ready to experience the FaceReader difference?

You are invited behind the scenes of FaceReader’s desktop solution. Take a look at our screenshots.

Request a free trial

Get your free example report

Get your free whitepaper