iMotions empowers researchers to explore, collect, and analyze human behavior through modules divided into four central behavioral categories: Arousal, Emotion, Attention, and Cognitive. Discover how ...
iMotions supports multimodal clinical research, helping researchers leverage physiology to evaluate therapeutic interventions, investigate nonconscious behaviors during simulations, expand assistive ...
The 2024 Oscars brought emotional contrasts, with blockbusters evoking joy and Oscar nominees leaning toward intense narratives. Using Affectiva Media Analytics and iMotions’ respiration insights, this ...
Communication between individuals involves multiple factors: attending to the other person’s thoughts, interpreting their non-verbal behavior, and (un)intentionally mirroring their expression- and ...
Affectiva’s patent portfolio reflects our innovation mindset. Our patents are diverse and span different types of technology. They cover AI, machine learning, deep learning, computer vision, speech ...
In this paper we introduce AFFDEX 2.0 – a toolkit for analyzing facial expressions in the wild; that is, it is intended for users aiming to: a) estimate the 3D head pose, b) detect facial Action Units ...
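To make the toolkit's role concrete, here is a minimal usage sketch of how such a per-frame facial analysis pipeline might be driven from Python. The `affdex2` module, the `Detector` class, and its `process_frame` method are hypothetical names used for illustration only; they are not the published AFFDEX 2.0 API.

```python
# Hypothetical sketch: `affdex2`, `Detector`, and `process_frame` are
# illustrative assumptions, not the actual AFFDEX 2.0 interface.
import cv2                      # OpenCV, used here only to decode the video
from affdex2 import Detector    # hypothetical Python binding

detector = Detector()           # assumed to load face, head-pose, and AU models

cap = cv2.VideoCapture("participant.mp4")
while True:
    ok, frame = cap.read()
    if not ok:
        break                   # end of video
    result = detector.process_frame(frame)
    for face in result.faces:
        print(face.head_pose)     # e.g. (pitch, yaw, roll) estimates
        print(face.action_units)  # per-AU activation scores
        print(face.emotions)      # expression/emotion probabilities
cap.release()
```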
For decades, Audience Measurement has been the cornerstone of media planning and advertising effectiveness around the globe. Traditionally, agencies and networks relied on ratings and demographic ...
In collaboration with the Center for Marketing and Sales Innovation at USF, Affective(ly) Research is an annual in-person event where researchers working with biometrics and emotional insights gather ...
Affectiva’s emotion database has now grown to nearly 6 million faces analyzed in 90 countries. To be precise, we have now gathered more than 13,000,000 face videos, for a total of 38,944 hours of data ...