The one-paragraph definition

Emotion AI dating uses real-time facial Action Unit analysis — the same technology originally developed for clinical psychology, defence, and financial services — to read a user's involuntary emotional responses during a short video session. These responses are mapped to a Valence-Arousal-Dominance (VAD) emotional profile. Users are then matched with people whose emotional profiles are genuinely compatible with theirs, rather than with people whose photographs they found attractive.

The core insight: involuntary emotional response is a better predictor of interpersonal chemistry than physical appearance or stated preferences. Emotion AI can measure it. Photographs cannot.

How it differs from photo-based matching

Every mainstream dating app (Tinder, Hinge, Bumble, Match.com, eharmony) uses photographs as the primary matching signal. The user assesses photos, expresses interest via a swipe or like, and hopes that chemistry emerges in person. Research consistently shows that photo-based match quality is a weak predictor of in-person chemistry.

Emotion AI dating replaces the photograph with a different input: how a person genuinely responds to stimuli in real time. Because the responses are involuntary — they occur within 200-400 milliseconds, faster than conscious control — they cannot be curated or gamed. The system reads who you actually are rather than who you have chosen to present.

The result is matches based on genuine emotional resonance rather than photo assessment. In Attune's closed beta, 94% of matched users rated their first conversation as genuinely interesting or better — a substantially higher rate than typical first-date satisfaction on mainstream apps.

The underlying science: FACS and the VAD model

Emotion AI dating is grounded in two established scientific frameworks. The Facial Action Coding System (FACS), developed by psychologists Paul Ekman and Wallace V. Friesen in 1978, provides a comprehensive anatomical taxonomy of human facial expression. It decomposes expressions into 44 discrete facial muscle movements, called Action Units, whose combinations are associated with emotional states. FACS is the established standard for facial expression research across psychology, neuroscience, and human-computer interaction.
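
For readers who want a concrete feel for the taxonomy, the short Python sketch below encodes one classic FACS finding: a genuine (Duchenne) smile is coded as AU6 (cheek raiser) firing together with AU12 (lip corner puller), while AU12 alone is typical of a posed smile. The AU codes are standard FACS; the dictionary and helper function are illustrative only, not Attune's implementation.

```python
# A few standard FACS Action Units (codes from the published taxonomy).
ACTION_UNITS = {
    1: "Inner brow raiser",
    4: "Brow lowerer",
    6: "Cheek raiser",
    12: "Lip corner puller",
    15: "Lip corner depressor",
}

def is_duchenne_smile(active_aus: set[int]) -> bool:
    """A genuine (Duchenne) smile is classically coded as AU6 + AU12:
    the cheek raiser firing together with the lip corner puller."""
    return {6, 12} <= active_aus

print(is_duchenne_smile({6, 12}))  # True: enjoyment smile
print(is_duchenne_smile({12}))     # False: AU12 alone suggests a posed smile
```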

The Valence-Arousal-Dominance (VAD) model, introduced in psychology in the 1970s by Albert Mehrabian and James A. Russell, provides a three-dimensional framework for representing emotional experience: Valence (positive-to-negative affect), Arousal (activation level), and Dominance (sense of control). Together these three dimensions locate any emotional state as a point in a continuous space, and a person's characteristic pattern of responses across that space forms an individual emotional fingerprint that is meaningfully distinct between people.
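
To make the geometry concrete, the sketch below represents a single emotional state as a VAD point, assuming each axis is normalised to the range -1 to 1. The VADPoint class and the coordinate values for the example states are illustrative placements, not values from Attune's system.

```python
from dataclasses import dataclass

@dataclass
class VADPoint:
    """One emotional state as coordinates in VAD space, each in [-1, 1]."""
    valence: float    # negative ... positive affect
    arousal: float    # calm ... activated
    dominance: float  # feeling controlled ... feeling in control

# Rough, illustrative placements of familiar states in the space:
excitement = VADPoint(valence=0.8, arousal=0.9, dominance=0.6)
contentment = VADPoint(valence=0.7, arousal=-0.4, dominance=0.4)
anxiety = VADPoint(valence=-0.6, arousal=0.7, dominance=-0.5)
```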

Attune's EchoDepth engine analyses 44 FACS Action Units at under 200ms latency during a 90-second video session, mapping activations to a VAD profile. This profile is matched against other profiles in the user pool using a compatibility model derived from relationship psychology research.
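
The paragraph above implies a three-stage pipeline: per-frame Action Unit intensities, a mapping into VAD space, and a pairwise compatibility score. The sketch below shows that shape under stated assumptions: the linear AU-to-VAD map and the distance-based score are stand-ins, since EchoDepth's actual model is not public.

```python
import numpy as np

N_AUS = 44  # FACS Action Units scored per frame

def frames_to_profile(au_frames: np.ndarray, au_to_vad: np.ndarray) -> np.ndarray:
    """Collapse a session of per-frame AU intensities into one VAD profile.

    au_frames: (n_frames, 44) matrix of AU intensities for a 90-second session.
    au_to_vad: (44, 3) map from AU space to (valence, arousal, dominance);
               a plain linear map here, standing in for a learned model.
    """
    vad_per_frame = au_frames @ au_to_vad   # (n_frames, 3)
    return vad_per_frame.mean(axis=0)       # time-averaged VAD vector

def compatibility(a: np.ndarray, b: np.ndarray) -> float:
    """Toy score in (0, 1]: closer VAD profiles score higher. A production
    model would be fitted to relationship outcomes, not raw distance."""
    return float(1.0 / (1.0 + np.linalg.norm(a - b)))

# Two simulated 90-second sessions at 30 fps (2,700 frames each).
rng = np.random.default_rng(0)
au_to_vad = rng.normal(scale=0.05, size=(N_AUS, 3))
p1 = frames_to_profile(rng.uniform(0, 1, (2700, N_AUS)), au_to_vad)
p2 = frames_to_profile(rng.uniform(0, 1, (2700, N_AUS)), au_to_vad)
print(round(compatibility(p1, p2), 3))
```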

Privacy and data architecture

A common concern about emotion AI dating is the privacy implications of facial analysis. Attune's architecture is built to address this directly.

All facial analysis happens on the user's device. No raw video, images, or biometric data are transmitted or stored at any point. What is stored is an anonymised mathematical emotional vector, a set of numbers representing the emotional profile, which cannot be reverse-engineered to reconstruct the user's face or to identify the user.
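
In architectural terms, the guarantee is about what crosses the network boundary: raw frames are consumed and discarded on the handset, and only a short numeric vector ever leaves it. The schematic sketch below illustrates that boundary; detect_action_units is a hypothetical stand-in for the local model, and the simple averaging is not Attune's actual derivation.

```python
import numpy as np

def detect_action_units(frame: np.ndarray) -> np.ndarray:
    """Stand-in for the local AU model: 44 intensities for one frame.
    A real app would run an on-device neural network here."""
    return np.full(44, float(frame.mean()) / 255.0)

def analyse_session_on_device(frames) -> np.ndarray:
    """Runs entirely on the handset: consumes raw frames, retains none,
    and returns only a derived numeric vector."""
    au_matrix = np.array([detect_action_units(f) for f in frames])
    return au_matrix.mean(axis=0)  # summary vector; pixels never leave scope

# Simulated capture at 1 fps for 90 seconds; only the resulting
# 44-number vector would ever cross the network boundary.
frames = (np.random.randint(0, 256, (480, 640, 3), np.uint8) for _ in range(90))
print(analyse_session_on_device(frames).shape)  # (44,)
```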

This architecture is designed so that the stored data falls outside the definition of special category biometric data under UK GDPR Article 9, which applies to biometric data processed for the purpose of uniquely identifying a person and triggers the highest level of regulatory scrutiny. The stored emotional vector is not a biometric record; it is a derived mathematical representation that identifies no one.

Is emotion AI dating the same as facial recognition?

No. Facial recognition identifies who a person is by matching facial geometry against a database. Emotion AI analyses how a person's face moves — the micro-expressions and muscle activations that accompany emotional responses — without recording or identifying the face itself.

Attune does not record your face. It does not store an image of your face. It does not use your face to identify you. It reads the pattern of muscle movements that occur during your emotional responses and derives a mathematical emotional profile from that pattern. The distinction is significant both technically and legally.

Who built the first emotion AI dating app?

Attune, built by Cavefish Ltd of Cardiff, Wales, is the first consumer application of emotion AI to dating. Cavefish developed EchoDepth — its FACS-based facial Action Unit analysis engine — for enterprise applications in defence, financial services, and sports performance before applying the technology to human connection. Attune launches in the UK in Q3 2026.

44 FACS Action Units analysed per frame
<200ms detection latency
94% beta match satisfaction
Q3 2026 UK launch

The first emotion AI dating app. Launching UK Q3 2026.

No photos. No questionnaires. No performance. Genuine emotional compatibility matching. Join the waitlist for early access and six months of Premium free.

Join the waitlist
Read the methodology →