
Stream your webcam through joint-probability EMFACS scoring, Russell circumplex affect, and pose/gaze tracking — all locally. Nothing leaves your machine unless you opt in.
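Joint-probability EMFACS scoring can be read as: each emotion prototype is a set of Action Units, and the emotion's score is the joint probability of its AUs firing together. A minimal sketch, assuming independent per-AU probabilities and a simplified prototype table (the real EMFACS table is larger, and the function and table names here are illustrative):

```javascript
// Simplified EMFACS prototype table: emotion -> required Action Units.
// (Illustrative subset; EMFACS defines more prototypes and variants.)
const PROTOTYPES = {
  happiness: ['AU6', 'AU12'],
  surprise: ['AU1', 'AU2', 'AU5', 'AU26'],
};

// Score each prototype as the product of its AU probabilities,
// treating AUs as independent. Missing AUs count as probability 0.
function scoreEmotions(auProbs) {
  const scores = {};
  for (const [emotion, aus] of Object.entries(PROTOTYPES)) {
    scores[emotion] = aus.reduce((p, au) => p * (auProbs[au] ?? 0), 1);
  }
  return scores;
}

const s = scoreEmotions({ AU6: 0.9, AU12: 0.8, AU1: 0.1 });
// happiness ≈ 0.72; surprise = 0 (required AUs missing)
```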
Sit relaxed. Look at the camera. Don't smile, don't frown — just rest your face.
A real-time mirror that turns your facial expression into Action Units, EMFACS emotions, dimensional affect, and a colour.
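How dimensional affect becomes a colour isn't spelled out above. One common convention maps the circumplex angle to hue and the distance from the neutral centre to saturation; a hypothetical sketch under that assumption (the app's actual mapping may differ):

```javascript
// Hypothetical sketch: map Russell circumplex coordinates to an HSL colour.
// valence and arousal are assumed to lie in [-1, 1]; 0 degrees = pure
// positive valence, 90 degrees = pure high arousal.
function affectToColour(valence, arousal) {
  const angle = Math.atan2(arousal, valence);            // radians, [-pi, pi]
  const hue = ((angle * 180) / Math.PI + 360) % 360;     // wrap into [0, 360)
  // Distance from the neutral centre drives saturation; clamp to 1.
  const intensity = Math.min(1, Math.hypot(valence, arousal));
  return `hsl(${Math.round(hue)}, ${Math.round(intensity * 100)}%, 50%)`;
}

console.log(affectToColour(1, 0)); // "hsl(0, 100%, 50%)"
console.log(affectToColour(0, 1)); // "hsl(90, 100%, 50%)"
```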
FACS Studio computes everything locally. The Insight feature is the one exception — it sends a 30-second summary of your affect (numbers only, no video) to a Cloudflare Worker that proxies Anthropic's API. Deploy your own from worker/api.js and paste the URL below.
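The contents of worker/api.js aren't shown here; a minimal proxy along the lines described (numbers-only summary in, Anthropic's Messages API upstream, key kept server-side) might look like the sketch below. The model name and prompt are assumptions, and the real worker/api.js may differ:

```javascript
// Hypothetical sketch of a worker like worker/api.js; the real file may differ.
// Builds the upstream request to Anthropic's Messages API from the
// numbers-only affect summary sent by the client.
function buildAnthropicRequest(summary, apiKey) {
  return {
    url: 'https://api.anthropic.com/v1/messages',
    init: {
      method: 'POST',
      headers: {
        'content-type': 'application/json',
        'x-api-key': apiKey,
        'anthropic-version': '2023-06-01',
      },
      body: JSON.stringify({
        model: 'claude-3-5-haiku-latest', // assumption: any small model works
        max_tokens: 512,
        messages: [{
          role: 'user',
          content: `Interpret this 30-second affect summary: ${JSON.stringify(summary)}`,
        }],
      }),
    },
  };
}

// Proxy handler: the API key stays on the worker, never in the browser.
async function handleRequest(request, apiKey) {
  const summary = await request.json();
  const { url, init } = buildAnthropicRequest(summary, apiKey);
  return fetch(url, init);
}

// In a Cloudflare Worker this would be wired up as the fetch handler:
// export default { fetch: (req, env) => handleRequest(req, env.ANTHROPIC_API_KEY) };
```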
Saved to localStorage on this device. Pick an EMFACS prototype, or record your own reference and practice repeatability.
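Persisting a captured reference to localStorage could be as simple as one JSON record per named expression. A hypothetical sketch (the key name and record shape are assumptions, not FACS Studio's actual schema):

```javascript
// Hypothetical storage key and record shape; the app's real schema may differ.
const STORE_KEY = 'facs-studio/references';

// Save a captured reference expression (a map of Action Unit intensities).
function saveReference(store, name, actionUnits) {
  const refs = JSON.parse(store.getItem(STORE_KEY) || '{}');
  refs[name] = { actionUnits, savedAt: Date.now() };
  store.setItem(STORE_KEY, JSON.stringify(refs));
}

function loadReference(store, name) {
  const refs = JSON.parse(store.getItem(STORE_KEY) || '{}');
  return refs[name] || null;
}

// In the browser, `store` would be window.localStorage; a Map-backed stub
// with the same getItem/setItem interface works outside the browser.
const memoryStore = {
  data: new Map(),
  getItem(k) { return this.data.has(k) ? this.data.get(k) : null; },
  setItem(k, v) { this.data.set(k, String(v)); },
};

saveReference(memoryStore, 'smile', { AU6: 0.8, AU12: 0.9 });
console.log(loadReference(memoryStore, 'smile').actionUnits.AU12); // 0.9
```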
Hold the expression you want to save. Capture begins after the countdown.
Add images or videos. They play full-stage in sequence while your facial response is recorded — perfect for affect studies, ad reaction tests, or reference-emotion training.