Capture high-fidelity gaze signals without hardware, plugins, or installs.

For teams who
Understand human behaviour by measuring where people look during online tasks.
Measure attention, memory, and decision-making in scalable online studies.
Test landing pages and creative. Quantify what gets noticed to improve messaging and conversion.
Validate layouts and information hierarchy. See which elements users miss, ignore, or fixate on.
How it works
Chiasm stays out of the way—integrate once, run anywhere.
Build your experiment in jsPsych, Qualtrics, or any browser-based platform.
Add a lightweight script to your workflow to start capturing gaze data (see the sketch after these steps).
Export or stream clean, timestamped data for analysis or reporting.
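For illustration, the “lightweight script” step might look roughly like the sketch below. The script URL, the window.Chiasm global, and its method names are placeholders for this example, not the exact published API.

    // Illustrative sketch only: the URL and the window.Chiasm methods are
    // placeholder names, not Chiasm's documented API.
    const script = document.createElement('script');
    script.src = 'https://cdn.chiasm.example/chiasm.js';          // placeholder URL
    script.onload = async () => {
      await window.Chiasm.init({ apiToken: 'YOUR_API_TOKEN' });   // connect with your token
      window.Chiasm.start();                                      // begin capturing gaze data
    };
    document.head.appendChild(script);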
Why Chiasm
Built for speed, scale, and seamless integrations.
FAQ
A few quick answers to set expectations up front.
How accurate is webcam-based eye tracking?
Accuracy depends on webcam quality, lighting, head movement, and screen setup. Under favorable conditions, Chiasm is designed to approach the performance of hardware-based eye trackers. In preliminary internal evaluations, we observed a mean gaze offset (accuracy) of ~1.5° of visual angle and a mean spatial precision of ~1.0°.
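To put those numbers in everyday terms, here is a rough conversion of a 1.5° offset to on-screen size, assuming a participant sits about 60 cm from the display (an assumed distance for illustration; it varies in practice):

    // Rough on-screen size of a 1.5° gaze offset, assuming a 60 cm viewing
    // distance (an assumption for illustration, not a measured value).
    const distanceCm = 60;
    const degrees = 1.5;
    const offsetCm = 2 * distanceCm * Math.tan((degrees / 2) * Math.PI / 180);
    console.log(offsetCm.toFixed(2) + ' cm');   // ~1.57 cm, roughly a fingertip-sized region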
Which browsers and devices are supported?
Currently, Chiasm works best in Google Chrome, and we plan to support all modern browsers over time. Chiasm runs on laptops and desktops equipped with a webcam.
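Independently of Chiasm, you can pre-screen participants with standard browser APIs, for example to confirm a webcam is present before the study starts:

    // Standard browser API (not Chiasm-specific): check whether any video
    // input device is available on the participant's machine.
    async function hasWebcam() {
      const devices = await navigator.mediaDevices.enumerateDevices();
      return devices.some((device) => device.kind === 'videoinput');
    }

    hasWebcam().then((ok) => console.log(ok ? 'webcam found' : 'no webcam detected'));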
How do I integrate Chiasm into my study?
After sign-up, you’ll receive an API token (a secret key used to connect your study and track usage). Add a lightweight JavaScript snippet to your study flow (jsPsych, Qualtrics, or custom); participants complete a short calibration, and your study then runs as usual while gaze data is captured.
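As a sketch of that flow in jsPsych 7 (the jsPsych calls are real; the window.Chiasm object and its calibrate method are placeholder names standing in for the snippet’s actual interface):

    // Assumes jsPsych 7+ with the call-function and html-keyboard-response
    // plugins loaded on the page. The window.Chiasm call is a placeholder.
    const jsPsych = initJsPsych();

    const calibration = {
      type: jsPsychCallFunction,
      async: true,
      func: (done) => window.Chiasm.calibrate().then(done)   // hypothetical calibration step
    };

    const trial = {
      type: jsPsychHtmlKeyboardResponse,
      stimulus: '<p>Which headline do you prefer? Press F or J.</p>',
      choices: ['f', 'j']
    };

    jsPsych.run([calibration, trial]);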
What data do I get?
You get timestamped gaze data (x, y coordinates) linked to your custom study events/segments, plus session metadata such as screen resolution/size and calibration data.
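Purely to illustrate the shape of that output (the field names and values below are made up for this example, not the actual export schema), a single gaze sample might look something like:

    // Made-up field names and values, shown only to illustrate the kind of
    // record you get back; not the actual export schema.
    const exampleGazeSample = {
      timestampMs: 1718290000123,     // when the sample was captured
      x: 812,                         // gaze x coordinate in screen pixels
      y: 436,                         // gaze y coordinate in screen pixels
      segment: 'landing_page_A',      // your custom study event/segment
      session: {
        screenWidth: 1920,            // screen resolution metadata
        screenHeight: 1080,
        calibrationErrorDeg: 1.4      // calibration quality in degrees of visual angle
      }
    };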
Can the data go straight to my own server?
Yes. If your study runs on your own setup, Chiasm can send gaze data directly to your server (including a local computer or your own cloud server). If you’re using a platform like Qualtrics, the data is stored on Chiasm’s servers and you export it afterward.
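The receiving end can be very small. Below is a minimal sketch using Node.js and Express, assuming gaze samples arrive as a JSON array; the /gaze route and payload shape are assumptions for illustration, not a documented protocol:

    // Minimal endpoint on your own server for receiving streamed gaze data.
    // The route and payload shape are assumptions for illustration only.
    const express = require('express');
    const app = express();
    app.use(express.json({ limit: '5mb' }));   // gaze streams can be large

    app.post('/gaze', (req, res) => {
      const samples = req.body;                // e.g. an array of { timestampMs, x, y, segment }
      console.log(`received ${samples.length} gaze samples`);
      // ...append to a file or write to your database here...
      res.sendStatus(204);
    });

    app.listen(3000, () => console.log('gaze endpoint listening on port 3000'));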
How does pricing work?
Pricing is usage-based: you pay only for the eye-tracking data you choose to process. You can enable eye tracking for specific parts of a study (events/segments) only, and you won’t be charged for periods where gaze data isn’t needed, such as instructions, breaks, or questionnaires.
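In practice, that scoping could be a pair of calls around the parts you care about; the method names below are placeholders, not the exact API:

    // Placeholder method names, shown to illustrate scoping capture (and
    // therefore billing) to specific segments of a study.
    window.Chiasm.startSegment('ad_exposure');   // processing and billing start here
    // ...participant views the ad stimulus...
    window.Chiasm.endSegment('ad_exposure');     // processing stops

    // Instructions, breaks, and questionnaires outside these calls send no
    // frames for processing, so they don't count toward usage.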
How is usage tracked?
Usage is tracked per API token (the secret key that identifies which study or app is sending data). In the dashboard, you can assign time credits to tokens for each study and monitor usage as your study runs.
Do you store webcam video?
By default, Chiasm does not store webcam video. We process frames in real time to produce derived gaze signals and session metadata. If you explicitly opt in, we may store limited calibration/validation recordings for internal quality improvement of the eye-tracking system.
Where does the processing happen?
Chiasm uses cloud-based processing: webcam frames are streamed to the cloud and processed in real time, without being stored. This enables consistent processing and quality control across devices.
Is Chiasm GDPR compliant?
Yes. We carefully follow GDPR requirements in our processes and data-handling protocols. For details, see our privacy policy.
Request demo
Tell us about your study and we’ll help you launch quickly. Typical response time: 1 business day.