
How Blinkwell measures blink rate on a Mac, privately

by Triantafyllos (Rose) Samaras · May 1, 2026 · 5 min read

Blinkwell watches your eyes from the menu bar. It runs entirely on your Mac, never sees the network, and doesn’t store a single camera frame. Here’s exactly how that works.

The signal Blinkwell tracks

Blinkwell tracks two things in real time:

  1. Blink rate and blink completeness: how many times per minute you blink, and what fraction of those blinks are full (the upper lid fully reaches the lower lid) versus incomplete.
  2. Posture: whether your head is drifting forward or your shoulders are collapsing into a slouch over time.

These are the two physical signals most strongly correlated with computer vision syndrome and screen-time fatigue. Tracking them in real time is what separates Blinkwell from a wall-clock break reminder.

Apple’s Vision framework, on-device

Blinkwell uses Apple’s built-in Vision framework, the same one Photos uses to detect faces, to pull two classes of features from the FaceTime camera:

  • VNDetectFaceLandmarksRequest for eye landmarks (used to compute the eye-aspect ratio, and to register a blink event each time that ratio drops below a threshold).
  • VNDetectHumanBodyPoseRequest for shoulder and head position over time.

Both run as native Apple Silicon ML models, accelerated on the Neural Engine. A single inference takes a few milliseconds, and Blinkwell samples at roughly 5 to 10 Hz, which is more than enough to pick up every blink without taxing the CPU.
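The eye-aspect-ratio computation behind those blink events can be sketched in a few lines. This is an illustrative Python version of the standard six-landmark EAR formula, not Blinkwell's actual Swift code; the landmark ordering, the thresholds, and the completeness cutoff are all assumptions for the sake of the example.

```python
import math

def eye_aspect_ratio(pts):
    """Classic six-landmark EAR: (|p2-p6| + |p3-p5|) / (2 * |p1-p4|).

    pts is [(x, y), ...] where p1/p4 are the horizontal eye corners and
    p2, p3 (upper lid) pair vertically with p6, p5 (lower lid).
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = pts
    return (dist(p2, p6) + dist(p3, p5)) / (2 * dist(p1, p4))

BLINK_THRESHOLD = 0.2       # assumed: the eye counts as closing below this
COMPLETE_THRESHOLD = 0.08   # assumed: a full blink dips at least this low

def classify_blink(min_ear_during_blink):
    """A blink is 'complete' if the lids met, 'incomplete' otherwise."""
    if min_ear_during_blink <= COMPLETE_THRESHOLD:
        return "complete"
    return "incomplete"

# Synthetic landmarks: an open eye and a nearly closed one.
open_eye   = [(0, .5), (.3, .8),  (.7, .8),  (1, .5), (.7, .2),  (.3, .2)]
closed_eye = [(0, .5), (.3, .52), (.7, .52), (1, .5), (.7, .48), (.3, .48)]

print(eye_aspect_ratio(open_eye))    # 0.6
print(eye_aspect_ratio(closed_eye))  # 0.04
```

In the real pipeline the six points would come from the `leftEye`/`rightEye` landmark regions that Vision returns; the ratio is scale-invariant, so it works regardless of how far you sit from the camera.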

What “on-device” really means here

  • The camera frame is decoded into a CVPixelBuffer, passed to Vision, and discarded as soon as the inference returns.
  • Frames are never written to disk, not even to a temp file.
  • Frames are never sent over the network. Blinkwell has no remote inference, no analytics on raw images, and no upload path for video.
  • The only data that persists is a small local time series of blink and posture events (timestamp plus signal), used to power your history.
  • You can verify this. Blinkwell ships with the macOS app sandbox entitlement, requests only the com.apple.security.device.camera capability, and declares no outbound network entitlement on the helper that talks to the camera.
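For the curious, the entitlements described in that last bullet look roughly like this. The sandbox and camera keys are Apple's real entitlement identifiers; everything else about this plist is a sketch, not Blinkwell's shipped file.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- App Sandbox enabled -->
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <!-- Camera access: the only device capability requested -->
    <key>com.apple.security.device.camera</key>
    <true/>
    <!-- Note what is absent: no com.apple.security.network.client,
         so the sandboxed process cannot open outbound connections. -->
</dict>
</plist>
```

Because the process is sandboxed, the missing network-client entitlement is enforced by the OS, not by app code: there is no upload path to audit because none can exist.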

This is the right architecture for a tool that lives on your camera. It is also the only architecture that survives an ophthalmology-app threat model: anything that uploads frames, regardless of how well-meaning, is one breach away from a privacy nightmare.

Why nudges feel different

A wall-clock break app fires every 20 minutes. Blinkwell fires on the body. The decision logic is roughly:

  • Track a rolling window of blink rate and blink completeness.
  • If average blink rate falls below ~9 blinks/min for several minutes, suggest a soft “blink break”.
  • If posture stays in a forward-head pattern for more than a few minutes, suggest a posture nudge.
  • If neither signal is unusual, stay silent, even past the “20 minutes” mark.
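The bullets above amount to a rolling-window check with a sustain condition. Here is a minimal Python sketch of that logic; the ~9 blinks/min threshold comes from the post, but the window lengths, names, and structure are illustrative assumptions, not Blinkwell's implementation.

```python
from collections import deque

WINDOW_S = 60       # rolling window for the rate estimate
SUSTAIN_S = 180     # "for several minutes": low rate must persist this long
LOW_RATE = 9.0      # blinks per minute, below which a nudge is considered

class BlinkNudger:
    def __init__(self):
        self.blinks = deque()   # timestamps (seconds) of recent blinks
        self.low_since = None   # when the rate first dipped below LOW_RATE

    def record_blink(self, t):
        self.blinks.append(t)

    def blink_rate(self, t):
        # Drop blinks that have aged out of the window, then scale to /min.
        while self.blinks and self.blinks[0] < t - WINDOW_S:
            self.blinks.popleft()
        return len(self.blinks) * 60.0 / WINDOW_S

    def should_nudge(self, t):
        """True only once the rate has stayed below LOW_RATE for SUSTAIN_S."""
        if self.blink_rate(t) < LOW_RATE:
            if self.low_since is None:
                self.low_since = t
            return t - self.low_since >= SUSTAIN_S
        self.low_since = None   # rate recovered: reset the sustain timer
        return False

nudger = BlinkNudger()
for t in range(0, 60, 4):        # ~15 blinks/min: healthy
    nudger.record_blink(t)
print(nudger.should_nudge(60))   # False: no nudge while the rate is fine
```

The posture nudge would follow the same shape with a forward-head signal instead of blink timestamps, and staying silent is just the default: no sustained anomaly, no notification.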

The result is fewer, better-timed nudges. In my own usage, that turned out to be the difference between an app I closed and an app I keep running.

What Blinkwell is not

  • It is not a medical device. It does not diagnose dry eye disease, presbyopia, or any other condition.
  • It does not record video. There is no “timeline view” of your face.
  • It does not run a cloud service. There is no Blinkwell account.

Quick answers

Does Blinkwell upload my camera feed?
No. Frames are processed in memory using Apple’s Vision framework on your Mac and discarded immediately. They are never written to disk and never sent over the network.
How does Blinkwell detect blinks?
It uses Apple’s on-device face-landmark detector to compute an eye-aspect ratio many times per second. A blink is registered when that ratio crosses a threshold; a full versus incomplete blink is classified by how far the upper lid travels.
Does Blinkwell work without an internet connection?
Yes. Eye tracking, posture detection, and break logic all run locally. The internet is only used for licence verification and app updates, both completely separate from the camera path.
Why is on-device eye tracking better?
Privacy and latency. Inference happens in milliseconds with no network round-trip, nothing leaves your computer, and the architecture has no surface area for image breaches.
about the author

Triantafyllos (Rose) Samaras, Founder, Blinkwell

Builds small, opinionated software out of Athens. Spends most of the day staring at a screen, which is how Blinkwell happened.
