2026-02-19 · signals · frustration-detection · competitive

Rage Clicks Are Not Enough

Detecting rage clicks catches 1 of 18 frustration signals. Here's what you're missing.

Microsoft Clarity detects rage clicks. PostHog detects rage clicks. Hotjar, on the $80/month plan, detects rage clicks. That's the whole frustration detection story for most tools. One signal.

Rage clicks are real. When a user clicks the same button 5 times in 2 seconds, they're frustrated. No argument there. But rage clicking is the most visible, most dramatic expression of frustration. It's the user yelling. Most frustrated users don't yell. They go quiet. They hesitate. They loop. They leave.

91% of unsatisfied users don't complain. They just bounce. If your frustration detection only catches the ones who slam the mouse, you're surveilling the loudest 9% and ignoring everyone else.

What frustration actually looks like

I watched (I know, I know) about 400 session recordings across two SaaS products while building Flusterduck. I was cataloging behaviors. What do confused people do before they abandon a page?

They don't always rage click. Here's what I saw, over and over:

They pause on form fields. A user focuses an input labeled "Company Size" and sits there for 8 seconds. Do you want the number of employees? Revenue bracket? Headcount? The label is ambiguous. They type something, delete it, type something else. This is form hesitation. Weight: 12.

They loop between pages. Pricing, then features, back to pricing, then docs, back to pricing. They're trying to understand the product before buying, and the content isn't answering their questions. This is loop navigation. Weight: 20.

They scroll up and down the same section. Past the pricing table, back up, past it again. The information is there but it isn't clear. This is scroll bounce. Weight: 8.

They click things that look clickable but aren't. A styled div with a hover effect and pointer cursor, no click handler. Dead click. Weight: 12. The distance to the nearest real interactive element is diagnostic. A 4px miss means your tap target is too small. A 150px miss means they thought a non-interactive element was a button.

They rotate their phone. Twice, three times in 30 seconds. The responsive layout isn't working at their viewport size and they're searching for one that does. Orientation thrash. Weight: 8.

They Tab furiously. 14 Tab presses in 5 seconds, cycling through nav elements trying to reach a form they can see but can't focus. This is tab-thrashing. Weight: 15. No other UX monitoring tool detects this. Zero.
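To make one of these concrete, here is a minimal sketch of a tab-thrashing detector. The event shape, function name, and thresholds are illustrative assumptions, not Flusterduck's actual implementation; the idea is simply "flag a burst of Tab presses inside a short sliding window."

```typescript
// Hypothetical sketch: detect a burst of Tab presses (e.g. 10+ within 5 s).
// Thresholds and names are assumptions for illustration only.
interface KeyPress {
  key: string; // KeyboardEvent.key value, e.g. "Tab"
  ts: number;  // timestamp in ms
}

function isTabThrashing(
  events: KeyPress[],
  minPresses = 10,
  windowMs = 5000
): boolean {
  // Keep only Tab presses, sorted by time.
  const tabs = events
    .filter(e => e.key === "Tab")
    .map(e => e.ts)
    .sort((a, b) => a - b);

  // Slide a window of minPresses consecutive Tab presses; if any such
  // window fits inside windowMs, call it thrashing.
  for (let i = 0; i + minPresses - 1 < tabs.length; i++) {
    if (tabs[i + minPresses - 1] - tabs[i] <= windowMs) return true;
  }
  return false;
}
```

A user pressing Tab 14 times in 5 seconds trips this check; someone tabbing through a form at a normal one-press-per-second pace does not.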

The co-occurrence problem

Any single signal, alone, could be innocent. A rage click on a carousel arrow is just someone advancing slides quickly. A scroll bounce on a blog post is someone rereading a paragraph they found interesting.

But signals compound. When the same page shows rage clicks AND dead clicks AND form hesitation from the same cluster of users in a 15-minute window, that page is broken. Individually, maybe not alarming. Together, absolutely.

Flusterduck applies a co-occurrence multiplier. 10+ users frustrated within 5 minutes: 1.5x. 20+: 2.0x. 50+: 2.5x. Simultaneous frustration means something is actively broken. Scattered frustration across hours is noise.
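The multiplier tiers above can be sketched as a simple step function. The tier values come from the post; the function name and the "1.0x below 10 users" floor are assumptions.

```typescript
// Sketch of the co-occurrence multiplier described above.
// Tiers (10+ → 1.5x, 20+ → 2.0x, 50+ → 2.5x) are from the post;
// the no-boost floor of 1.0 is an assumption.
function coOccurrenceMultiplier(frustratedUsers: number): number {
  if (frustratedUsers >= 50) return 2.5;
  if (frustratedUsers >= 20) return 2.0;
  if (frustratedUsers >= 10) return 1.5;
  return 1.0;
}
```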

No rage-click-only tool can do this, because it doesn't have enough signal types to triangulate.

What the numbers look like

Flusterduck tracks 18 signals across four categories: 9 desktop, 5 mobile/touch, 3 accessibility, 1 content. Each is weighted between 6 and 25 based on correlation with actual abandonment. Rage clicks are the heaviest at 25, but they're 1 of 18 inputs.
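One plausible way to combine per-signal weights with the co-occurrence multiplier is a weighted sum. The individual weights below are the ones stated in this post; the aggregation itself, the signal names, and the function are assumptions for illustration.

```typescript
// Sketch: combine per-signal weights into a page-level frustration score.
// Weights are the ones quoted in the post; everything else is hypothetical.
const SIGNAL_WEIGHTS: Record<string, number> = {
  rageClick: 25,
  loopNavigation: 20,
  focusTrap: 20,
  tabThrashing: 15,
  formHesitation: 12,
  deadClick: 12,
  scrollBounce: 8,
  orientationThrash: 8,
};

// Sum the weights of the observed signals, then apply the co-occurrence
// multiplier (1.0 when frustration is scattered, up to 2.5 when clustered).
function frustrationScore(signals: string[], multiplier = 1.0): number {
  const base = signals.reduce((sum, s) => sum + (SIGNAL_WEIGHTS[s] ?? 0), 0);
  return base * multiplier;
}
```

For example, a page showing rage clicks, dead clicks, and form hesitation from a 10-user cluster would score (25 + 12 + 12) × 1.5 = 73.5, well above any single signal on its own.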

A tool that only detects rage clicks is measuring frustration with a thermometer that only reads fevers.

Accessibility frustration is invisible everywhere else

The three accessibility signals (tab-thrashing, focus traps, keyboard navigation frustration) don't exist in any competing product. Not Clarity. Not FullStory. Not LogRocket. Nobody is monitoring whether keyboard and screen reader users can actually use the site.

A focus trap, where a keyboard user is stuck in a modal and can't escape, is weighted at 20. Same as loop navigation. Because being literally trapped in a UI element is at least as frustrating as going in circles between pages.

If your frustration detection ignores keyboard users, you're not detecting frustration. You're detecting mouse frustration and calling it done.
