Bounce Rate Tells You They Left. Confusion Score Tells You Why.
A high bounce rate is a symptom. The cause is almost always confusion on a specific page element.
Our /pricing page had a 72% bounce rate. Every week at standup, someone mentioned it. "We should redesign the pricing page." "Maybe we need clearer copy." "The tiers are confusing."
Nobody knew which of those things was true. We were guessing at cause from an outcome metric.
What bounce rate actually measures
Bounce rate is the percentage of visitors who leave after viewing one page. Useful in aggregate. A 90% bounce rate on your homepage is probably bad. 50% might be fine depending on the content type.
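The computation behind the metric is trivial, which is part of the point. A minimal sketch, assuming session records that list the pages viewed (the data shape here is illustrative, not any specific analytics tool's schema):

```python
def bounce_rate(sessions):
    """Percentage of sessions that viewed exactly one page."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if len(pages) == 1)
    return 100.0 * bounces / len(sessions)

sessions = [
    ["/pricing"],                 # bounce
    ["/pricing", "/signup"],      # not a bounce
    ["/pricing"],                 # bounce
    ["/", "/pricing", "/docs"],   # not a bounce
]
print(bounce_rate(sessions))  # → 50.0
```

One counter divided by another. Everything that happened inside those single-page sessions is invisible to it.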
It tells you how many people left. Nothing else.
You can segment it by source, device, browser. You can compare it across time periods. At the end of that analysis, you have a better-described outcome. You still don't have a cause.
What confusion score actually measures
A confusion score is a deviation from baseline. Not "how many people bounced" but "are users exhibiting more frustration signals on this page right now than what's normal for this page?"
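"Deviation from baseline" can be sketched as a comparison of the current window's frustration-event rate against the page's own history. The z-score approach and the scaling to a 0-100 range below are assumptions for illustration, not the actual scoring formula:

```python
from statistics import mean, stdev

def confusion_score(history, current):
    """history: frustration events per window over past periods.
    current: events in the window being scored right now."""
    mu = mean(history)
    sigma = stdev(history) or 1.0       # guard against a flat history
    z = (current - mu) / sigma          # how abnormal is right now?
    return max(0.0, min(100.0, z * 20)) # clamp to a 0-100 scale

baseline = [4, 6, 5, 7, 5, 6, 4, 5]
print(confusion_score(baseline, 5))   # near baseline → low score
print(confusion_score(baseline, 25))  # far above baseline → high score
```

The key property is that the score is relative to this page. Twenty-five frustration events per hour might be normal on a busy homepage and a five-alarm fire on a pricing page.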
When our pricing page score hit 71, the element breakdown was unambiguous: 84% of the frustration events were dead clicks on the feature comparison table. Specifically, a column header that had a hover effect and looked like it would show a tooltip. It wasn't interactive. Users clicked it expecting an explanation of "Seat-level access." Nothing happened. They clicked harder. They left.
We added the tooltip. Confusion score dropped to 28. Bounce rate dropped 14 percentage points.
Same copy. Same tiers. Same pricing. One non-interactive element made interactive.
The diagnostic gap
Google Analytics, Mixpanel, Amplitude: all excellent at measuring what happened. Funnels, retention, activation, conversion. These tools are built around actions. Page view, button click, form submit, purchase.
None of them tell you what didn't happen. The click that went nowhere. The form field stared at for 11 seconds. The back-and-forth between the same two pages, seven times, before someone gave up.
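Detecting "what didn't happen" means looking for the absence of an effect. An illustrative sketch of dead-click detection: a click that produces no navigation, DOM change, or network activity within a short window went nowhere. The event fields and the one-second window are assumptions, not any real tracker's schema:

```python
DEAD_CLICK_WINDOW_MS = 1000

def find_dead_clicks(events):
    """events: time-ordered dicts with 'ts' (ms), 'type', 'selector'."""
    dead = []
    for i, ev in enumerate(events):
        if ev["type"] != "click":
            continue
        # Did anything observable change shortly after the click?
        reacted = any(
            e["type"] in ("navigation", "dom_change", "network")
            and e["ts"] - ev["ts"] <= DEAD_CLICK_WINDOW_MS
            for e in events[i + 1:]
        )
        if not reacted:
            dead.append(ev["selector"])
    return dead

events = [
    {"ts": 0,    "type": "click", "selector": "th.seat-level"},
    {"ts": 200,  "type": "click", "selector": "th.seat-level"},  # clicked harder
    {"ts": 1200, "type": "click", "selector": "a.signup"},
    {"ts": 1400, "type": "navigation", "selector": None},
]
print(find_dead_clicks(events))  # → ['th.seat-level', 'th.seat-level']
```

The two clicks on the column header never produced a reaction, so both surface as dead clicks; the signup click triggered a navigation and does not.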
Confusion score is diagnostic where bounce rate is symptomatic. You still need both. Bounce rate tells you which pages to investigate. Confusion score tells you what to fix when you get there.
The co-occurrence signal
One dead click on a pricing table is a user misclick. Fifty dead clicks on the same element from 34 different users in 90 minutes means the design is actively misleading people.
The confusion score's co-occurrence multiplier exists for this. Distributed frustration across days is background noise. Clustered frustration within a short time window means something is broken right now, or has always been broken and enough users are hitting it simultaneously to confirm it.
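One way to sketch such a multiplier: weight the raw event count by how densely it clusters in a time window across distinct users. The window size and the volume-times-users weighting below are assumptions for illustration:

```python
def cooccurrence_weight(events, window_s=5400):
    """events: list of (timestamp_s, user_id) for dead clicks on one
    element. Returns (raw count, weighted count); the weight boosts
    dense clusters hit by many distinct users."""
    events = sorted(events)
    best = 0
    for i, (t0, _) in enumerate(events):
        in_window = [u for t, u in events[i:] if t - t0 <= window_s]
        users = len(set(in_window))
        best = max(best, len(in_window) * users)  # volume x distinct users
    return len(events), best

# 50 dead clicks from 34 users inside ~90 minutes...
clustered = [(i * 100, f"u{i % 34}") for i in range(50)]
# ...versus the same 50 clicks spread out over days.
spread = [(i * 7200, f"u{i % 34}") for i in range(50)]

print(cooccurrence_weight(clustered))  # → (50, 1700)
print(cooccurrence_weight(spread))     # → (50, 1)
```

Same raw count, wildly different weights. The clustered case is the one that should page someone.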
An alert fires. You investigate. The element is identified. You fix it. The score drops. That loop doesn't exist if all you're watching is a bounce rate that updates once a day.
What this means in practice
Bounce rate and confusion score answer different questions. One is "are people leaving?" The other is "what is making them leave?" You need a clear answer to the second before you can fix the first.
The pricing page had a 72% bounce rate for four months. The confusion score would have told us about the column header in the first week.