PostHog Does a Lot. UX Monitoring Isn't One of Those Things.
PostHog combines product analytics, session replay, feature flags, and experiments in one platform. Impressive breadth. It also doesn't tell you when your checkout confusion score spikes after a deploy.
PostHog is the most honest product in this category. Open source, self-hostable, priced on actual usage, and built with engineers as the primary audience. Over 90% of companies on PostHog use the free tier. That stat tells you something: the tool is accessible enough that most teams never outgrow the free version for their core use cases.
I use PostHog. Most of the teams I talk to use PostHog. This post isn't arguing against it.
It's arguing that "PostHog plus nothing" leaves a specific gap that product teams discover late, usually when support tickets start piling up from a page nobody was watching.
What PostHog actually covers
Product analytics: event tracking, funnels, retention, cohorts, user paths. Session replay with element inspection. Feature flags with gradual rollouts. A/B experiments with statistical significance tracking. A survey product. A data warehouse connector.
It's a lot. The cloud pricing reflects that: free up to generous event limits, then usage-based from there. PostHog's $450/month plan covers 50M events, reasonable for a mid-sized product.
The engineering investment for self-hosting is real. Not prohibitive for a team with infrastructure experience, but not zero either. Most teams use the cloud version.
Where the gap is
PostHog's session replay shows you what users did. Its funnels show you where users dropped off. Neither tells you when a page's behavioral quality degrades in real time.
The scenario that exposes the gap: you ship a deploy on a Tuesday afternoon. Error rates are stable in PostHog. Conversion funnel looks normal. Three days later, you notice the pricing page conversion rate is down 11%.
PostHog can help you investigate after you notice the problem. Session replay lets you find sessions from around the deploy time. Funnels show when the drop started.
What it won't do: fire a Slack alert at 3:47 PM Tuesday when the confusion score on your pricing page went from 24 to 68, 90 minutes after the deploy, with 79% of the signal concentrated on dead clicks on a feature comparison row that broke in the new layout.
That's deploy correlation. PostHog doesn't have it.
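The mechanics of deploy correlation are simple to sketch, even if Flusterduck's actual scoring is its own. Assuming you have a numeric per-page confusion score sampled over time, the check is just "did the post-deploy mean jump past some multiple of the pre-deploy baseline?" The function name and threshold below are illustrative, not anything's real API:

```python
from statistics import mean

def deploy_alert(scores_before, scores_after, threshold=1.5):
    """Flag a deploy when the mean confusion score after it exceeds
    `threshold` times the pre-deploy baseline. Hypothetical sketch:
    real systems would also want minimum sample sizes and a
    significance test before paging anyone."""
    baseline = mean(scores_before)
    current = mean(scores_after)
    return current > baseline * threshold, baseline, current

# Pricing-page confusion scores sampled before and after a deploy
fired, before, after = deploy_alert([22, 25, 24, 26], [61, 70, 68])
```

With the numbers from the scenario above (a baseline around 24 jumping to the high 60s), the alert fires; a deploy that leaves scores flat does not.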
The behavioral signal gap
PostHog tracks what users did. Flusterduck tracks how users felt while doing it.
Rage clicks, loop navigation, form hesitation, focus traps, scroll hijack: these are five of the 18 behavioral signals that are invisible in standard event tracking. A rage click looks like a regular click in PostHog's event log. Five rage clicks in two seconds on the same element is a user in crisis. Without the behavioral layer, you can't tell the two apart.
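To make the distinction concrete, here is a minimal sketch of rage-click detection using the "five clicks in two seconds on the same element" rule from above. This is an illustration of the pattern, not Flusterduck's implementation; the window and threshold are the assumptions stated in the text:

```python
def is_rage_click(timestamps, window=2.0, threshold=5):
    """Return True if `threshold` or more clicks on the same element
    fall inside any `window`-second sliding span. `timestamps` are
    click times in seconds for a single element."""
    timestamps = sorted(timestamps)
    for i in range(len(timestamps) - threshold + 1):
        # If the 1st and 5th click of this run are close enough,
        # the whole run fits in the window.
        if timestamps[i + threshold - 1] - timestamps[i] <= window:
            return True
    return False
```

Five clicks spread over 1.4 seconds trips the detector; five clicks spread evenly over four seconds does not. The point is that both look identical as individual rows in an event log: the signal only exists once you look at the timing pattern.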
Research on cursor behavior from Jeff Huang's CHI 2011 study found that cursor movement correlates strongly with user attention and intent. When cursor behavior becomes erratic (rapid back-and-forth, repeated targeting of non-responsive elements), it's a signal that something on the page is wrong. PostHog's event tracking records the clicks. It doesn't record the pattern.
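One crude way to quantify "erratic" is to count direction reversals in the cursor trace per second; calm, goal-directed movement reverses rarely, while rapid back-and-forth over a broken element reverses constantly. The metric below is a simplified illustration of the idea, not a claim about how any product scores cursor data:

```python
def reversal_rate(xs, duration):
    """Count horizontal direction reversals per second in a cursor
    x-coordinate trace. A high rate is a rough proxy for the
    back-and-forth movement described above."""
    reversals = 0
    last_dir = 0
    for a, b in zip(xs, xs[1:]):
        d = (b > a) - (b < a)  # -1, 0, or +1
        if d and last_dir and d != last_dir:
            reversals += 1
        if d:
            last_dir = d
    return reversals / duration
```

A trace that zigzags between two points scores several reversals per second; a smooth sweep across the page scores zero.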
Who should use PostHog alone vs. PostHog plus monitoring
Teams under 500 daily active users: PostHog alone is probably fine. Session volume is low enough that you can manually review recordings when something seems off.
Teams above 2,000 daily active users: the volume makes manual investigation slow, and the statistical noise means problems can hide in funnel data for days before they're visible. Proactive monitoring that surfaces problems to you (rather than waiting for you to look) starts paying for itself.
Teams shipping multiple deploys per week: deploy correlation has direct ROI. Every deploy that degrades UX is costing conversions until someone notices. Shrinking that window from days to 90 minutes is measurable.
PostHog is the best tool I know for understanding what users do. Flusterduck is built to tell you when they're struggling. Both questions matter. The gap between them is where confusing products live undetected.