Making Sense of AI Insights: It’s All About Framing
AI can summarize information fast. That part’s easy. What’s challenging is making sure what comes out is actually useful - not just a wall of text, but insights you can act on.
We’ve seen this play out clearly with Otter vs. Gong.
Otter is great at summarizing - you get transcripts, bullet points, speaker tags - but it’s general-purpose. It doesn’t tell you what to do with that info. You still have to connect the dots yourself.
Gong, on the other hand, extracts insights with a goal - helping you become a better sales rep or account manager. It highlights objection patterns, talk ratios, and deal risks. It frames the data in a way that pushes you to improve.
Same data. One is just information. The other is insight. That’s the difference framing makes.
Our Philosophy: Show Less, Mean More
We’re not here to flood you with every possible data point. Our goal is to surface what’s essential - and cut the noise.
Yes, we can analyze everything that happens in a session replay. But that doesn’t mean we should. Instead, we focus on the pieces that actually drive understanding.
Learning from the Best
Companies like Gong and Fireflies have raised the bar when it comes to extracting meaningful insights. They didn’t just build a note-taker - they built a feedback loop. The tech works because it’s aligned to a purpose.
That’s what we’ve taken to heart. We’re doing something similar for session replays: helping teams understand what’s actually happening in their product and why certain users struggle.
How We Approach Insight Extraction
You can point AI at a dataset in a hundred ways. But unless you set it up with the right framework, you’ll end up with noise - not clarity.
So we built a structure that keeps us (and you) honest. We separate facts from guesses and make it clear what’s what.
Quantitative – What happened, how often, to whom.
Qualitative – Why we think it happened, always labeled as a hypothesis - not a fact.
This avoids the trap of treating AI output like gospel. It’s a guide, not a final answer.
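To make that separation concrete, here’s a minimal sketch of what such a record could look like. (These type names are illustrative, not our actual schema.)

```typescript
// Illustrative only - not our real schema. The point is the shape:
// facts and hypotheses live in separate, clearly labeled fields.

interface QuantitativeFact {
  metric: string;         // what happened, e.g. "rage clicks on Save"
  occurrences: number;    // how often
  affectedUsers: number;  // to whom
}

interface Hypothesis {
  explanation: string;    // why we think it happened
  kind: "hypothesis";     // always labeled as a guess, never a fact
}

interface Insight {
  facts: QuantitativeFact[];
  hypotheses: Hypothesis[];
}
```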
Tone and Format Matter
Even the best insight can be ignored if it’s buried in the wrong format. We care a lot about how things are presented - clarity, structure, and confidence without overreach.
So everything we share follows this format:
User intent – What were they trying to do?
User actions – What did they actually do? (optional)
Points of friction – Where did they get stuck?
Possible causes – Why might this be happening?
Frequency – How often does it happen?
Reach – How many users or sessions are affected?
We always include sources and link directly to the session or moment in question. If we can’t support an insight with context or coverage, we don’t include it.
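Put together, a shared insight could be shaped something like this sketch. The field names are hypothetical - the actual output is written prose - but every insight carries these same pieces.

```typescript
// Hypothetical field names; the real output is prose,
// but the structure behind it is the same.

interface SessionSource {
  sessionId: string;   // the session in question
  momentUrl: string;   // deep link to the exact moment
}

interface ReportedInsight {
  userIntent: string;        // what were they trying to do?
  userActions?: string[];    // what did they actually do? (optional)
  friction: string;          // where did they get stuck?
  possibleCauses: string[];  // hypotheses, never stated as facts
  frequency: number;         // how often it happens
  affectedSessions: number;  // how many users or sessions are affected
  sources: SessionSource[];  // every insight links back to real sessions
}
```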
The Bar for Inclusion
We keep it tight. If something doesn’t meet our bar - if it can’t be tied to a real user pattern, doesn’t impact enough sessions, or can’t be explained clearly - we leave it out.
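As a sketch, the bar could read like a predicate over the ReportedInsight shape above. The threshold here is made up for illustration - the real bar involves judgment, not just a number.

```typescript
// Continues the ReportedInsight sketch above. The threshold is
// invented for illustration - the real bar is a judgment call.
const MIN_AFFECTED_SESSIONS = 5;

function meetsBar(insight: ReportedInsight): boolean {
  return (
    insight.sources.length > 0 &&                        // tied to a real user pattern
    insight.affectedSessions >= MIN_AFFECTED_SESSIONS && // impacts enough sessions
    insight.possibleCauses.length > 0                    // can be explained clearly
  );
}
```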
You’re busy. We’re here to make your life easier, not overload you with more things to sift through.
In the end, our job isn’t to answer every question. It’s to point you to the ones that matter - and help you ask better ones.