From Sessions to Findings

Learn how raw session evidence becomes findings in Lookback, why findings are timestamped video moments, and how themes, reels, and AI support qualitative sense-making without replacing judgment.

Written by Henrik Mattsson
Updated over a week ago

In Lookback, understanding does not come from summaries or dashboards.
It comes from evidence.

That evidence begins with sessions and becomes meaningful through findings — short, timestamped moments that capture something worth paying attention to.

This article explains how that transformation works, and why Lookback is designed this way.


Sessions are raw material

Sessions capture what actually happened:

  • what participants did

  • what they said

  • where they hesitated, struggled, or changed course

  • the context in which behavior occurred

Sessions are necessary, but they are not yet usable insight.

Qualitative research requires selection, interpretation, and comparison: work that begins during live sessions and continues after they end.


Findings are the primary unit of evidence

A Finding is a short, timestamped video clip taken from a session.

Findings represent moments that matter:

  • a point of confusion

  • a workaround

  • a statement that reveals intent

  • a contradiction between what someone says and does

Findings are not conclusions.

They are evidence.

Each finding stays directly connected to the original session and moment in time.
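
To make that relationship concrete, here is a minimal sketch of how a finding points back to its session. The names and fields are illustrative assumptions only, not Lookback's actual data model or API.

```typescript
// Illustrative sketch only: these names and fields are assumptions,
// not Lookback's actual data model or API.

interface Session {
  id: string;
  participantName: string;
  recordedAt: Date;          // when the session took place
  durationSeconds: number;   // full length of the recording
}

interface Finding {
  id: string;
  sessionId: string;         // every finding references exactly one session
  startSeconds: number;      // where the clip begins within the recording
  endSeconds: number;        // where the clip ends
  note: string;              // the researcher's observation, not a conclusion
}
```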


Why findings are video-based

Lookback treats video as central because qualitative insight often depends on:

  • tone

  • timing

  • hesitation

  • body language

  • context

Video preserves ambiguity and nuance in a way summaries cannot.

This allows stakeholders and researchers to:

  • see the same evidence

  • interpret it together

  • challenge assumptions without relying on second-hand descriptions


Findings emerge in three ways

Findings can be created through different forms of attention:

Things researchers know they are looking for

Researchers define goals in advance, often informed by stakeholder input. These goals help focus attention during and after sessions, and provide context for Eureka.

Things researchers notice as they happen

During live or replayed sessions, researchers may recognize important moments and capture them immediately. Lookback makes this easy to do, even while working under cognitive load.

Things only recognized later

Some patterns only emerge when sessions are compared, revisited, or viewed by others. Lookback supports returning to evidence without losing context.

AI assists in all three cases, but it does not replace judgment.


Themes group related findings

A Theme groups related findings across sessions.

Themes help researchers:

  • see patterns over time

  • compare evidence across participants

  • understand variation and consistency

Themes do not flatten evidence. Each finding remains visible and traceable.

Themes support synthesis, not abstraction away from evidence.
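
Extending the illustrative sketch from earlier (again, hypothetical names rather than Lookback's data model), a theme can be pictured as a label that holds references to findings, so each finding stays intact and traceable:

```typescript
// Hypothetical continuation of the earlier sketch, not Lookback's API.
interface Theme {
  id: string;
  title: string;         // e.g. "Confusion during checkout"
  findingIds: string[];  // references only: the underlying findings are never copied or flattened
}
```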


Reels are for storytelling, not stronger evidence

Reels are collections of findings assembled to:

  • illustrate a pattern

  • tell a story

  • communicate insight efficiently

Reels do not create new evidence.

A single finding can be sufficient on its own. Reels simply provide a way to present multiple findings together when that is helpful.
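
In the same hypothetical sketch, a reel is simply an ordered selection of existing findings, assembled for presentation rather than analysis:

```typescript
// Hypothetical continuation of the earlier sketch, not Lookback's API.
interface Reel {
  id: string;
  title: string;          // the story being told, e.g. "Onboarding friction"
  findingIds: string[];   // an ordered playlist of findings that already exist
}
```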


The role of AI in analysis

Lookback’s AI is designed to reduce cognitive load, not to automate conclusions.

AI can help:

  • surface moments aligned with defined goals

  • highlight patterns across sessions

  • reduce the amount of video that must be manually reviewed

AI does not decide what is important.

It assists researchers in staying close to evidence while working more efficiently.

Human judgment remains central.


Why this evidence model matters

By keeping findings:

  • timestamped

  • video-based

  • tied to sessions

  • visible to stakeholders

Lookback supports:

  • transparency

  • shared understanding

  • evidence-based decision-making

This makes it harder to over-summarize, over-generalize, or lose nuance, and easier to reason together about what is actually happening, which sets the tone for great collaboration.


How this fits the bigger picture

  • Sessions capture what happened

  • Findings mark what matters

  • Themes reveal patterns

  • Reels help communicate insight

All of it remains grounded in observable evidence.


What to explore next

To go deeper:

  • Learn how research roles interact with evidence differently

  • Explore how stakeholders collaborate around findings

  • See how AI supports sense-making without bypassing judgment
