
Working With Recordings, Findings, and AI

How to work with recordings, notes, findings, and AI in Lookback to turn raw sessions into shareable qualitative evidence.

Written by Henrik Mattsson
Updated yesterday

Qualitative research does not become impactful when sessions end.

It becomes impactful when teams can work with the raw material in a way that preserves context, supports judgment, and makes evidence easy to share.

Lookback is designed to keep researchers close to recordings while reducing the amount of video that needs to be manually reviewed.


WHEN TO READ THIS

Read this article if you are:

• reviewing sessions after research
• trying to turn observations into insights
• working with large volumes of video
• involving stakeholders in sense-making
• using AI to explore qualitative data


RECORDINGS AS THE SOURCE OF TRUTH

In Lookback, recordings are the primary source of evidence.

All sessions, whether moderated, unmoderated, or AI-moderated, are immediately available inside the same project where the research was run.

This means:
• no exporting and re-uploading
• no loss of context between tools
• no artificial separation between collection and analysis

Seeing and hearing participants remains central. The goal is not to abstract away the data, but to make it navigable.


NOTES, FINDINGS, AND TIMESTAMPS

Lookback distinguishes between notes and findings.

Notes

Notes can be created by humans or by AI.

They are used to:
• capture observations
• mark interesting moments
• support later sense-making

Notes are lightweight and flexible.

Findings

A finding is a specific piece of video evidence, tied to a time range defined by two timestamps.

Findings are used to:
• anchor insights in real behavior
• create shared reference points
• support discussion and decision-making

Findings are not separated by round.

They live at the Project level, making it easy to see patterns across methods and iterations.
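
To make the distinction concrete, here is a minimal sketch of how notes and findings might be modeled. The type names and fields below are illustrative assumptions, not Lookback's actual data model; the point is simply that a note anchors a single moment, while a finding anchors a time range and lives at the project level.

    // Illustrative sketch only: these names and fields are assumptions,
    // not Lookback's actual data model.
    interface Note {
      sessionId: string;
      timestamp: number;      // a single moment, in seconds from session start
      text: string;
      author: "human" | "ai"; // notes can be created by humans or by AI
    }

    interface Finding {
      projectId: string;      // findings live at the project level
      sessionId: string;
      startTime: number;      // the time range that anchors the evidence...
      endTime: number;        // ...between two timestamps
      summary: string;
      themes: string[];       // theme tags, used for grouping evidence
    }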


THE EUREKA BUTTON

During live sessions or replays, moderators and observers can use the Eureka button.

When pressed:
• the surrounding context is analyzed
• a draft note is created automatically
• the cognitive load of writing is reduced

These notes can later be refined or promoted to findings.

The goal is to capture important moments without breaking focus during live research.
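
As a rough mental model, pressing Eureka can be pictured as capturing the current playback position plus a window of surrounding context for AI to draft a note from. The sketch below reuses the Note shape from the earlier example; the window size and function name are assumptions for illustration only.

    // Conceptual sketch, not Lookback's implementation. Reuses the Note
    // shape from the earlier sketch; the window size is an assumption.
    const CONTEXT_WINDOW_SECONDS = 30;

    function onEurekaPressed(sessionId: string, currentTime: number): Note {
      // Look back over a window of surrounding context; in the product,
      // AI analyzes this context to draft the note automatically.
      const windowStart = Math.max(0, currentTime - CONTEXT_WINDOW_SECONDS);
      return {
        sessionId,
        timestamp: currentTime,
        text: `Draft note covering ~${windowStart}s to ${currentTime}s`,
        author: "ai",
      };
    }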


AI AS A NAVIGATION TOOL

AI in Lookback is designed to reduce the amount of video that needs to be manually reviewed, not to replace human judgment.

Researchers typically need to find three kinds of things:

  1. Things they knew they were looking for

  2. Things they recognized as important when they saw them

  3. Things they did not recognize as important until later

Lookback supports all three.

• goals help AI automatically surface the moments you knew you were looking for
• the Eureka button lets humans capture the moments they recognize as important in the session
• AI exploration helps reveal the patterns that only become visible later

This is how AI and humans work together in Lookback.


THEMES AND PATTERNS

Findings can be tagged with themes.

Themes allow teams to:
• group evidence across sessions
• see recurring patterns
• filter findings by topic

Themes are not conclusions.

They are tools for organizing evidence.
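
One way to picture how themes organize evidence is as tags that findings can be grouped and filtered by. The sketch below reuses the hypothetical Finding shape from earlier; it is illustrative, not how Lookback is implemented.

    // Illustrative only: grouping findings by theme tag to surface patterns.
    // Reuses the hypothetical Finding shape from the earlier sketch.
    function groupByTheme(findings: Finding[]): Map<string, Finding[]> {
      const groups = new Map<string, Finding[]>();
      for (const finding of findings) {
        for (const theme of finding.themes) {
          const bucket = groups.get(theme) ?? [];
          bucket.push(finding);
          groups.set(theme, bucket);
        }
      }
      return groups;
    }

A theme that accumulates findings across many sessions is a signal worth discussing; the theme itself is still not a conclusion.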


HIGHLIGHT REELS

Highlight reels allow you to stitch multiple findings into a single shareable video.

Highlight reels are useful when:
• a pattern only makes sense across multiple clips
• you want to tell a coherent story
• a single finding is not sufficient on its own

Highlight reels are not a “higher” form of evidence than findings.

They are simply a way to present related findings together.
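
Conceptually, a highlight reel can be thought of as an ordered list of findings played back to back. The sketch below again uses the hypothetical Finding shape from earlier; the names are assumptions, not a Lookback API.

    // Conceptual sketch: a reel is an ordered list of findings stitched
    // into one video. Names are assumptions, not a Lookback API.
    interface HighlightReel {
      title: string;
      clips: Finding[]; // played back in this order
    }

    function reelDuration(reel: HighlightReel): number {
      // The reel's length is just the sum of its clips' time ranges.
      return reel.clips.reduce(
        (total, clip) => total + (clip.endTime - clip.startTime),
        0
      );
    }

Ordering and trimming clips is an editorial choice; the evidence itself stays in the underlying findings.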


DISCOVER: EXPLORING WITH AI

Discover is an AI-powered interface for exploring project data.

It can:
• summarize themes and patterns
• surface relevant quotes
• point directly to video evidence

Discover acts as an agent inside the project.

It does not replace watching important moments, but it can dramatically reduce the need to revisit full sessions.
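
One way to picture a Discover answer is as a summary that always carries pointers back into the video, rather than standing on its own. The shape below is a hypothetical illustration, not an actual Lookback API.

    // Hypothetical shape of a Discover answer, for illustration only.
    // Every claim carries a pointer back to a moment in the recordings.
    interface DiscoverResult {
      summary: string;       // themes and patterns, in prose
      quotes: Array<{
        text: string;        // a relevant participant quote
        sessionId: string;
        timestamp: number;   // jump straight to this moment in the player
      }>;
    }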


DISCUSSION AND SHARING

Sense-making is collaborative.

In Lookback, teams can:
• start threaded discussions directly on notes and findings
• @-mention teammates
• involve stakeholders without exposing sensitive data

This keeps discussion anchored in evidence, not memory.


WHAT TO READ NEXT

• Working With Stakeholders in Qualitative Research – for impact and collaboration
• Using Discover to Explore Research Data – for AI-assisted exploration
• The Lookback Player – for detailed playback and controls
