
Go1 · 2024

Building Insights from Zero to One

TL;DR

Go1's learning platforms collected enormous amounts of learning data. L&D managers had no reliable way to make sense of it. As Lead Product Designer on the L&D Manager experience, I led design from early concept testing through to a live product — shipping iteratively across a Beta and four V2 milestones. In the first year, the product reached over 7,000 unique visitors and 15,000 sessions, with exploratory actions growing 85% as the feature set deepened.

Customer interviews · Concept testing · Data visualisation · UI design · Usability testing · Design system · Analytics · Iterative delivery

Certain information has been omitted or obfuscated in this case study. The opinions presented here represent my views alone, not those of my current or past employers.

Go1 Insights — skills and content analytics dashboard

Go1's learning platforms collected enormous amounts of learning data, yet very few actionable insights were surfaced out of the box. L&D managers had no reliable way to understand what skills their people were developing, whether their content was landing, or how they compared to others in their industry. The problem had existed for a while. In 2024, it finally became a priority.

Discovery

We spoke to seven L&D professionals across Finance, Healthcare, Hospitality, Automotive, Education, and IT, alongside Go1's Customer Success Managers who worked with these customers daily. Sessions were structured in two halves: a generative exploration of how they worked and what they measured, followed by an evaluative review of early concept designs.

The finding that reframed the project most was deceptively simple: good insights don't just help L&D managers demonstrate ROI after the fact — they drive the engagement that produces better ROI in the first place. Participants described feeling like they were “shooting in the dark.” Without reliable data, they couldn't improve programs, couldn't justify investment, and couldn't build the learning culture their organisations expected of them.

“You can't drive engagement without the right insights.”

— Research participant

Concept Testing

We tested a range of concepts spanning Skills, Content, Compliance, and Impact insights — drawn from an earlier vision for a Quarterly Learning Report. The goal was to understand which areas participants found most immediately useful, and which required capabilities we didn't yet have.

Content insights concept
Impact insights concept
Skills insights concept
Compliance insights concept

Skills insights emerged as the clearest priority — specifically, understanding what employees were developing and how that compared to their industry. Content insights, particularly around ratings and learner feedback, were a close second. The concept testing gave us enough confidence to narrow scope and move toward a first release.

Beta

The Beta was deliberately minimal — a single “Top Skills” view showing learner counts, average time spent per skill, and an industry benchmark column. Twenty customers were given early access, and we followed up with evaluative interviews after two weeks.

Skills Insights Beta release

Participants had no trouble interpreting the data, and overall sentiment was positive despite the limited feature set. But the interviews were most useful for what they revealed about the ceiling. Compliance learning was drowning out the upskilling signal, so admins needed a way to filter between the two. A single month of data wasn't enough to track campaigns that ran across quarters. And the industry benchmark, while compelling in concept, generated confusion about what was actually being compared.

“The industry benchmark is interesting but I'm not sure what I'm being compared to.”

— Beta participant

The Beta hadn't just validated the concept. It had written the roadmap.

V2

V2 was built across four milestones, each targeted at a gap the Beta had surfaced. The upskilling filter came first — letting admins toggle between compliance and upskilling learning, with upskilling as the default, so the skills list finally reflected the growth culture they were trying to build. Longer timeframes followed, introducing quarterly views that matched the cadence of real L&D campaigns. A skill detail view then gave admins the ability to click into any skill and see the content driving it, the learners developing it, and top-rated industry content they could curate directly into their library. The final milestone widened the lens further still — surfacing emerging skills trending across an organisation's industry, giving admins a reason to return to Insights regularly rather than only when checking a specific program.

Skills Insights — upskilling at a glance and sought after skills

Skills insights V2 and beyond

A parallel workstream tackled Content Insights — an area that was effectively starting from zero. What existed previously was a basic count of thumbs up and thumbs down responses for each piece of content: a blunt instrument that told admins very little about whether their learning was actually working. The goal was to build something genuinely useful in its place.

The new Content Insights surfaced five-star ratings and written learner feedback for every piece of content in an admin's library — giving them a real signal on what was landing and what wasn't. Beyond their own portal's data, admins could also see content that was trending and highly rated across their industry, opening up a new curation pathway. And to make the volume of qualitative feedback digestible, AI-generated summaries surfaced the key themes from learner reviews without requiring admins to read through every comment individually.

“The thing I miss most is basic information like viewing statistics of our courses and the written feedback of users on what they liked or disliked.”

— L&D manager, via Typeform feedback

That quote was collected before the new feature shipped. It captures exactly what we were building toward.

Content insights — overview of ratings and content engagement

Content insights

Outcomes

Since launching in October 2024, the product has accumulated over 15,000 sessions from around 7,000 unique visitors. Sessions grew from near zero at launch to a peak of around 550 per week by the end of the year — roughly a 3x increase over the period — and have sustained at around 450–500 since.

The more meaningful signal is what admins are doing within those sessions. Exploratory actions — clicking deeper into skills data — now account for 28% of all sessions, up 85% since launch. Curation actions, where admins save or add content to their library directly from Insights, account for 5%. Both of these were zero when the product launched; they only became possible as V2 milestones shipped.

What building this product taught me is that analytics features are uniquely sensitive to the gap between what's visible and what's actionable. A number without context doesn't inform — it just prompts another question. Every milestone in V2 was essentially an attempt to close that gap a little further: more context, more depth, more ways to act on what you're seeing. The exploratory and curation data reflects exactly what we set out to achieve — admins treating Insights less like a dashboard to check and more like a tool to work with.