In May 2023, we tested the start of the journey for teams to find out about and book a discovery peer review. Users recognised the step-by-step pattern and responded positively to being able to see and move through the journey ahead, from finding out about a peer review, to booking one and getting a report.

However, guidance scope changed as we moved from alpha into beta. This design history documents the design changes we made and why.

Start of user journey into assessments changed

In the beta phase, we added guidance for service assessments, as well as for alpha and beta peer reviews.

We found that users’ entry points into the assessment process had changed.

A team could be at discovery and start there, but they could also have gone straight to alpha or a later phase, depending on their delivery stage or when someone joined the team.

As a result, the step-by-step pattern started to feel laborious and over-engineered for the guidance.

Users told us:

‘the general step-by-step guide... why are you using this? Where do I come in?’ Participant 5

‘...Why is ‘book’ hidden? I just want to book my service assessment.’ Participant 5

‘how would I find out about this? How would I get here?’ Participant 1

How we got there: in discovery, users were less aware of peer reviews

We knew that team awareness of discovery peer reviews in DfE was low.

So when we tested the step-by-step pattern, people appeared to walk through the navigation with ease. They would start with checking what type of assurance they needed before reviewing guidance for service teams. They went on to make a booking.

The step-by-step pattern was seen as reassuring, so much so that users asked for it to be available as constant navigation:

‘I miss the righthand side nav. My safety blanket has gone!’ Participant 1

Scope changed in beta: how we brought service assessment and alpha and beta peer review guidance into the journey

As we built out the MVP for finding out about and booking a discovery peer review, we also brought in service assessments. The processes for booking and administering a booking would remain the same; only the guidance would be different.

Our first iteration was based on the assumption that the step-by-step pattern had worked for users in alpha. We looked again at the flow of finding out about discovery peer reviews and iterated the information architecture to include service assessments and alpha and beta peer reviews.

Version 1 of the step-by-step guidance to assurance, including discovery peer reviews and service assessments.

The interaction, service and content designers sketched the flow on paper before moving to Lucid frames and then the prototype kit. We led with phases of delivery, then the user group within each phase, for example, the service team or assessors.

Version 2 of the step-by-step guidance to assurance, including discovery peer reviews and service assessments.

We assumed the journey would be linear, from discovery through to alpha, beta and live

The user journey started with discovery peer reviews, then moved into alpha or beta service assessments or peer reviews.

They would then complete a survey and provide feedback on the experience.

We took the journey one level higher and shifted the architecture of content for each phase into phase sections.

We knew we would need to crit and retest the journey and our assumptions.

Version 3 of the step-by-step guidance to assurance, including discovery peer reviews and service assessments.

We built out the content within each section, basing it on user needs and acceptance criteria, pair writing with subject matter experts and putting it through our content workflow.

We ran content crits on any new guidance written for assessments. This included crits with cross-government assessors, lead assessors, and new and experienced assessors.

We knew we would have to reconsider the ‘landing page’, or entry point, into the service assessment guidance and how teams find out about assurance in DfE. At this stage, though, our focus was on the content within the journey.

In alpha: content was created and validated but the start of the journey needed reconsidering for all user groups

As part of testing the Check what assurance you need tool, we tested this higher-level user journey of how someone in a service team, or an assessor, might start finding out about assessments in DfE.

This is what we learned:

  • a team's journey through the assurance process is not always linear

  • sometimes, all someone wants to do is ‘book a thing’

  • navigating through service team and assessor content in one place is a challenge

Changed step-by-step pattern to flat content with side navigation

The step-by-step guide pattern was no longer working or meeting needs.

We reviewed the information architecture and broke it down into elements, considering what users were trying to do at each stage.

We removed the step-by-step pattern and split the user groups into 2: ‘for teams’ and ‘for assessors’. We used this as the main architecture, with content sitting under each heading in a side navigation bar, alongside an overview page of assessments in DfE.

We reduced a 6-step process with 19 pieces of guidance to 12 pieces of guidance split across 2 user groups.

We removed pages of content, for example the ‘book’, ‘manage’ and ‘design crits’ pages, and put the relevant links to the service in the right part of the user journeys.

Screenshot of simplified information architecture for assessments and peer reviews

Ran a design crit to review the iterated content

We reviewed the content at a design crit and made minor adjustments. Overall, the content was well received and the navigation appeared clear.

We’re ready to share and test the guidance as part of our MVP for teams and assessors to find out about service assessments and peer reviews in DfE.

Next steps

These include:

  • test the guidance when we launch the Service assessment service

  • look at the service assessment intranet pages to direct users to this guidance