What we did

We tested the user journey for delivery teams, from starting a discovery peer review booking through to submitting it, and then adding artefacts and teammates to support the peer review.

What we found

All users were able to make a booking for a discovery peer review successfully. However, there was slight confusion in the journey around the purpose, project code and portfolio questions.

All users were able to add team members and artefacts in the service with limited issues. However, the service needs to outline what is expected in terms of artefacts and when.

Accessing the report was easy and broadly aligned with users' expectations. However, there are concerns about who from the team has access to the report, and at which points in the journey.

Users understood the start of the journey, but it wasn’t clear how to prepare and when to book

Start page to book a discovery peer review. H1 is Book an assessment, with 4 bullets telling the user what they can do in the service, for example, book a review, write a report, update project details. Page also contains a link to drafts and guidance.

Added relevant guidance

Made the link to the step-by-step guidance clearer and directed users towards it as the first thing they should do before they start a booking.

Book a discovery peer review, H1 is Book an assessment, with 4 bullets explaining what you can do in the service: book, update a project, read reports, write a report. You can also continue with a draft or read guidance about assurance in DfE.

Slight confusion around the purpose of the discovery

Some users weren't completely clear how much information they should include about their discovery.

This is the second question in the service.

They wondered whether they should add their portfolio details here, and questioned which disciplines in the team they should include.

As users continued with the booking, some felt it would be better to have the overall portfolio and wider service questions first, with the purpose of the discovery coming second, so it then sits within the wider context of DfE.

Tweaked hint content and will review the order of questions in future

Hint content before:

Tell us the purpose of your discovery. For example, if it's part of a wider service or based on policy intent. This will help us to arrange a review panel with the most relevant experience

Hint content after:

Tell us the purpose of your discovery. For example, if it's part of a wider service or based on policy intent. This description will help us to arrange a review panel with the most relevant experience

Users did not know their project code

Most users were unsure about the project code: what it is, and where to find it.

Do you know your project code?

Added a description and guidance to the question

We will link out to where they can find it in DfE. We’ll still give users the option to continue the journey by selecting ‘No’.

As the GOV.UK Design System states that links within hint content make the content inaccessible, we changed the style to paragraph text. The text includes: other names the project code may be known as, an example, and a link to the DDaT project tracker to look for the code. Plus, a link to find out who a team’s business partner is, as they can provide a code if one is not on the tracker.

Question with hint content and guidance to help users find their project code. As well as asking for the project code, this page gives an example of what one looks like, DDaT_22/23, and gives a link to the business partners for users to ask them.

The terms ‘portfolios’ and ‘groups’ are used interchangeably

A list of DfE portfolios

Added hint content to reflect users’ language, referring to both portfolios and groups

As more users referred to their area as ‘groups’, we changed the question to reflect this and added hint content: ‘You may know these as portfolios’.

List of DfE groups with hint text referring to them as portfolios. Includes families group and funding group.

Users wanted to know next steps after they submitted a request

On the task list page, where users check their answers, they wanted to know exactly what the next steps were.

Added content to reflect the journey

We spoke with the service assessment team, who confirmed that they usually get back the same day, unless they have queries with pipeline control. Based on this, we added a line of content to show what happens next and set expectations for the user booking the discovery peer review.

New content added to submit page:

Once you submit your request, the service assessment team will be in touch to confirm your discovery peer review and agree a date within 3 working days.

Users weren’t always clear which artefacts could be added to the service

After the user submits a request, they receive a confirmation email.

Within this email, it states the two things they can now do in the service: add team members working on the discovery, and add artefacts to show the work.

It wasn’t always clear what ‘artefacts’ meant.

‘“Add links to artefacts” - is it “Add things you will use in the assessment”? The wording doesn’t mean much to me.’ Participant 3

We added examples of what artefacts might be to the email, mirroring language used in the service

Content added to booking request submitted email:

Add links to artefacts

Add any links to show the work of the discovery. An artefact could be a Lucid frame, document or short slide deck

Testing emails in the service made us revisit the purpose of each

In the book journey, users receive 3 emails at the start:

  1. Confirmation of booking request

  2. Request accepted

  3. Date and time of discovery peer review

When testing this, we saw users questioning the reason for the ‘request accepted’ email. We know from research with the service assessment team that, occasionally, requests are ‘rejected’, either because a request made by someone else in the team is already in place, or for other reasons. This is rare, though it does happen.

Deleted the ‘request accepted’ email

Based on users’ confusion over the language of ‘request accepted’, combined with rejection being a rare occurrence, we deleted the accept email. Now, users receive only 2 emails at this stage: confirmation, and date and time, which will arrive several days apart.

Users expected to see confirmation that the assessment will happen online

Some users who were new to the service assessment process noted that the service assumes they know the location of the assessment.

Added direct reference to location of assessment on date and time email

We added the location of the assessment to the Notify email confirming the date and time.

Content added:

Peer review date

Your discovery peer review for ((nameOfDiscovery)) will take place on ((date)) at ((time)). This will happen online on Microsoft Teams.
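
As an aside for anyone building a similar service: the ((nameOfDiscovery)), ((date)) and ((time)) fields above are GOV.UK Notify personalisation placeholders. Below is a minimal sketch of how an email like this could be sent with Notify’s notifications-node-client library; the template ID, function name and environment variable are illustrative assumptions, not our implementation.

```typescript
// A minimal sketch, assuming the GOV.UK Notify Node client
// (notifications-node-client). The template ID, API key variable and
// function name are placeholders, not the service's real values.
// The client ships without bundled type definitions, so a small
// `declare module` shim may be needed under strict TypeScript.
import { NotifyClient } from 'notifications-node-client';

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY ?? '');

// Hypothetical ID of the 'date and time' email template in Notify
const DATE_AND_TIME_TEMPLATE_ID = 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee';

async function sendDateAndTimeEmail(
  emailAddress: string,
  nameOfDiscovery: string,
  date: string,
  time: string
): Promise<void> {
  // Notify substitutes each personalisation key into its ((key)) slot
  // in the template, for example ((nameOfDiscovery)) in the content above
  await notifyClient.sendEmail(DATE_AND_TIME_TEMPLATE_ID, emailAddress, {
    personalisation: { nameOfDiscovery, date, time },
    reference: null, // optional tracking reference; null if not used
  });
}
```

One useful property of this setup is that if a placeholder is renamed in the Notify template but not in the code, the API rejects the send with a missing-personalisation error rather than sending a broken email.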

When adding discovery teammates, users worried junior team members might read the report before it was shared by the delivery or product manager

In the manage section of the service for the delivery team, the person who made the booking can add their teammates working on the discovery.

The content explains that people added can then add artefacts to support the work of the discovery, and view the peer review report.

Screenshot showing the discovery team page, where the user can add teammates. Teammates will be able to add artefacts and view the report.

Removed details panel

Although teammates will be able to see the published report, only the delivery manager, product manager and requestor will see the report once it’s been shared. The rest of the team can only access it once it’s been accepted.

It seems misleading to say they can ‘view the report’, so we removed the details panel and added a reference to adding artefacts in the line under the H2 ‘Discovery team’.

Discovery team page with changes: the details panel removed, along with content referring to viewing the report.

When adding discovery teammates, users were concerned that some roles were grouped together and others not mentioned

List of teammates in a discovery, with a single design role covering content, interaction and service design, and no business analyst.

Agreed to separate the design roles and add a business analyst role

For MVP, we have split out the design roles and added business analyst, as these roles were specifically mentioned in research. Post-MVP, we will learn through research which other DDaT roles may be part of service teams in DfE and will broaden the list.

List of DDaT roles including content, interaction and service design plus business analyst

When teams added artefacts to support the discovery, they were interested to understand what else they could do to prepare

As users were adding artefacts to support the work of the discovery, they questioned what they should consider adding and how to prepare.

Added guidance on what to prepare to the artefacts page, to encourage users to read it before adding things to support the work of the discovery

Users questioned how to challenge the report

Once the assessors write the report, the service assessment team review it and share it with the discovery team to review and accept or challenge it. Although users appreciated being able to challenge the report, they weren’t quite sure how to do this.

Added guidance on reviewing the report to the email

Added a link to ‘reviewing the report’ in the step-by-step guidance so that teams can understand how to accept or challenge it.

This is an example of showing the right content at the right time and designing a whole user journey that dips in and out of the service.

Email explaining how a team can read or challenge the report

What's next

Running a Notify email design crit with the content design community, checking for consistency and general email patterns in DfE.

Reviewing user research for naming the service. This has been ongoing from the start of alpha. Now we have completed the current round of research, we will do some desk research into Google Trends and analyse what we've learned from our internal users.

Firming up beta plans and preparing for our service assessment.

Tags

Content design, User research, Design Ops