What we did

We conducted research with 10 users. Testing involved a number of tasks within the prototype. Afterwards, we asked users to complete a survey which included a System Usability Scale (SUS) score and their reaction to the product itself.

Users were a mix of delivery managers, product managers and service owners working in DfE. This included contractors and civil servants.

What we found

Overall, the step-by-step guide was received positively: it acted as a reference point and allowed users to see the possible journey ahead.

However, users expected the step-by-step guide in the side navigation to be visible throughout the Find out about journey. In step 1, Check what assurance you need, the side navigation disappears while users are checking their assurance type and viewing the outcomes page.

'I miss the righthand side nav on the outcome page. My safety blanket has gone!' Participant 1

'I recognise the pattern and feel comfortable with it.' Participant 3

Screenshot showing a page in the service with an H1 of Discovery peer review, to demonstrate the lack of service navigation on the right-hand side

What we’ve done as a result

Added the step-by-step side navigation throughout the entire journey.

Screenshot showing a page in the service with an H1 of Discovery peer review, to demonstrate that service navigation has now been added to the right-hand side, showing users where they are in their journey and that they can return to the navigation if they need to

What we found

On the What to prepare for a discovery peer review page, some users were confused by the content explaining that peer reviews are not the same as service assessments, while still referencing which Service Standard points they need to meet at the end of discovery.

What we’ve done as a result

Due to confusion around the language of assurance types in DfE, with several names existing for the same thing, we removed all references to service assessments.

We also added links to the specific standard points that are considered at discovery peer reviews to make the content work harder.

We replaced:

A discovery peer review is not the same as a service assessment. However you should consider guidance in Apply the Service Standard in DfE for the discovery phase in standards 1 to 7.

Don't put a lot of time and effort into creating a large slide deck. Show your work and talk the panel through it.

What you show could be a mixture of sharing Lucid boards, docs, and sketches. It could include slides but it doesn't have to.

With the following new content:

These pointers are taken from things teams should consider in the discovery phase for standards 1 to 7. Show what you've done to explore these at your peer review.

What you show could be a mixture of sharing Lucid boards, docs, and sketches. It could include slides but it doesn't have to.

Image showing a screenshot of the What to prepare for a discovery peer review page with no references to service assessments, to be consistent with language about discovery peer reviews

What we found

It was straightforward for users to know how to book a discovery peer review from the home page, but there were questions about the content stating that only DfE staff can book a discovery peer review, and whether this includes contractors.

Also, returning users who had already started a request and wanted to continue with their draft struggled to know where to find it.

What we’ve done as a result

We’ve been more explicit in the description of what ‘DfE staff’ means.

We added new content to be clear about who DfE staff are:

Only Department for Education staff can book a discovery peer review online. Staff include civil servants and contractors working in DfE.

We’ve added a ‘continue with a draft’ option to take users to their booking draft if they’re completing it in multiple sittings.

Screengrab of home page in Assure your service showing an option to continue with a draft if you have already started one

What we found

One of the questions we ask is for the service project code. However, no users knew their project code and very few even knew their project had one.

Users said:

‘I would scramble around people to ask.’ Participant 8

‘Oh project code makes me wonder, I wouldn’t know what that is or where to find it.’ Participant 6

Screengrab showing a question asking Do you know your project code

What we’ve done as a result

We considered removing this question from the service and having the service assessment team provide the project code data at the booking-in stage. This is what currently happens in DfE.

To help us to decide, we reached out to stakeholders and users of the DDaT service tracker that's used in the triage process for spend control. If all services are listed on the tracker, we could use this as a data source to pull the service information into the service.

As well as speaking with stakeholders and users of the DDaT tracker, which sources the project codes, we will also conduct a round of research with business partners, as they create the project codes. We will look to understand their needs and processes around assurance and assessments, then iterate the design based on evidence.

We'll test designs and needs in our round of research with the service assessment team when they manage a request.

What we found

As part of research, we ran an A/B test to understand whether users were likely to book a peer review before they’ve started discovery.

Version A question:

When did your discovery start? For example, 18 2 2023, with a date input.

Version B question:

Have you started your discovery? Select one option. Followed by two radios: one with yes as an option and a date input, and one with no as an option.

Screengrab showing the question When did your discovery start? with a date input and a Save and continue button

What we’ve done as a result

Both versions tested well and users were able to progress. However, when asked, users said they never book a discovery peer review or service assessment before the phase starts.

As a result we've decided to proceed with option A.

What we found

The question in the service, Which DfE portfolio or arms-length body is your discovery for?, caused some confusion, as not all participants knew which portfolio they came under. Users also didn't recognise the term ‘group’ in the radio descriptions.

The original list was taken from intranet content about portfolios and it demonstrated that there are many different ways areas of work are described across DfE. It's not always clear to people which term is used the most.

This was particularly prevalent for those working on teacher services, as they were unsure whether to select the schools group or the Teaching Regulation Agency, leading some users to guess.

'I would pick one so that I could submit it and it isn’t a deal breaker.’ Participant 7

‘I always get lost how DfE is made up.' Participant 6

What we’ve done as a result

All options will be referred to as portfolios, as this term is commonly used within service teams. Additionally, we will provide hint text on relevant options for clarity, for example, that the schools portfolio includes teacher services.

Screengrab showing descriptions of areas in DfE such as 'families portfolio' and 'schools portfolio, includes teacher services'.

What we found

We asked users, ‘Who is the business partner of the portfolio?’ Users were unsure who their business partner was, with only a couple having a rough idea. Some suggested they would have to guess and hope to see a name they recognised on the list of suggestions. They also weren’t sure how to go about finding out the information.

What we’ve done as a result

For our minimum viable product (MVP), we will remove the business partner question and test how users in the service assessment team find entering this data. This is what currently happens in the assessment process in DfE.

Question in the service asking ‘Do you know who your business partner is?’, with a yes radio button and a text box to add the business partner's details, and a no radio button to allow users to indicate that they do not know who their business partner is.

What we found

Once a user submits a request for a discovery peer review, the next page they see is a request submitted page. Users expected to receive an email once they had submitted their request. We're using Notify to send these types of emails between the service and users. However, there was uncertainty about who would receive the emails and whether they went beyond the requestor.

'I would expect to be contacted, or the dm to be notified.' Participant 6

'Wonder if it's worth stating that the confirmation email is about the booking, and then another email with dates once booked.' Participant 9

What we’ve done as a result

We decided that, within the content on the request submitted page, we will clarify who will receive the emails, for example, all those listed within the booking.

We changed:

We've sent you a confirmation email.

To:

We've sent an email confirming details to you as the requestor, the team's delivery manager and the team's product manager.

Screengrab of request submitted confirmation with the following new content to explain who receives the email, 'We've sent an email confirming details to you as the requestor, the team's delivery manager and the team's product manager.'

What we found

Users found the Check your answers pattern confusing at the end of the booking stage when completing the booking over multiple sittings. Returning to a draft booking took users to the Check your answers page first, causing confusion, and some users wanted to jump straight in and select the Submit request button. However, users were pleased to see a summary of their answers before submitting, in case they had entered wrong information or needed to change anything.

Check your answers GOV.UK pattern used at the end of the booking journey for users

What we've done as a result

We checked WCAG success criterion 3.2.3 (Consistent Navigation) and understood that using a Check your answers pattern for new users but a Task list pattern for returning users, to achieve similar solutions, would not follow accessibility standards. So we fell back on the accessibility guidance and changed the Check your answers pattern to the Task list pattern for all users: one pattern for all journeys. We can test this in following rounds of research.

Task list pattern used instead of Check your answers for returning users. Screen showing all the details of the Book journey that the user needs to complete

Summary of system usability score

Overall, the usability of the service tested positively, with users giving an average usability score of 85.5 out of 100, which sits in the acceptable range and towards the excellent end of the scale.

Usability score of 85.5 out of 100 for booking a discovery peer review service.
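For context on where a score like 85.5 comes from: the standard System Usability Scale converts ten 1-to-5 questionnaire responses into a 0-to-100 score. A minimal sketch, assuming the standard SUS questionnaire (the function name and example responses are illustrative, not our actual survey data):

```python
def sus_score(responses):
    """Compute a standard System Usability Scale score from ten 1-5 answers.

    Odd-numbered items (positive statements) contribute (answer - 1);
    even-numbered items (negative statements) contribute (5 - answer).
    The total is multiplied by 2.5 to scale it to 0-100.
    """
    assert len(responses) == 10, "SUS uses exactly ten questionnaire items"
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Illustrative example: a fairly positive response set
print(sus_score([5, 2, 4, 1, 5, 2, 5, 1, 4, 2]))  # 87.5
```

Scores above roughly 68 are generally considered above average, which is why 85.5 sits towards the excellent end of the scale.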

Feedback from users included things like:

'Site looked like a typical government site, almost comforting.' Participant 2

'Clear labelling meant it was easy to understand where to go for what information.' Participant 1

'The 7-step journey was a nice step-by-step guide on what you can expect to do on the site.' Participant 5

What's next

Through research and working with stakeholders, we have understood that DDaT business partners have a role to play in setting up teams delivering digital services and working with them throughout the process.

DDaT business partners aren't our main user group. However, it’s essential for us to understand the role business partners play from their perspective, and any involvement they would need to have, for any service we design around the service assessment process.

To help us understand the role of the business partner, and any needs or business requirements they might have for booking discovery peer reviews and the wider assurance process, we’ll plan a round of research with DDaT business partners in DfE.

Tags

User research, Design Ops