What we did

We conducted research with 4 service assessors, including lead assessors, in DfE. We wanted to see if they could use the service to find out about a discovery peer review, and to write and submit a report to the service assessment team.

What we found

The overall journey tested well, with participants able to navigate the assessment process for discovery peer reviews.

Things we saw working well included users being able to find the list of panel members and the artefacts relating to the service, and navigating to, completing and submitting the report.

We heard things like:

‘this will be really useful having everything in one place.’ Participant 5

‘feels overall clear and simple to use... relatively minor things to change.’ Participant 4

However, there were several content and design changes we can make to improve navigation further and show assessors and peer reviewers the content they expect to see, when they need to see it.

Starting the user journey in an email made sense but it could be clearer

We’re using Notify to send emails to users from the service.

Participants started their journey with an email notifying them that they were on a discovery peer review panel. However, some users missed the email when looking for it in their inboxes. They also commented on the formatting, which included a bullet before their role on the panel.

Users questioned whether they would receive 2 emails if they were both the lead reviewer and the design reviewer. They also questioned whether they needed to add their own reminder to their calendar.

Made the subject line clearer and referenced a separate calendar invite

We iterated the email to make the subject line clearer, using dynamic content to include the name of the discovery.

Old email subject line: You’ve been added to a discovery peer review panel

New email subject line: Peer review details for (name of discovery)

We tidied up the formatting and added content to cover all roles on a panel. We also added content to explain that panel members will receive a separate calendar invite for the review, so they do not need to add it to their diary themselves.

Example of new email:

Email to panel member showing discovery name in heading and containing a link to the service.
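As a rough sketch of how the dynamic subject line could be sent, here is what a call to the GOV.UK Notify Python client might look like. This is an illustration only, not the service's actual implementation: the API key, template ID, placeholder names (discovery_name, panel_role) and email address are all assumed, and it assumes the Notify template's subject is set to ‘Peer review details for ((discovery_name))’ so that Notify fills in the name from the personalisation values.

from notifications_python_client.notifications import NotificationsAPIClient

# Hypothetical values for illustration only
NOTIFY_API_KEY = "api-key-from-the-team-notify-account"
PANEL_INVITE_TEMPLATE_ID = "assumed-template-id"

client = NotificationsAPIClient(NOTIFY_API_KEY)

# Notify replaces the ((placeholders)) in the template's subject and body
# with these personalisation values, so one template can cover every
# role on the panel and every discovery.
client.send_email_notification(
    email_address="panel.member@education.gov.uk",  # assumed address
    template_id=PANEL_INVITE_TEMPLATE_ID,
    personalisation={
        "discovery_name": "Name of discovery",
        "panel_role": "lead reviewer",
    },
)

Keeping the discovery name and panel role as personalisation fields is one way a single template could cover all roles on a panel, which is what the tidied-up content aims for.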

Users felt confused landing on a task list page when entering the service

When users followed the link in the email, it took them to a task list landing page. This caused confusion, as people on the panel were interested in reading details about the discovery. From here, most users navigated straight to the artefacts and then to the overview page.

Screenshot showing task list page for peer reviewers.

Changed the landing page to an overview that shows all tasks

We changed the task list page to a summary page, which includes a summary of tasks, a description of the discovery, who is in the team, and links to artefacts.

By doing so, we have also reduced the clicks users need to make to view the tasks, project overview, artefacts and panel members. This content is now all visible on the overview page for the discovery peer review.

Overview page shown to a peer reviewer for a service they are reviewing.

Users liked the artefacts page but wondered when artefacts would be added

Users found the artefacts page with ease and commented that they liked the idea of the artefacts all being in one place. However, we heard users say that if artefacts were not there when they looked, they would leave the service and chase the lead assessor or the service team. It was not clear who adds the artefacts.

Managed expectations by making it clear that the team will add the artefacts

We added a reference to the team to make it clear who will add the artefacts. We can test this during round 2 to see if it is clear to panel members.

Image showing new dashboard summary page for assessors, including artefacts, a description of the project and tasks.

Peer reviewers write notes before the report but don’t need a prescribed template

We tested a notes template as part of the journey, to see if users would find it useful for working collaboratively during the peer review, with the idea that they would then add final comments to the report.

We found that although assessors and peer reviewers do tend to write notes, they do this individually and then complete the report in a separate document. Using a template was seen as duplication.

Removed the template and explored multiple editing rights to the report

Although most assessors write notes during the assessment, and some like to add directly to the report, they do so in an organic way, adding to the report after discussing as a team. A template felt like an unnecessary restriction, so we removed it as an option from the step-by-step guidance for peer reviewers.

We also explored the technology to see if we could build a report option into the service that allows multiple editors. We discovered this is possible, but it will be post-MVP, as the technology to do this has not yet been used in a DfE service. In the meantime, we can follow the current process of assessors making their own notes before adding to the final report, which currently works well for the panel.

Naming the service feels like a challenge to users

During each round of research, we ask users, ‘What would you call this service?’ We have a working title of ‘Assure your service’ and are referring to the end-of-discovery checks as ‘discovery peer reviews’, but we are testing this language and talking with users about levels of assurance and assessment in DfE.

We’ve heard things like:

‘DfE assessments.’ Participant 5

‘the assessment service or service assessments.’ Participant 2

‘problem with assure is that it is not plain English... this is about booking an assessment – assessment should be in the title.’ Participant 2

We’re gathering evidence to base the service name on language used in DfE

As we gather language used around assessments and assurance, we will continue to add to our insights and evidence. We’re planning to revisit this with stakeholders and the service assessment team to give our service a name for MVP as we start to build.

Things we’re thinking about include: we’ve seen no definitive references to ‘peer reviews’; ‘discovery assessments’ and ‘discovery peer reviews’ are used interchangeably by users; and we want to understand whether the term ‘assessment’ can be used across all phases in a way that’s comfortable for users.

In the past, we’ve seen the term ‘assessments’ causing users stress and uncertainty, but we are aware this can be linked to the lack of transparency around the assessment process. This is something we are tackling head on by creating one source of truth: a step-by-step guide to assessments in DfE with transparency for all roles, whether you are an assessor, a team going for an assessment or a stakeholder.

Completing the report was fairly simple but it wasn’t obvious what stage it was at

Users added content to the report with no issues, but many commented that they expected instructive content to tell them what to do next and to make it clear when a task had been completed.

Report ready to submit page in service

Added a timeline pattern to the report to show progress

We’ve used a timeline pattern to show users their progress. The pattern shows which tasks come next and when they are completed. It was based on an early Home Office pattern and developed through research and testing at the Gambling Commission, an arm’s-length government body. We will test it with users in our next round of research.

Assessment report timeline giving the user visibility of what comes next

Example showing a peer reviewer that they have completed the report and what the next step is:

Image showing success banner for report being sent to service assessment team.

We tested the current user satisfaction survey for assessments and found it to be confusing

Currently, the service assessment team sometimes send out a user satisfaction survey once an assessment is complete. We tested the current content to give us a baseline understanding before making any changes. All users found the survey confusing and did not understand many of the questions, or felt the questions did not relate to their specific role. We also learned that 6 ratings to choose from is too many, and that we should reduce the number to get more accurate results.

‘hard to understand...’ Participant 4

‘I’m confused as to what this is asking me, I’m not sure what it means.’ Participant 1

Redesigning the service assessment feedback survey to ask relevant questions of relevant user groups

Content and interaction design have redesigned the survey, thinking about exactly what we need to learn and how we will measure the data. We’ve reduced the rating options from 5 to 3 (‘great’, ‘ok’ and ‘not good’) and reduced the number of questions to 3: one about the organisation of the assessment, one about the assessment day and one about the service, giving us the chance to gather both quantitative and qualitative data.

Screengrab of feedback survey
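A minimal sketch of that structure, using placeholder wording and field names rather than the final survey content, might look something like this:

# Hypothetical sketch of the redesigned survey structure, not the final content.
# Each question takes one of 3 ratings plus optional free-text comments,
# so every response gives us both quantitative and qualitative data.
RATING_OPTIONS = ["great", "ok", "not good"]

SURVEY_QUESTIONS = [
    {"id": "organisation", "text": "How was the organisation of the assessment?"},
    {"id": "assessment_day", "text": "How was the assessment day?"},
    {"id": "service", "text": "How was the service you used?"},
]

def empty_response():
    # One rating and one comments field per question
    return {
        question["id"]: {"rating": None, "comments": ""}
        for question in SURVEY_QUESTIONS
    }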
