What we did

We tested the user journey for assessors, from receiving the email confirming they’ve been added to a discovery peer review panel, through filling in and completing the report, to completing the feedback survey.

What we found

Users could complete the journey of finding and completing a discovery peer review report, but were unsure how to collaborate with the rest of the assessor panel when working on the report.

Assessors see a lot of information in the assessment dashboard

We followed GOV.UK guidance and design principles for admin interfaces when developing the dashboard design.

For example, starting with needs, allowing more than one thing per page, and using the full width of the browser.

However, some users felt overwhelmed by the content. They liked having all their actions on one page, but described it as ‘busy’.

‘It’s nice and clear... I’ve got the stuff on the left, right, across the top and main panel... it’s just quite busy.’ Participant 2

Referring to the navigation at the bottom left-hand side: ‘I’m not sure why it’s here for me now.’ Participant 7

‘The “history” option is not what I expected.’ Participant 7

Dashboard for service assessors, showing their tasks, a description of the service, the discovery team and who else is on the panel

We reduced content in the side navigation

We ran a workshop in DesignOps to explore how we might make the dashboard work harder for assessors.

We put the dashboard on a big screen, revisited user needs and the user journey, and thought about the assessor’s context, asking: what do assessors need to see in their role to do the things they need to do? If content or an action did not fulfil a need, we removed it.

This resulted in us:

  • deleting the ‘All assessments’, ‘book’, ‘upcoming’ and ‘reports’ options from the left-hand navigation

  • deleting the ‘history’ option

Assessor dashboard with action-based tasks to do for the assessment, such as ‘complete assessment report’

Followed the GOV.UK style guide

We iterated the date at the top of the dashboard to reflect the GOV.UK style of time, day, date.

We also made sure the assessor task started with a verb, changing it from ‘assessment report’ to ‘complete assessment report’.

Assessors questioned when artefacts would be in the service

Some participants referred back to the email, noting that it said they could ‘read about what’s been done in the service’, but that artefacts might not yet be on the dashboard, as teams may not have finished them. They felt this could be clearer. The content needs to reflect what assessors will see and can find out in the dashboard.

Created clear content referencing artefacts at the right time in the user journey

We iterated the assessor confirmation email to explain that artefacts will be added closer to the discovery date.

We will also create a nudge email to delivery teams to be sent 3 to 5 days before the date of their discovery peer review as a reminder to add final artefacts.

Screenshot of the iterated assessor confirmation email

Assessors found the timeline pattern clear but not all the steps on it

Users appeared to find the timeline pattern clear, but the last two stages, ‘Report’ and ‘Complete’, caused some confusion.

‘Is Complete a step? I’d assume if I’d submitted the report I’m finished.’ Participant 2

Iterated timeline content to show what users need to see and do

We deleted the ‘Complete’ step, as it isn’t needed for assessors. Once they write and submit the report, their tasks are finished.

We changed ‘View report’ to ‘Submit report’ in the report section of the timeline.

Timeline for assessors with active things to do, such as ‘Confirm the overall outcome of the peer review’ and ‘Submit the report’.

Assessors were not clear how to complete the report

In the last round of research with assessors we learned that assessors did not need a prescriptive notes template to collaborate on, as they wrote notes individually. Using a template was seen as duplication, so we removed it.

In this second round, although navigating through the report was relatively straightforward, users were unclear how to write the report and collaborate as a team. The prototype needs to explain how the report should be written in conjunction with all assessors, before being submitted by the lead assessor.

Guidance for assessors on writing the report, explaining that the discovery peer review can get a red, amber or green status

Added relevant guidance to the guidance page

Using the evidence of needs gathered from user research analysis, we pair wrote the report guidance with a lead assessor and ran a design crit with DesignOps. We also did a tech spike to test real-time editing, which we will introduce and test post-MVP for assessments. We referred to this in our previous design history.

We added content that explained:

  • the panel can collaborate in a way that suits them best

  • how to complete the peer review report in the step-by-step journey

  • how assessors should prepare and think about standards 1 to 7

  • who submits the final report

  • what happens after the report is submitted

Also, in the service itself, only the lead assessor will have the functionality to submit the report. If an assessor adds their content directly to the report – which we saw in some user behaviour – they will see, once they have finished adding all content, that only the lead assessor can make the final submission, and they will not be given the option to submit.
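To illustrate this rule, the sketch below shows how a simple role check could decide whether the submit action is shown. It is a hypothetical Python example, not the service’s real code; the role names and data structure are assumptions.

from dataclasses import dataclass

@dataclass
class Assessor:
    name: str
    role: str  # for example "lead_assessor" or "assessor" (illustrative roles)

def can_submit_report(assessor: Assessor) -> bool:
    # Any panel member can add content, but only the lead assessor
    # is given the option to make the final submission
    return assessor.role == "lead_assessor"

print(can_submit_report(Assessor("Lead reviewer", "lead_assessor")))  # True
print(can_submit_report(Assessor("Panel member", "assessor")))        # False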

Guidance for assessors completing the report with links to the relevant peer reviewer advice in the step-by-step guide

Some users found each question in the report to be a bit of a blank canvas

Users referenced using Apply the Service Standard in DfE as guidance for how to answer the two questions asked in the discovery peer review report: what has the team done well, and what does the team need to improve. But we also heard:

‘It would be useful to have some broad areas of criteria... It’s a bit too much of a blank canvas.’ Participant 3

Added a content pattern to be more active with tasks to frame users’ thinking

The user-centred design team reflected that assessors want to know which standards they should be assessing against. So, as well as adding guidance on preparing for a discovery peer review to the previous guidance page for the report, we agreed and confirmed a content pattern to try for discovery peer reviews, one we will continue to use for service assessment reports and will test in future rounds of research.

The content pattern to test is that we will add relevant guidance and links to each assessment question in the report. For discovery peer reviews, this will link to the Complete the peer review report page in the step-by-step guide. In service assessments, this will link directly to each service standard.

Example of a report question for assessors, ‘What has the team done well?’, with a link to guidance to help them write it.

Complete the peer review report content in the step-by-step guide:

Guidance for assessors to complete discovery peer review reports, includes what happens during a peer review and the standard points they should consider, which are standards 1 to 7

Lead reviewers submitted the report but questioned their next step

Once the report was submitted by the lead reviewer, we saw some users questioning their next step, or whether they would receive confirmation.

‘I’m not sure if I’d get an email when the report had been sent... maybe then it would complete it.’ Participant 4

Refined the success banner to clarify next steps and sent the lead reviewer a confirmation email

Success banner for the lead assessor, confirming they will receive an email to say that the report is with the service assessment team.

We will add a GOV.UK Notify email to the service, sent to the lead assessor to confirm the assessment report is with the service assessment team for checks before going to the delivery team.
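As a rough illustration only, the sketch below shows how this confirmation email could be sent through GOV.UK Notify using the notifications-python-client library. The API key, template ID, email address and personalisation fields are placeholders, not the service’s real values.

from notifications_python_client.notifications import NotificationsAPIClient

notify_client = NotificationsAPIClient("REPLACE_WITH_NOTIFY_API_KEY")  # placeholder key

notify_client.send_email_notification(
    email_address="lead.assessor@example.gov.uk",   # placeholder address
    template_id="REPLACE_WITH_TEMPLATE_ID",         # placeholder Notify template
    personalisation={
        "service_name": "Example service",          # placeholder content
        "report_status": "with the service assessment team for checks",
    },
)

The email wording itself would live in a Notify template, so the content can be iterated without a code change.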

Survey questions didn’t always feel relevant

Following the previous round of research with assessors, we redesigned the survey questions and reduced them to 3 questions covering 3 areas: pre-assessment, assessment and the online service.

Users felt that the hint text was not always relevant to each question and that the questions needed breaking down further. We also saw that some questions were relevant to the delivery team and others to assessors.

Having just 3 options to choose from for how they felt something had gone was not considered enough.

Refined the questions into statements addressing each area of the assessment journey

Working closely with user research, we redesigned the survey to use a Likert scale, giving users a choice from ‘strongly agree’ to ‘strongly disagree’.

We designed conditional statements – rather than questions – that are shown only to the relevant users.

For example:

The team felt the panel were engaged in their assessment.

This will only be shown to the delivery team.

Collaborating on the report was easy.

This will only be shown to the panel.
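As a hypothetical sketch of this conditional design, the example below models each Likert statement with the audience it is shown to, so the survey only displays the statements relevant to a given role. The structure, role names and function are illustrative assumptions, not the service’s real survey code.

LIKERT_OPTIONS = [
    "Strongly agree", "Agree", "Neither agree nor disagree",
    "Disagree", "Strongly disagree",
]

SURVEY_STATEMENTS = [
    {"text": "The team felt the panel were engaged in their assessment.", "show_to": "delivery_team"},
    {"text": "Collaborating on the report was easy.", "show_to": "panel"},
]

def statements_for(role):
    # Return only the statements shown to the given audience
    return [s["text"] for s in SURVEY_STATEMENTS if s["show_to"] == role]

print(statements_for("panel"))  # ['Collaborating on the report was easy.']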

Summary of the System Usability Scale score

Overall, the usability of the service tested as acceptable, with a score of 75, which is indicative of ‘good’.

Usability score of 75 for assessors using the service assurance service to submit a discovery peer review report
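For context, a System Usability Scale score is calculated from 10 statements answered on a 1 to 5 scale. The sketch below shows the standard SUS calculation; the example responses are made up for illustration and are not the study data.

def sus_score(responses):
    # responses: 10 answers, each 1 (strongly disagree) to 5 (strongly agree)
    odd_items = sum(r - 1 for r in responses[0::2])   # items 1,3,5,7,9 (positive statements)
    even_items = sum(5 - r for r in responses[1::2])  # items 2,4,6,8,10 (negative statements)
    return (odd_items + even_items) * 2.5             # scale the total to 0 to 100

print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # example participant scoring 75.0

A SUS score of 68 is generally taken as average, so 75 sits in the ‘good’ range.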

Feedback from users included things like:

'I came into doing service assessments without the confidence to feel like I could do one. I think this site gives everything I need in one place which would help me feel prepared and less overwhelmed.' Participant 3
'Overall, it worked well and was easy to navigate. I'm not convinced about writing sections of the report in isolation, without being able to see the contributions from other assessors. I think I prefer the report in a document.' Participant 5
'The site was easy to use and to navigate but I was confused as to how the report section would work in collaboration with the other panelists.' Participant 4

What’s next

We’ll conduct our second round of research with delivery teams, this time testing the Book and Manage journey.

We’ll also continue our research into the naming of the service.

Tags

Content design User research Design Ops