What we did

We conducted research with 5 users to understand if the service assessment team could administer a discovery peer review booking, plan the session and publish the report in the prototype.

Users were a mix of members of the DfE service assessment team and cross-government colleagues who also work in service assurance and assessment, including the Ministry of Defence (MoD), Home Office and Department for Work and Pensions (DWP).

What we found

Overall, the service was well received, and users could navigate their way from accepting a booking to publishing a report. But the next steps, or the end of the journey, weren’t always clear.

There were tasks included in the service that weren’t necessary and, equally, questions that were needed earlier in the journey, for example, the project code. As a result, we made some changes based on what we learned.

Project codes used to identify services and projects from the start

We’d previously removed the project code question for teams booking a discovery peer review because, in research, none of the teams we spoke to were aware of the code. We knew that the service assessment team had been adding codes at a later stage. In this round of research, the lack of a project code was an immediate ‘red flag’ to the service assessment team.

‘No project code is a red flag as this has been recently emphasised by my superiors to have more focus on going forward. When it comes to names, teams sometimes change the name of the project, this can create inaccurate data.’ Participant 1

‘Teams should have this information from the business partners.... in the past it’s been optional but is more mandatory now.’ Participant 1

We’ve re-added the project code question

We learned that going forward, senior leadership are keen to have project codes matched to a service and used as the main identifier throughout a project or service lifecycle.

If a service or project does not have a code, a team’s business partner can create one and add it as an identifier to a project on the DDaT portfolio tracker.

We know through research that not all projects are listed on the tracker, as they may not go through spend control, but they can still be added to the tracker by the business partner and given a unique project code that can be used as an identifier through the lifecycle of a project.

We’ve iterated the design of the question to include guidance to help teams locate their project code. We added a link to the DDaT portfolio tracker and guidance on finding out who their business partner is. The question is not mandatory and there is a yes/no option so that requests can still go through without the project code, as we do not want to block teams from having a peer review. The code can still be added at a later date as a back-up.

Image showing a question with guidance. The question reads, do you know your project code? The guidance links out to the DDaT portfolio tracker and business partner information.
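As a rough sketch of that logic, the optional question could be modelled something like this. The type and field names here are illustrative assumptions, not the service’s actual code:

```typescript
// Illustrative model of the optional project code question; the names
// are assumptions, not the service's actual code.
interface PeerReviewBooking {
  serviceName: string;
  hasProjectCode: boolean; // the yes/no option on the question
  projectCode?: string;    // only captured when hasProjectCode is true
}

// The booking is never blocked on a missing code; instead, a missing
// code is flagged so it can be added later via the business partner.
function needsProjectCodeFollowUp(booking: PeerReviewBooking): boolean {
  return !booking.hasProjectCode || !booking.projectCode;
}
```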

Users expected the icon next to their names to display information

Users hovered over the icon that appears next to their name to show they are working in the service. The names come from DfE’s Active Directory. Users expected the icon to display information when they hovered over it, and referred to being shown next steps or information on the process.

We removed the icon

Removing the icon, combined with exploring how to make the end point of the discovery peer review journey clearer and the tasks more succinct, should make the service clearer to use.

Top navigation caused some confusion as to where to go

The top navigation is the same for every user, whether you’re a delivery team member, a peer reviewer or a service assessment team member. Options are Home/Administer assessments/Manage/Assessors/Reports/Support/Survey.

Top navigation showing the options listed above.

We heard things like:

‘Administer is a tricky word - I do like the dashboard, and with the different stages’ Participant 2

‘Administer assessments - I don't know what this is, it doesn't explain to me; administer is a vague word - who is that for?’ Participant 1

Iterated the descriptions in the top nav and created views relevant to user groups

We’ve redesigned the prototype to have 3 separate views. Depending on whether you are a delivery team member, a peer reviewer or a service assessment team member, you will only see the navigation you need.

We’ve got rid of ‘Administer assessments’, and ‘Manage’ will cover what it needs to for the delivery team and service assessment team, showing users the content they need.

Top navigation showing fewer options, including Manage, Book, Reports and Survey
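As an illustration, role-based navigation can be derived from a single source, like the sketch below. The role names follow this post, but the exact items shown per role are assumptions, not the prototype’s actual configuration:

```typescript
// Illustrative sketch of role-based navigation. The role names come from
// this post; the items listed per role are assumptions.
type Role = 'delivery-team' | 'peer-reviewer' | 'assessment-team';

const navigation: Record<Role, string[]> = {
  'delivery-team': ['Home', 'Book', 'Manage', 'Survey'],
  'peer-reviewer': ['Home', 'Manage', 'Reports', 'Survey'],
  'assessment-team': ['Home', 'Manage', 'Assessors', 'Reports', 'Survey'],
};

// Each user only sees the navigation they need for their role.
function navFor(role: Role): string[] {
  return navigation[role];
}
```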

Discovery peer review reports are published internally

In the prototype, we ask the service assessment team whether they want to publish the report internally, both internally and externally, or not at all. This caused some confusion as, presently, all reports are published internally.

'Curious to understand why it might have been marked do not publish.' Participant 2

What we’ve done as a result

From what we saw in research, and looking at the current user journey for the assessment team and reports, we have removed this option. The service is internal only and used only by DfE staff.

Currently, discovery peer review reports are all published internally. We will follow this pattern in the service. We will review it when it comes to alpha and beta assessments as there will be a slightly different journey if they also need to publish some reports externally for Cabinet Office.

Peer reviewers or assessors follow a mental model when writing reports

The report template in the service contains 3 boxes: one is for overall comments, and the other 2 are ‘understand users and their needs’ and ‘understanding the problem’.

This was designed based on evidence from working with lead assessors and Heads of Profession (HoPs) to understand the basic outcomes of discovery peer reviews. We explored the essence of what teams need to come away with to have ‘done a discovery’, and also simplified the report, which currently refers to all of the Service Standard points. This is not needed for a discovery.

We saw that the service assessment team were confused as to why we were referring to only 2 of the Service Standard points.

‘You are mentioning some service standards, but why not saying only some standards are applicable and not mentioning the others?’ Participant 1

They also talked about assessors’ mental models and how they like to take a scatter approach to completing a report, noting what teams have done well and what the team needs to work on.

Iterated content in the report to make it clearer

We changed the headings in the 2 boxes to mirror the mental model of the peer reviewers writing the report, and will cover in guidance for peer reviewers which Service Standard points should be considered for a discovery peer review.

Image showing the report with 2 simplified boxes asking what the team have done well and what they need to work on, replacing the questions about 2 service standards.

Role of peer review observer isn’t needed

We currently ask for the role of the peer reviewer and also for the role of the observers of the discovery peer review to be recorded. While the role of the reviewer is needed, the role of the observer is not necessary and is not data that the service assessment team use.

'Role of observer - interesting, don't always have a role - sometimes just want to look at it from a project perspective.... If from other gov departments, they wouldn’t be listed here.’ Participant 1

Removed the role of the observer and just asked for their name

We removed the action of adding the role of the peer review observer and now just ask for their name.

Image showing a text area for the observer’s name, with no reference to their role

Documenting a Slack channel isn’t needed

One of the tasks is to document the URL of the Slack channel the service assessment team create for the peer review panel.

The team confirmed that it is not a channel they create, but a Slack conversation. It is just for the reviewers during the session and is not something the service assessment team will look back on or document. Also, if they did create Slack channels, people often leave channels because they are in so many, and it would mean creating many more channels.

Removed the task page requesting a Slack channel is created

The service assessment team can continue to make a Slack conversation outside of the service; there is no need to create or document a channel.

The conversation is simply a place for the peer reviewers to speak their mind and work out thoughts during the assessment; it does not prevent the service assessment team from administering an assessment.

Image showing the page in the service where the assessment team could add Slack channel information, which has now been removed.

The end point for tasks for the service assessment team was not clear

Once the assessment team publish the report and return to the task list, it is not clear that this is the end of the journey.

'I expect a notification to us to say it's been published - a summary of what's happened - that would be handy - for my files’

Designed a final email to go to everyone involved in the peer review for a distinct end

We’ve added a Notify email that is automatically sent to all users of the discovery peer review service: the product manager, delivery manager, peer reviewers, observers and service assessment team. It shares a link to the report, confirms it’s been published and asks them to complete a feedback survey.

Email content in Notify:

Subject: Peer review report published

Body content:

A peer review report for (name of discovery) has been published.

You can view the report at: (shared link to the report).

Please take a few minutes to complete this survey about your experience of the discovery peer review: (link to feedback survey). It will help to improve the service.

Thanks to everyone who was involved in the discovery and for helping us to build great services in DfE.

Body content ends.

Screengrab of the Notify email, with the content listed above.
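For anyone curious how this could be wired up, GOV.UK Notify’s Node.js client can send a templated email like the sketch below. The template ID and personalisation key names are placeholders, not the service’s real values:

```typescript
// Sketch of sending the 'report published' email via GOV.UK Notify's
// Node.js client (notifications-node-client). The template ID and
// personalisation keys are placeholders, not the service's real values.
import { NotifyClient } from 'notifications-node-client';

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY as string);

async function sendReportPublishedEmails(
  recipients: string[], // everyone involved in the peer review
  discoveryName: string,
  reportLink: string,
  surveyLink: string
) {
  for (const emailAddress of recipients) {
    await notifyClient.sendEmail('REPORT_PUBLISHED_TEMPLATE_ID', emailAddress, {
      personalisation: {
        discovery_name: discoveryName, // (name of discovery)
        report_link: reportLink,       // (shared link to the report)
        survey_link: surveyLink,       // (link to feedback survey)      },
      },
    });
  }
}
```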

What's next

  • Testing the user journey for assessing a discovery peer review

  • Thinking about how to show progress in the service and tasks completed

  • Reviewing the evidence so far on naming the service and sub-service for assuring a service, and working with the service assessment team to come to a conclusion to test for MVP
