Content design: Michael Soane, Ivy Halstead-Dawber
Interaction design: Laura Power, Dominic Norris
Service design: Kelly Millward
Policy: Tony Leavy
Overview
From September to November 2024, we iteratively designed and tested prototypes with DfE colleagues across various teams. We used our previous designs for superusers as a starting point for colleague designs, adapting journeys and screen designs based on our assumptions and knowledge about colleagues’ user needs. Through this collaborative, iterative approach, the designs for editing and adding a new task evolved. We are now exploring other journeys, such as viewing or cancelling review requests and editing team information.
Round 1
First, we explored the journey for colleagues editing a task on DfE Connect. This was a light-touch round of research in which users interacted with a handful of Figma prototype screens. We asked for feedback on viewing a task and for users’ expectations of editing a task, to help inform designs for the following round of usability testing.
Insights from this round were critical to refining the user journeys and permissions for colleagues. We reduced the approval process, so that authority approval is not required for edits to existing tasks or teams on DfE Connect. This research also highlighted the importance of explaining the review process thoroughly to users, and that colleagues need to be able to raise reviews as urgent.
Round 2
After iterating the colleague user journeys and building out the designs in more detail, in our next round of research we tested the journey for colleagues editing a task a second time.
In this journey we added a question asking users if the change they’re making is also on GOV.UK, explaining that “We cannot announce a date change until the change has been published or scheduled to be published on GOV.UK.” Users particularly liked this question and thought it was a good reminder of the relationship with GOV.UK.
A few pain points arose from this round of research, such as viewing a DfE Connect page and one of its tasks. Similarities between the screens left some users confused about what they were viewing and what interactions were available. The relationship between marking a review as urgent and selecting a publish date also caused some confusion.
We discovered more about users’ expectations of reviews (such as timeframes, providing a justification for urgency and confirmation of next steps), which informed alterations to the user journey. Content and interaction design iterations from this research fed into the next round of research for further testing and refinement.
Round 3
In our third round of testing, we designed the journey for colleagues adding a new task. This journey involved more steps and questions for users to answer, which allowed us to test additional content and interactions.
We adjusted the superuser designs for colleagues, such as grouping questions into logical sections like ‘Dates’ and ‘Audience details’ to reduce cognitive load. We also tailored content to avoid terminology that previous research showed colleagues find confusing, such as removing the term ‘Tags’.
Based on previous rounds of research into editing a task, we iterated the review and publish journey for adding a task. We split the questions into a step-by-step approach, putting more emphasis on whether a request is urgent and asking why it’s urgent. Selecting an urgency reason aligned with users’ expectations, and they liked being able to enter an ‘Other’ reason. Some users felt the journey was quite long, but they understood the importance of prioritising urgent requests for timely review.
We also tested an iterated design for viewing a DfE Connect page and its tasks. By using the table component to show tasks rather than a summary list, we were able to pull through more data about each task and enable users to edit a task more quickly. This was positively received in research and minimised the confusion previously discovered between a DfE Connect page and a task page.
We also tested updated content on the confirmation page. We highlighted in the confirmation banner if a user had marked a request as urgent, and tried to explain more clearly what happens next, as this is important to users. The phrase ‘as soon as possible’ was interpreted differently by users, and some found it too ambiguous.
What's next?
The team is continuing to refine the designs for editing and adding tasks based on insights from all three rounds of research.
The next round of research, in January 2025, will focus on colleagues viewing and cancelling review requests in Manage your DfE Connect data.