Background to the change

All users who receive allocation statements want to see how their numbers have changed – it’s their top priority. That’s why we developed the variance indicator feature, which shows ‘at a glance’ where there are changes in £ funding value.
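To picture the logic behind an indicator, here's a minimal TypeScript sketch – the names and rounding tolerance are our own illustration, not the service's actual code:

```typescript
type Variance = 'up' | 'down' | 'no change';

// Derive an indicator direction from the previous and current £ values.
// The small tolerance guards against floating-point noise in pence.
function varianceIndicator(previous: number, current: number): Variance {
  const difference = current - previous;
  if (Math.abs(difference) < 0.005) {
    return 'no change';
  }
  return difference > 0 ? 'up' : 'down';
}

varianceIndicator(1250.00, 1350.50); // 'up' - funding has increased by £100.50
```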

This becomes a bit more complicated with the General annual grant (GAG) allocation. Its statements are unique in that:

  • they can be revised multiple times, especially when a school is in the process of converting to an academy
  • they’re made up of several sections, covering areas such as the school budget share and the local authority’s minimum funding guarantee
  • each section consists of multiple tables, some with upwards of a dozen rows, all of which need to be analysed to understand a change in funding

Digitising statements to save users time

The existing GAG statement is issued in PDF format, which makes it very time-consuming for finance and data officers in schools to work out where there are funding changes. By bringing GAG into MYESF and introducing a statement comparison feature, we wanted to make it easier for users to see where changes have taken place from one statement to the next.

Screenshot: Our starting point for the comparison options page included a heading, a sub-heading, help text and radio buttons - but we needed to check whether they made sense to users.

We understood there were 2 comparison scenarios:

  • in-year - a user receives an updated allocation statement for the academic year, which tells them about changes to that year’s funding
  • year-on-year - a user wants to compare a new or updated statement with the previous year’s statement
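The difference between the 2 options is which earlier statement you compare against. Here's a rough TypeScript sketch of that choice – again our own modelling, with made-up names, rather than the service's code:

```typescript
interface Statement {
  academicYear: string; // for example '2024 to 2025'
  revision: number;     // 1 for the original statement, 2 and up for revisions
}

type ComparisonOption = 'in-year' | 'year-on-year';

// Choose the baseline statement to compare against, given the statement
// the user is viewing and their full allocation history.
function baselineFor(
  option: ComparisonOption,
  current: Statement,
  history: Statement[]
): Statement | undefined {
  if (option === 'in-year') {
    // In-year: the previous revision of the same academic year
    return history.find(
      s => s.academicYear === current.academicYear &&
        s.revision === current.revision - 1
    );
  }
  // Year-on-year: the latest revision from the most recent earlier year
  // (string comparison works for 'YYYY to YYYY' formatted labels)
  const earlier = history.filter(s => s.academicYear < current.academicYear);
  earlier.sort((a, b) =>
    b.academicYear.localeCompare(a.academicYear) || b.revision - a.revision
  );
  return earlier[0];
}
```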

Digitising the statement also enables us to walk users through more complex funding calculations, by including mathematical operators.

Screenshot: A digital statement allows us to show a funding calculation as a series of table rows, each with a mathematical operator. This makes it easier for all users to follow our workings.
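As a rough illustration of the underlying idea, a calculation like this could be modelled as a list of rows, each carrying an optional operator. The labels come from the statement sections mentioned earlier, but the figures are made up:

```typescript
type Operator = '+' | '-' | '=';

interface CalculationRow {
  operator?: Operator; // shown alongside the row; the first row has none
  label: string;
  amount: number;      // £ value for this step of the calculation
}

// One row per step, in the order the statement displays them
const totalFunding: CalculationRow[] = [
  { label: 'School budget share', amount: 1_250_000 },
  { operator: '+', label: 'Minimum funding guarantee', amount: 35_000 },
  { operator: '-', label: 'Deductions', amount: 12_500 },
  { operator: '=', label: 'Total funding', amount: 1_272_500 },
];
```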

As in other parts of the service, digital statements make it easier to:

  • keep track of statement revisions
  • navigate between sections in the service, using tabs

Our design approach

Our tasks as a design team were to check:

  • does the statement comparison journey used for other allocations in the service work for these users too?
  • does the use of tabs make the journey more complicated?
  • does heavy use of variance indicators in a GAG table make it easier or more difficult for users to understand what’s going on?
  • do variance indicators also work alongside calculations?
  • do users understand the comparison options?
  • can they navigate between them without getting lost?
  • are they familiar with acronyms such as GAG and AY (academic year)?

We started by plotting the end-to-end user journey in Lucid, using screenshots from an older prototype (work to digitise GAG statements had begun a while ago but was then paused).

We wanted more realistic table data in the statements for user research, so we then analysed a set of PDF statements spanning 2 academic years, to see how frequently changes occurred from one statement to the next.

This told us we’d need to include variance indicators on virtually every line of some of the statement tables. Before creating these in the prototype, we did some lo-fi mockups in Figma, to check that we were heading in the right direction.

Screenshot: We quickly created a data table in Figma, populated it with sample content, and added variance indicators to each line of the table, to give us an early sense of how these elements would work together.

Round 1 of user testing

Once we’d built prototypes for new and updated statements, we put them into user testing. We wanted to:

  • see how users navigated through the screens with minimal prompting
  • ask users what they understood from looking at the new comparison options page, and whether this helped them to select the right option
  • see which comparison option users wanted to view first, and whether they could navigate to the other one when prompted
  • see how users responded to the variance indicators, and whether these helped them to understand the funding changes
  • check if users found content difficult to read, and ask how they thought it could be improved

Round 1 quickly revealed there was more work to be done on user journeys. Nearly all users were slightly thrown by the appearance of the comparison options page when they were expecting to be shown a funding table. However, they took it in their stride once they’d got over the surprise.

Later in each testing session, we also saw that many got lost when trying to find their way back to the comparison options page. Some would try to access this from the allocation history page, while others went back to the start of the journey. They missed the embedded link to change their comparison view. And it seemed as though the breadcrumb link back to the page wasn’t helping much either.


Video: a user testing participant navigates through the GAG prototype, illustrating some of their difficulties with switching between comparison options.

It wasn’t all bad news. Users were overwhelmingly positive once they’d reached their chosen comparison. Here are a couple of our favourite quotes:

“The comparison view does what I’d do with a calculator!”

"I think it's really good. I think it would be beneficial to every school and trust to have it in this format where you can vary things. It’s much more beneficial than just receiving the GAG and having to do a lot of the work yourself, comparing year on year."

Iterating for round 2

We decided to rework the comparison options page content before round 2. We’d seen that the heading was too wordy for users to quickly digest its meaning and locate the statement date. And some found it difficult to distinguish between the comparison options on offer. So we refined the content to make it more succinct, and used sub-headings to help users distinguish between the year-on-year and in-year options.

Screenshot: We made changes to the comparison options page heading, to explain the purpose of the page more clearly. We also introduced sub-headings and more descriptive radio button labels, to make the comparison options easier to understand.

We also made other content changes. These included a single style for displaying statement dates in the page title content block, so they were easier to spot.

Screenshot: Statement dates now have a consistent style throughout the GAG journey.

While the GOV.UK style guide tells us that zeroes and other whole numbers should be shown without decimal places to make them easier to read, our users wanted all funding numbers shown in the same way, so we added 2 decimal places to all money amounts.
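For anyone curious how this looks in code, the standard Intl.NumberFormat API supports this style directly – a minimal sketch:

```typescript
// Show every money amount with exactly 2 decimal places, including
// zeroes and whole numbers, so all funding figures read the same way
const gbp = new Intl.NumberFormat('en-GB', {
  style: 'currency',
  currency: 'GBP',
  minimumFractionDigits: 2,
  maximumFractionDigits: 2,
});

gbp.format(1250); // '£1,250.00'
gbp.format(0);    // '£0.00'
```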

We also added variance indicators to the weightings column in the data tables (we’d left them out in round 1 because we were afraid the tables would look too cluttered). Users were very clear that they wanted to see changes highlighted throughout.

Screenshot: Each row of a funding table was packed with information after we added decimal places to whole numbers and variance indicators to the Weighting column. Fortunately, users told us the rows are still easy to scan and read.

Round 2 – a good (partial) result

The variance indicators continued to generate excellent feedback in round 2. Users could instantly see how this feature would save them time, and make it easier to explain changes to school colleagues:

“That’s really good! The red and green arrows are really helpful - I can see how rates, pupils and weightings have changed.”

“I think this is really useful. This statement is not straightforward. I think having the variance analysis is key for everybody. Minimum funding guarantee calculations are always confusing. You've set out how they work VERY clearly… it won’t take long for a finance person to get used to this way of working.”

“This is great. In my job, I want to see what's gone up, what's gone down and what the drivers are. This gives us the full story.”

However, the work isn’t quite done. Feedback from round 2 told us again that users were struggling to find the comparison options page. The journey still wasn’t intuitive or logical to users: reaching the comparison options page through a link in a subsection of the statement forced them to learn a new pattern.

As an interim measure, we showed 1 or 2 users a screenshot of an alternative comparison options page with simplified content. They responded well to it, but we haven’t yet tested it with the rest of the journey.

Screenshot: In the latest design for the comparison options page, we shortened the heading and the radio button labels, to make the content easier to scan.

Refining the comparison journey

Given the issues with the comparison options page in round 1, we were keen to try out other ideas for providing this functionality before round 2. Unfortunately, time constraints meant we could only refine the existing journey.

Then, 2 participants in round 2 independently suggested a toggle feature within the statement, instead of a separate page – confirming there’s a real user need here. The good news is we’ve found more time to refine the comparison journey. We’ll be doing this next, along with research to better understand users’ requirements for saving, printing and downloading their statements.

Screenshot: Our next round of research looks at design and accessibility for the save, print and download functionality.

We’ll report back on what we find.