Goal 1: discovery peer review

At the end of our discovery, we will hold a peer review with our digital peers. A peer review is a way to get feedback and recommendations on the work we have done.

During this sprint, we agreed on the format of our peer review. We are using a Lucidspark board that is structured as a story that covers:

  • what we did 
  • how we did it 
  • our recommendations 
  • our reflections on the discovery

We are also going to iterate on the Lucidspark board template so it can be used for future discoveries. We have prioritised this artefact because it helps us revisit work from earlier sprints, which will in turn help us write our discovery report. Lucidspark also has planned downtime in the run-up to our peer review.

We are holding short daily collaboration sessions to fill in parts of the peer review. This is helping us break down the workload without distracting the team from our sprint tasks.

Image: Peer review - improving onboarding of grants

Goal 2: discovery report and documentation

We are working on a formal discovery report for teams who pick up our recommendations in the future. We are using PowerPoint because it is a tool available to everyone within the Department for Education. We plan to focus on this report when Lucidspark is unavailable.

We are expecting the report to cover:

  • a summary of our findings and recommendations 
  • what a discovery is 
  • what service the discovery is for 
  • what existing work took place before the discovery
  • what change initiatives are already in progress
  • each journey stage: steps, people, pain points, user research findings, business analysis findings, recommendations
  • a list of our artefacts
  • appendices with detail

Goal 3: refine data source maps and quality assurance

We reviewed our process map with 16 to 19 and adult funding stakeholders to validate the data sources we have identified and the quality assurance processes involved in the onboarding journey.

Adult funding

We were informed that adult funding uses individualised learner record data received from data science for allocation modelling, rather than census data.

A technical specification document is uploaded to Document Exchange and Manage your education and skills funding (MYESF) to illustrate the allocation calculation models to providers.

During these stakeholder discussions, we highlighted that it is essential to maintain a standard data set to compare against the new system outputs for validation, especially as the inputs to the new systems (adult funding data currently being migrated to the Funding Data Service (FDS) and Calculate Funding Service (CFS)) are new for the upcoming year.

Image: Business process maps v3 - grant onboarding - 16 to 19

We were asked whether the migration of adult funding to FDS and CFS is expected to reduce workload.

We explained that, despite the move to new systems, quality assurance effort may initially increase, as outputs from the current Excel model will need to be compared with outputs from the new system to ensure accuracy. However, this extra effort will provide greater reassurance that the data is accurate.
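To illustrate what this kind of parallel-run check could look like, here is a minimal sketch in Python that compares provider allocations exported from the current Excel model with the output of the new system and flags any discrepancies. The file names, column names and tolerance are assumptions made for the example only, not a description of the department's actual data or quality assurance tooling.

import csv

# Illustrative sketch only: file names, column names and the tolerance
# are assumptions, not the department's actual QA process or data model.
TOLERANCE = 0.01  # acceptable difference (in pounds) between the two outputs


def load_allocations(path):
    """Read provider allocations into a dict keyed by provider reference."""
    with open(path, newline="") as f:
        return {row["provider_ref"]: float(row["allocation"]) for row in csv.DictReader(f)}


def compare(current, new):
    """Yield providers whose allocations differ between the two outputs."""
    for provider_ref in sorted(set(current) | set(new)):
        old_value = current.get(provider_ref)
        new_value = new.get(provider_ref)
        if old_value is None or new_value is None:
            yield provider_ref, old_value, new_value, "missing in one output"
        elif abs(old_value - new_value) > TOLERANCE:
            yield provider_ref, old_value, new_value, "values differ"


if __name__ == "__main__":
    excel_model = load_allocations("excel_model_allocations.csv")
    new_system = load_allocations("new_system_allocations.csv")
    for ref, old, new, reason in compare(excel_model, new_system):
        print(f"{ref}: {old} vs {new} ({reason})")

Flagging providers that appear in only one output, as well as those whose values differ, would help catch records dropped or added during migration, not just calculation differences.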

16 to 19

Our discussions with 16 to 19 stakeholders clarified that the Allocation calculation toolkit (ACT) plays a major role in allocation calculation.

The ACT files are uploaded to Document Exchange and MYESF to illustrate the calculation elements used to model and create allocations to providers. For 16 to 19, individualised learner record and census data are equally important.

Image: Business process maps v3 - 16 to 19

National Audit Office audits

It was mentioned that National Audit Office (NAO) audits often require tracking data back from payment to source, covering multiple financial and academic years. While there have been challenges, quality assurance processes have generally met audit requirements.  

The last NAO report suggested improvements in governance, particularly around the decision-making processes in the 16 to 19 technical advisory group. 

Sprint 7 goals

  • analyse our user research, articulate user needs and summarise our findings 
  • continue validating our business process maps and summarise our findings 
  • identify our second round of opportunities 
  • finish our peer review materials ahead of the Lucidspark downtime
