Context
Our primary user group consists of individuals within early years settings who are responsible for hiring new staff and verifying qualifications. This typically includes nursery managers, HR managers, recruitment managers, and nursery owners.
Initially, we decided to focus on early years managers as our primary user group, with the option to include early years practitioners at a later stage if needed.
As a result, during the Beta phase, the Check an early years qualification service was developed primarily around the needs of managers. However, shortly before moving into Public Beta, we began investigating whether practitioners might also benefit from using the service.
Discovery research
Although we had already developed a service, we decided not to test it directly with practitioners. Instead, we conducted semi-structured interviews to understand their awareness of ‘full and relevant’ qualifications, whether they checked their own qualifications, and what their motivations and experiences were when doing so.
We also used these interviews to assess whether practitioners had access to the information our service requires (such as start and award date, location, level, and awarding organisation). This helped us determine whether a qualification checking service could meet their needs.
We found that practitioners showed limited familiarity with the concept of ‘full and relevant’ qualifications and the Department for Education’s (DfE) role in approving qualifications.
Most practitioners assumed that holding a certificate was sufficient proof that their qualification was valid or recognised. As a result, none reported ever checking their own qualifications. Some believed that managers carried out checks during recruitment, but there was limited understanding of how or when this happened.
Practitioners also mentioned that not all the information required by the service — such as the awarding organisation or start date — was always easily available to them.
Later research with practitioners
Survey
Although our initial research showed that practitioners do not check their own qualifications, we decided to run a sector-wide survey to validate these findings. The aim was to determine whether additional educational initiatives were needed and whether the service should expand to include practitioners.
It was agreed with policy that if practitioner awareness of the term ‘full and relevant’ and the need to check their qualifications was below 10%, only light content design tweaks would be made to the service, with most of the work led by policy through external marketing and branding. However, if awareness was above 10%, more substantial changes would be built into the service to accommodate practitioners checking their existing qualifications.
We found that most checks or research take place before obtaining a qualification. Managers had less influence on qualification decisions than anticipated, while peers and colleagues played a more significant role. Awareness of both the term ‘full and relevant’ and the need to check qualifications was just above 10%.
Since awareness was above the 10% threshold agreed with policy, this indicated a meaningful level of awareness worth exploring further. As a result, we recommended considering more substantial changes to the service to better support practitioners who want to verify existing qualifications.
Moderated testing
Following the survey, and before making any changes to the service that had previously tested well with managers, we conducted moderated usability testing with practitioners. The goal was to understand whether they used the service to check their qualifications, how well it supported their needs despite not being designed for them, and to identify areas for improvement.
We found that half of the participants had never checked their qualifications, while the other half assumed that their qualification automatically made them eligible to work in early years settings. Those who had checked their qualifications mentioned that they did not always have access to all the information required by the service, and that some of the content did not feel relevant to them.
Overall, the service performed well, and participants were able to reach the results page. However, practitioners found the content harder to understand than managers did, largely because they were unfamiliar with some of the terms used in the service. They also encountered content intended for managers, which caused confusion when it appeared to apply to them. This suggested that some areas needed clearer explanations, while others felt overly detailed or not relevant to practitioners.
An example of information not being relevant to practitioners is the ‘Your responsibilities as a provider’ section that appears on the qualification result page:

This section was only applicable to managers, but some participants assumed it applied to them as practitioners. This highlighted the need to tailor content for each user group.
Other data insights around practitioners
Data from Ecctis — the service practitioners previously contacted with questions about their qualifications — showed that over a third of the queries received were from practitioners. This suggested that some practitioners were checking qualifications. However, it is worth noting that managers sometimes refer to themselves as practitioners, which could affect the accuracy of this figure.
These insights reinforced the need to consider how the service could better support practitioners directly.
Considerations for our next steps
Running a design crit to explore options
Since research suggested that practitioners could benefit from a tailored journey, we first wanted to explore whether the existing content could be adjusted to be more neutral and suitable for both user groups, before deciding whether to build a parallel journey for practitioners. To do this, we ran a design critique with other designers in the portfolio to gather their insights.
For the session, we highlighted areas of the service originally designed to meet managers’ needs but found to be less relevant for practitioners. We also shared key feedback from our research to guide the discussion.
One of the main outcomes was the need to consider how far we could adapt the managers’ experience to accommodate practitioners, and the potential risks of doing so compared with designing a parallel journey for practitioners.
Identifying areas that could cause distress
After the session, we reviewed the service in more detail and identified areas that could potentially cause distress for practitioners — for example, discovering that their qualification was not considered ‘full and relevant’ by the DfE, or that they could not be counted in the staff:child ratios they expected.
These findings highlighted the need to review how the service supports practitioners at key points in their journey, particularly when results may be confusing or disappointing.
Playing back our options to policy and leadership
Drawing on our research and data, we considered several potential options, but each carried risks or constraints that made it difficult to take forward in full.
With the Ecctis service for qualifications achieved in the UK due to close to new enquiries by the end of the year, building a parallel journey was not feasible given the time and team capacity available.
Softening the existing language within the service to make it suitable for both practitioners and managers also carried risks, as it could compromise the clarity of the journey for managers.
As a result, we recommended taking a minimalist approach in the short term, focusing on supporting practitioners in 3 key scenarios:
- when they do not understand the final result
- when they find out that their qualification is not full and relevant
- when their qualification is not on the early years qualification list (EYQL)
Initial design change
We identified the pages where practitioners could face the greatest risk of distress — the ‘I cannot find the qualification’ and result pages — and decided to focus on those for the Minimum Viable Product (MVP).
While tailoring the service content for practitioners on those pages, we also decided to quickly release a new page to start capturing data on how the service was being used. Before making any major changes — given the amount of development work and policy sign-off this would have required — we wanted to understand how many users were checking their own qualifications versus someone else’s.
To do this, we added a new page at the start of the journey asking whether users were checking their own qualification or someone else’s. This allowed us to gather early insights to inform future design decisions while continuing work on the practitioner-focused journey.

Having this page also allows us to provide different content later in the journey for users checking their own qualifications, and to monitor how many users are doing so overall.
We chose to frame the question in this way because managers often identify themselves as practitioners, so asking about their role (for example, ‘Are you a manager or a practitioner?’) would not have produced clear or reliable data. Asking instead whether they were checking their own qualification or someone else’s made the question clearer for users and provided more accurate insight into the task they were trying to complete.
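As a rough illustration of how this could work, the sketch below shows one way the new question might be captured at the start of the journey and stored so that later pages can vary their content and we can count each answer. It is not the service's actual implementation; the route paths, field names and session handling are assumptions.

```typescript
// A minimal sketch, not the service's real code: the new first page of the journey
// asks whether the user is checking their own qualification or someone else's,
// and stores the answer in the session so later pages can tailor their content.
// Route paths, field names and the plain HTML form are illustrative assumptions.
import express, { Request, Response } from "express";
import session from "express-session";

const app = express();
app.use(express.urlencoded({ extended: false }));
app.use(session({ secret: "replace-me", resave: false, saveUninitialized: false }));

type CheckingFor = "own-qualification" | "someone-elses";

// Show the question (in the real service this would be a GOV.UK-style radios page).
app.get("/whose-qualification", (_req: Request, res: Response) => {
  res.send(`
    <form method="post">
      <p>Are you checking your own qualification or someone else's?</p>
      <label><input type="radio" name="checkingFor" value="own-qualification"> My own</label>
      <label><input type="radio" name="checkingFor" value="someone-elses"> Someone else's</label>
      <button type="submit">Continue</button>
    </form>`);
});

// Validate the answer, keep it for the rest of the journey, then move on.
app.post("/whose-qualification", (req: Request, res: Response) => {
  const answer = req.body.checkingFor as CheckingFor | undefined;
  if (answer !== "own-qualification" && answer !== "someone-elses") {
    return res.status(400).send("Select whose qualification you are checking");
  }
  (req.session as Record<string, any>).checkingFor = answer; // read later to vary content and count usage
  res.redirect("/qualification-details");
});

app.listen(3000);
```

Counting the two answers over time, for example through an analytics event fired when the form is submitted, would give the own-versus-someone-else split described above.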
Following design iterations
As research showed, we needed to better support users checking their own qualifications and ensure they were provided with relevant information.
We began by improving the ‘I cannot find the qualification’ guidance pages and the result page, as these areas were the most likely to cause practitioners confusion or distress. The new content focused on where users could go if they needed further help and what next steps they could take to develop their career in early years. We also added links to the Early Years Careers website so users could access trusted advice and guidance from the DfE about training and professional development opportunities.
Introducing practitioner-focused content on the qualification result page
On the qualification result page, we replaced the ‘Your responsibilities as a provider’ section, which applies to managers, with a new ‘Developing your career in early years’ section. This provides practitioners with general advice on the next steps they can take to progress in their career, based on the qualification level they are checking for, its start date and the result they get.
The example below shows the version of the new ‘Developing your career in early years’ section displayed when users check a level 3, 4, or 5 qualification, started on any date, that is considered full and relevant:

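To make the branching concrete, here is a small sketch of how the result page might pick which version of the section to show. Only the level 3 to 5, full and relevant case reflects the example above; the variant names and the remaining rules are assumptions for illustration.

```typescript
// Illustrative sketch of choosing a "Developing your career in early years" variant
// based on the qualification level, start date and result. Variant names and the
// rules other than the level 3 to 5 full and relevant case are assumptions.
type ResultStatus = "full-and-relevant" | "not-full-and-relevant";

interface QualificationCheck {
  level: number;     // e.g. 2 to 6
  startDate: Date;   // when the qualification was started
  status: ResultStatus;
}

function careerSectionVariant(check: QualificationCheck): string {
  if (check.status === "not-full-and-relevant") {
    // Focus on further help and next steps rather than provider responsibilities.
    return "career-advice-not-full-and-relevant";
  }
  if (check.level >= 3 && check.level <= 5) {
    // Shown for full and relevant level 3, 4 or 5 qualifications, whatever the start date.
    return "career-advice-level-3-to-5";
  }
  // Other levels could have their own variants, possibly split by start date
  // (the specific cut-off used here is only an example).
  return check.startDate < new Date("2014-09-01")
    ? `career-advice-level-${check.level}-pre-sept-2014`
    : `career-advice-level-${check.level}`;
}
```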
Improving content to support both user groups
We also made several content changes that would benefit both managers and practitioners — for example, adding information about level 6 qualifications to the matching qualifications list page so users would not need to navigate to the guidance page to find it, and updating the ‘If you cannot find the qualification’ section to make the language more neutral and relevant to both user groups.
The example below shows the new section on the matching qualifications list page for users checking a level 6 qualification started before September 2014, along with the updated ‘If you cannot find the qualification’ section:

Building the new tailored ‘I cannot find the qualification’ pages
Since we had already used Contentful’s reusable component functionality to build modular content blocks for the managers’ journey in the ‘I cannot find the qualification’ pages, we were able to apply the same approach to create practitioner versions. This allowed us to develop variations tailored to the qualification’s start date and level, as well as to whether the user was checking their own qualification or someone else’s.
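As a sketch of what this looks like in practice, the snippet below shows one way the practitioner variants could be fetched from Contentful using its JavaScript client. The content type ID and field names are assumptions about the content model, not the real one.

```typescript
// Sketch only: fetching the reusable "I cannot find the qualification" block that
// matches the user's situation from Contentful. The content type ID and field
// names are hypothetical; the real content model will differ.
import { createClient } from "contentful";

const client = createClient({
  space: process.env.CONTENTFUL_SPACE_ID ?? "",
  accessToken: process.env.CONTENTFUL_ACCESS_TOKEN ?? "",
});

interface CannotFindQuery {
  level: number;                      // qualification level entered by the user
  startedBeforeSept2014: boolean;     // derived from the start date they entered
  checkingOwnQualification: boolean;  // answer to the new question at the start of the journey
}

// Returns the first matching content block, or undefined if no variant exists yet.
async function getCannotFindContent(query: CannotFindQuery) {
  const entries = await client.getEntries({
    content_type: "cannotFindQualificationBlock",  // hypothetical content type ID
    "fields.level": query.level,                   // hypothetical fields
    "fields.startedBeforeSept2014": query.startedBeforeSept2014,
    "fields.audience": query.checkingOwnQualification ? "own-qualification" : "someone-elses",
    limit: 1,
  } as any); // loose typing to keep the sketch short
  return entries.items[0];
}
```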
Next steps
Ideally, we would have liked to test the new journey with practitioners before the changes went live. However, the upcoming Ecctis UK arm switch-off and limited research capacity in the team meant we decided to release without testing first.
Once research capacity in the team is restored, we plan to test the new journey with practitioners. This will allow us to gather insights and make any necessary changes as part of our iterative design process. We will also keep monitoring how many users check their own qualifications versus someone else’s to inform future design decisions.