We knew that one of our biggest challenges was supporting users with varying needs to narrow down their potential options. Fundamentally, this is about helping them define, and then refine, their search to find the best results for them.

The challenge

Users have different preferences and prioritise different factors when trying to find the right course. Returning to our common mindsets, people with an exploratory mindset, for instance, could struggle with direct search because they don’t know what to search for.

We wanted to test 2 assumptions: that young people often aren’t aware of all their options and value seeing things they hadn’t previously considered, and that some people might prefer to rule out options rather than actively describe what they want to do.

Because of the technology-led approach to this project, it was also particularly important to consider technical feasibility: how we could prevent users feeling overwhelmed by too many results, while making sure the results they did get were as relevant to them as possible.

What we tested and found

As we described in our post about mindsets and user journeys, we came up with the idea in early alpha of a ‘guided journey’ to help those users who might feel overwhelmed by their search options to come up with useful results.

whiteboard showing early sketch of guided journey pathway

In the best agile fashion, the idea started as a scribble on a whiteboard based on a conversation between 2 members of the UCD team. The idea was to help users set up useful filters in advance of seeing their search results, so that the results would be more immediately relevant and less potentially overwhelming.
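As a rough sketch of that idea, the guided journey’s answers could be mapped to pre-applied search filters before any results are shown. The answer keys and filter names below are illustrative assumptions, not the service’s actual data model:

```python
# Illustrative sketch only: the answer keys and filter names here are
# assumptions, showing how guided-journey answers could become
# pre-applied search filters.
def build_filters(answers):
    """Turn a user's guided-journey answers into search filters."""
    filters = {}
    if "location" in answers:
        filters["near"] = answers["location"]
        # Default travel distance is a guess at a sensible fallback
        filters["max_distance_miles"] = answers.get("travel_distance", 5)
    if answers.get("learning_style") in ("academic", "practical"):
        filters["course_type"] = answers["learning_style"]
    return filters

filters = build_filters({
    "location": "LS1 1UR",
    "travel_distance": 10,
    "learning_style": "practical",
})
# filters now holds location, distance and course type constraints
```

The point of this structure is that the search itself never runs unfiltered: by the time results appear, the journey has already narrowed them.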

The alternative approach – browsing nearby courses – would return search results quicker than the guided journey. But without any filters applied in advance, those results could be overwhelming in the number of results and the variety of subjects and course types they’d cover.

So in testing we wanted to understand whether we needed both the guided journey and browse options and which types of users would be drawn to each.

Here are some of the things we explored.

Choosing between the journeys

how would you like to search screen

We tested adding a question after users click ’start now’ to specify which user journey they wanted to use. In early prototypes we’d had multiple calls to action on the homepage itself, but we found this confused users, and the single ’start now’ button, similar to other government services, was the preferred option.

We found that this question was well understood, with most users drawn to the second option. One participant, with admirable honesty, said "I was a bit lazy so went for personalised results as thought this would be easier. I'd obviously go for that one."

Understanding location preferences

enter a town, city or postcode screen

how far would you be happy to travel screen

In early prototypes we’d experimented with placing the location question after the prompt about what someone wanted to study, but we swapped these round as location felt like the starting point for all the journeys. This was an important finding from discovery and was reinforced by one participant who said "This is a really important question."

Interestingly, young people didn’t tend to think of distance in terms of miles. Their mental models generally focused more on travel time. In practice, no-one had a problem in selecting an option and continuing the journey.

Another participant said that "At the age of 16 you don’t have a car and you can only get so far. Most realistic option, maybe 5 [miles] but only if you know you could get a lift."

Taking inspiration from social media

help us improve your search results screen

During ideation in early alpha, we wondered if we could borrow an approach from social media by allowing users to say they wanted to see ‘more results like this’ or ‘fewer results like this’. Tapping into their snap judgements on course results – in a similar way to swiping left or right on potential dating matches – could enable the AI to understand the user’s preferences and tailor their results accordingly. We also suspected that, from a psychological perspective, many people find it easier to make comparative judgements than absolute ones.
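A minimal sketch of that mechanism, assuming hypothetical course tags and a simple weight-based score (not the service’s actual model), might look like this:

```python
# Minimal sketch, assuming hypothetical course tags and a simple
# weight-based score - not the service's actual model.
from collections import Counter

def record_judgement(prefs, course_tags, liked, step=1.0):
    """Nudge the weight of each tag up or down after a snap judgement."""
    for tag in course_tags:
        prefs[tag] += step if liked else -step

def score(course_tags, prefs):
    """Comparative score: courses sharing liked tags rank higher."""
    return sum(prefs[tag] for tag in course_tags)

prefs = Counter()
record_judgement(prefs, {"practical", "engineering"}, liked=True)
record_judgement(prefs, {"academic", "maths"}, liked=False)

courses = {
    "Engineering BTEC": {"practical", "engineering"},
    "A level Maths": {"academic", "maths"},
}
# Re-rank courses so those matching liked tags appear first
ranked = sorted(courses, key=lambda name: score(courses[name], prefs), reverse=True)
```

The design choice here mirrors the comparative-judgement point above: users never have to describe what they want, only react to examples.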

We introduced 3 screens at the end of the guided journey to show courses with a range of distances, subjects and course types. Users could respond to these to say whether they wanted to see more or less like these examples.

What we found was that most users got stuck at this point in the journey and were unsure how to continue, often thinking that these courses were the search results rather than examples used to inform the results.

As one participant said "It might be better to show suggestions for other courses as part of search results."

Iterating on learning style

what's your preferred learning style screen

How we asked about learning style in the early Figma prototype.

do you prefer academic or practical courses screen

How we ask in the final alpha prototype.

We knew from discovery that young people have different preferences around learning style. Some prefer academic courses and others are much more suited to vocational courses. So we wanted to ask about this as part of the guided journey, particularly as we knew that young people were often biased towards one or other option (for example, due to attitudes from their school or parents) but with hindsight wished they’d known about all different types of courses before they’d made their decision.

We suspected that the words ‘academic’ and ‘vocational’ wouldn’t be well understood by young people. But we didn’t know what would work better and there was no research on this from other parts of DfE.

We experimented with ‘study-based’ vs ‘hands-on’ courses, which users didn’t have a problem with, though we realised these terms weren’t specific enough as even hands-on courses could have study-based elements.

After exploring terminology in more detail with a focus group of 12 teenagers in Leeds, we found that the term ‘academic’ was widely used and understood. But no-one in the focus group knew what a vocational course was, even though their own course was vocational! As soon as we mentioned these courses were more practical, they immediately understood. So in the final alpha prototype we’ve used ‘academic’ vs ‘practical’ wording and, importantly, included examples in the options. In general, we’ve found that including examples like this is really effective at reducing young people’s cognitive load when answering a question, by minimising any ambiguity about the categories.

Participants found the revised question easy to understand. One said "I quite enjoyed how you can choose a mix of both. That's really good."

Originally we asked a separate question about whether users were interested in earning money, as we hypothesised this could help determine whether they’d want to see apprenticeships. But this caused confusion: many participants thought it related to having a part-time job while studying. In the end we removed this question, as we grouped apprenticeships with other practical courses.

Additional needs

do you have any additional needs or barriers screen

In our early Figma prototypes we explored the idea of asking as part of the guided journey if users had any additional needs or barriers. The idea was that the search results could then filter out courses that weren’t suitable.

Those participants who saw this question felt that it made the service feel more inclusive but we didn’t test this with NEET users who were the group most likely to have such additional needs.

After speaking to data science colleagues, we also realised that the datasets we’re using wouldn’t support this question as providers don’t currently specify whether courses are suitable for students with specific needs. So we’ve removed this question for now but might return to it in the future.

Post-results filtering

filter options

In both the guided journey and browse options, users would be able to filter their results after seeing them. This would be particularly important for browse where users hadn’t specified any filters in advance.

We’ve explored and tested a limited range of initial filters, with the idea of expanding on these in beta.

The entry requirements filter was based on the finding that virtually all users really wanted to quickly understand whether they met the requirements for a course. There are various ways we could explore this further. For example, some participants suggested it would be useful if they could add their GCSE results so that courses they didn’t qualify for could be automatically hidden.
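That participant suggestion could work something like the sketch below. The field names and 9 to 1 grade values are illustrative assumptions, not the real course dataset:

```python
# Hypothetical sketch of the suggested GCSE-results filter; field names
# and 9-1 grade values are illustrative assumptions, not the real dataset.
def meets_requirements(results, requirements):
    """True if every required subject meets its minimum grade (9-1 scale)."""
    return all(results.get(subject, 0) >= min_grade
               for subject, min_grade in requirements.items())

def filter_courses(courses, results):
    """Hide courses the user doesn't qualify for."""
    return [course for course in courses
            if meets_requirements(results, course["entry_requirements"])]

courses = [
    {"name": "A level Physics", "entry_requirements": {"Maths": 6, "Physics": 6}},
    {"name": "Engineering BTEC", "entry_requirements": {"Maths": 4}},
]
visible = filter_courses(courses, {"Maths": 5, "Physics": 4})
# only the course whose requirements are met remains visible
```

Of course, real entry requirements are messier than a subject-to-grade mapping, which is one reason this remains something to explore rather than a committed feature.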

In testing, the overall user behaviour was to look at the search results and then narrow them down with filters. In general, the filters we presented made sense to users and they were able to use them easily.

One participant said "I think that is what the main filters need to be, especially the keyword to narrow it down."

What we’ll do next

Based on what we’ve learnt, our initial priorities for beta are to:

  • explore other options for the distance filter, for example by showing a map or specifying maximum travel time
  • look at whether there’s any scope to ask providers to start specifying whether courses are suitable for students with additional needs
  • consider whether any additional filters are required for the minimum viable product