Introduction

Early in the research phase of the project, a need was identified to summarise what “progress” or a “good school” looks like, and to visualise over time how recommendations were being implemented.

Driving concept

Part of the underlying service is …

Prototype 1.1

Initial concept

Initially, the dashboard acted as a junction that enabled users to directly access specific areas of content and data entry points available throughout the service. Specifically:

  • School profile connected directly to a view where users could review and adjust high-level information about the school.
  • Objectives connected the user to a screen where the school could define its objective set, which would in turn narrow down the number of questions asked during the Audit (also called assessment) process.
  • Audit connected the user to an interactive Q&A system that took them through a range of category questions; this would become active once the objectives were set.
  • Results connected the user to a series of recommendations generated by the service, based on the questions asked during the Audit; this would become active once the Audit was complete.

[Screens: prototype 1.2 dashboard in various configurations; results screen variation]

Prototype 1.2

Audit to Results

Text

Prototype 1.3

Results to Dashboard

“I would do the assessments first and then set objectives”

As the design of the service evolved, so did the functionality of the “dashboard/check your progress” page. “School profile” and “objectives” were removed from the display, and an enhanced version of results became the focal point of the page itself, where users would see a visual representation of the school’s technical maturity. This visual representation is broken down into a selection of critical measures driven by the underlying rules derived from the technology support model. The result was the following:

  • School profile was removed as an editable feature
  • Objectives was removed as a configurable item and placed as a filter in the recommendations section.
  • Audit was renamed “technology assessment” and continues to connect the user to an interactive Q&A system that takes them through a range of category questions.
  • Results was merged into the dashboard view.

New and Returning user experience: Pre threshold

Text

Functions and features: Post threshold

After the school has completed enough questions for the dashboard to become active, the user is presented, post-login, with a view displaying the following*:

  • Global navigation system
  • System messages “important” box
  • High level indicator of digital maturity
  • Granular measures and narrative
  • Recommendations metrics
  • Self-assessment progress display
  • Help access

*not all features are proposed for MVS

Several iterations of the dashboard have been produced for evaluation (A, B, C, D and E).

Dashboards and GDS
