PROGRAMS

A complete refresh of the 'Programs' product offering for FutureLearn. Each 'Program' is a collection of online courses that can lead to professional or academic accreditation.


Challenge
The 'Programs' offering was introduced by FutureLearn in 2016 with the intent of bridging the gap between short courses and online degrees. However, over the years the offering's performance struggled.

Approach
We used foundational research to identify key areas for improvement. Given time constraints, we plotted these areas on an impact vs effort matrix to narrow them down. We then conducted hypothesis-led user interviews to discover user needs and validate our assumptions. Finally, we used remote user testing and quantitative data to iterate on our new designs.

Outcome
This resulted in a new, up-to-date visual look for the offer and its core pages, and a complete restructuring of the IA, spanning from module components to whole pages. We also introduced a taxonomy for the offer and new naming conventions.


Defining user needs

The project began with hypothesis-led qualitative interviews. The first half of each interview explored thoughts and feelings about the current offering discovery experience on the platform. The second part used rapid prototypes, displayed below, to test our assumptions and the potential directions we could take. My thoughts on the hypothesis-led approach are explained further in my blog post here.

[Image: rapid prototypes from the user interviews]

First iteration

After concluding the user interviews and holding a wash-up with the team, I used common themes from the research to inform our first iteration of the offering. This included improving the key information required at each decision point, emphasizing visual drivers when choosing a course, and restructuring the main taxonomy of the offer.



Further improvements

Having worked out exactly how the new offer should look, including updated core pages, IA, and taxonomy, we conducted remote user testing on the new designs; this method gave us wider reach to an international audience. The research indicated a few pain points around the pricing of the offer. Having addressed these with updated designs, we launched the offer and used live data to inform future iterations.


Final thoughts

This proved to be a great project, with a really good mix of quantitative and qualitative research techniques and multiple iterations on the design. I'm happy with the overall improvements, which laid the foundations to scale this offer going forward.

If I were to do it all over again, I would question our methods for defining user needs. I would consider a survey or a diary study, which could allow deeper or more far-reaching insights than a controlled face-to-face interview environment. However, this project taught me a lot about analyzing research and quantitative data.
