Designing for Impact: A User-Centred Approach to Data Visualisation

For the past year, I have been working with B Team and Sustrans on an exciting new dashboard that lets users explore the data of the biennially released Walking and Cycling Index. The index explores behaviours and attitudes in cities across the UK, with the aim of understanding infrastructure needs and addressing policy challenges. Putting the data in the hands of policy-makers empowers them to create exactly the analyses they need to design effective interventions in their local area, enabling more citizens to walk, wheel, run or cycle, reducing pollution and benefiting public health. The first phase of the dashboard is now live, and more dashboards will be developed in the coming months.

Working with B Team meant that this was a data visualisation project with a thorough user experience and design process behind it. That isn't common when designing data visualisations, and it gave me the opportunity to learn a lot. Here are some of the lessons I took away that you can apply in your own projects.

User-Centred Design Techniques

1. Conducting user research

The first step in the process was to conduct a survey, as well as extensive interviews with carefully selected potential users, to understand what they wanted to achieve with the dashboard. These interviews were conducted by user experience experts, who collated and analysed the results. Together with Ben Coleman from B Team, I used these insights to create the design drafts that we presented to the Sustrans team.

Having clearly defined user needs was incredibly helpful throughout the process. Whenever opinions differed, we could refer back to the research and determine which route would best meet the needs users had expressed.

2. Running design workshops with the team

Not strictly part of the user research, the design workshops were an approach that I had never used before but will implement in future projects. Once we had agreed on the topics and structure for the dashboards, we brought together about ten members from different departments and disciplines within Sustrans. Working through a Miro board, each member of the team sketched out an idea of how they might visualise an aspect of the dashboard. All sketches were collated, presented by their creators and discussed. Finally, we voted on our favourites.

This gave me further insight into how different people were thinking about the data and how they were used to interacting with dashboards. It also gave us useful starting points to incorporate into our designs.

3. User testing the interactive dashboard

We went through two rounds of user testing, with users receiving tasks on a video call and completing them while Ben observed. This allowed him to spot common difficulties users faced in finding specific information or completing the actions they were interested in.

The insights from testing gave us clear instructions on things that we had to change to make the experience as fluid as possible. It was also the step that taught me the most about how I will design dashboards in the future.

Things I learned

1. Users don't read instructions

Despite our best efforts to provide clear and concise guidance, it became evident that users tended to overlook or ignore instructions when navigating the dashboard, relying on their intuition rather than reading the text, even when it was prominently displayed. Based on this finding, we re-designed some aspects of the dashboard to be more intuitive, using symbols and following the natural user flow we observed in the testing phase. By minimising the reliance on textual instructions and maximising visual cues, we made the experience far smoother.

2. Users don’t always interpret colour as we intend

In data visualisation, colour plays a crucial role in conveying information. While certain colour associations and meanings are commonly accepted within the field, we found that users don't necessarily share them. By considering the diverse interpretations users might bring, and by adding the context needed to read the colours correctly, we can create more inclusive and accessible visualisations.

My suggestion to colour negative values orange and positive values blue, in line with what is usually taught as data visualisation best practice, was rejected on the basis that the orange felt warm, friendly and positive, whereas the blue felt cold and was perceived as more negative. In the end, we didn't use this scale at all; the final colours were chosen by Sustrans' designer, who refined them based on user feedback throughout the process.
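For readers unfamiliar with the convention, the sketch below shows roughly what I had proposed: a diverging scale anchored at zero, shading negative values orange and positive values blue. It is purely illustrative; the cities, values, palette and plotting library are my own assumptions for this example and are not what the dashboard actually uses.

```python
# Illustrative sketch only: a hypothetical diverging orange/blue scale anchored
# at zero, in the spirit of the "orange = negative, blue = positive" convention.
# The cities, values and palette below are made up, not taken from the Index.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LinearSegmentedColormap, TwoSlopeNorm

cities = ["City A", "City B", "City C", "City D", "City E"]
change = np.array([-4.2, -1.5, 0.8, 2.3, 5.1])  # hypothetical year-on-year change (pp)

# Orange for negative values, near-white at zero, blue for positive values
cmap = LinearSegmentedColormap.from_list(
    "orange_blue", ["#e66101", "#f7f7f7", "#0571b0"]
)
norm = TwoSlopeNorm(vmin=change.min(), vcenter=0.0, vmax=change.max())

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(cities, change, color=cmap(norm(change)))
ax.axhline(0, color="grey", linewidth=0.8)
ax.set_ylabel("Change (percentage points)")
ax.set_title("Diverging scale: orange below zero, blue above")
fig.tight_layout()
plt.show()
```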

3. The importance of labels

Labels used to be an afterthought for me; I would go with whatever felt right in the moment. In this project, we put a huge amount of thought into the right way to refer to and describe actions and warnings, and into how exactly to display the questions asked in the survey. For instance, user testing showed us that the term “aggregate” wasn’t readily understood, so we used the more accessible “combine” instead.

These changes seem small but can make a significant difference to the way that a user navigates and ultimately feels about the dashboard.

The Power of User Experience Techniques

Working on a project led by a user experience expert and incorporating user testing significantly enhanced the design and functionality of the final dashboard. Through user feedback, we were able to identify pain points and areas for improvement that we would otherwise have overlooked. This collaboration allowed us to iterate and refine the dashboard, resulting in a more user-friendly and effective tool. I will be making far more use of these techniques in future projects.