Put the right people in control of shaping digital and data-driven technologies

Blog | 2020-09-30 | By: Digital Council for Aotearoa New Zealand
Robot and human hand reach out to touch each other
World Image/Bigstock.com

Last week we talked about using the digital dilemmas of COVID-19 to inform design. Understanding how someone might solve a challenge they face surfaces new insights for design. People become resources, not passive recipients of a service.

Which is where participatory design comes in.

This week we share the participatory design methodology we’re using to bring attention and energy to the digital dilemmas we currently face in Aotearoa. We’re using this methodology to bring a diverse range of people and experiences into our conversations and deliberations about automated decision making before we take our advice to the government.

Participatory design

Participatory design helps those using or impacted by a system to have a say in how it works.

In partnership with Toi Āria we’ve been travelling the country looking at how people whose lives are impacted by automated decision making (known to many as algorithms) define trustworthiness. Trust and trustworthiness are key factors in unlocking the potential of digital technologies for social and economic wellbeing.

What are their current levels of trust, and what do they need in order to trust these technologies?

The ‘Comfort Board’

These levels of trust are measured using the ‘Comfort Board’. The Comfort Board helps people to locate themselves and their level of comfort in relation to a number of algorithm-related scenarios on a trust-benefit matrix.

Image of the Comfort Board with the trust-benefit matrix: the vertical axis shows trust from 1–7 (7 being high) and the horizontal axis shows benefit from 1–7 (7 being high). The matrix shades from dark blue in the bottom-left corner to light blue at the top right.

Imagine you’re on the list for a kidney transplant and an algorithm gives you a priority score based on how urgently you need the surgery and how much you will benefit from it compared with other people.

Or an algorithm helps identify your daughter’s risk of needing to be on a benefit when she leaves school so that she can be referred to community providers to help her avoid this.

Where would you locate yourself on the trust-benefit matrix? What would increase your level of comfort in the scenarios? What themes and concerns do you consider most important to your level of comfort?
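To make the matrix concrete, here is a minimal sketch of how placements on a Comfort Board could be recorded and summarised. The (trust, benefit) scales and the quadrant labels follow the matrix described above; the scenario names, sample data, and the `summarise` function are purely illustrative assumptions, not Toi Āria’s actual tooling or data.

```python
from statistics import mean

# Each participant stands at a (trust, benefit) point, both on a 1-7 scale.
# The placements below are invented for illustration only.
placements = {
    "kidney transplant priority score": [(6, 7), (5, 6), (3, 4)],
    "school-leaver risk referral": [(2, 3), (3, 2), (4, 5)],
}

def summarise(points):
    """Return the average placement and a simple quadrant label."""
    t = mean(p[0] for p in points)
    b = mean(p[1] for p in points)
    quadrant = ("high" if t > 4 else "low") + " trust / " + \
               ("high" if b > 4 else "low") + " benefit"
    return round(t, 1), round(b, 1), quadrant

for scenario, points in placements.items():
    t, b, q = summarise(points)
    print(f"{scenario}: trust {t}, benefit {b} -> {q}")
```

A summary like this only captures where people stood; the workshop conversations about *why* they stood there carry the real insight.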

Thirteen pairs of feet standing on the Comfort Board, which is laid on the ground; most are grouped at the top right.
The Comfort Board in use

The Toi Āria team, who developed the Comfort Board approach in 2016, say their role is to support workshop participants to identify, in their own words, the most important factors that would raise their level of comfort (or discomfort) with the use of algorithms. The scenarios have deliberate ‘information gaps’ or ambiguities built in to elicit discomfort and draw out participants’ own voices in expressing their concerns or desired criteria for comfort. Toi Āria then make sure the research findings maintain the voice of participants with integrity and authenticity.

Next week we’ll share some of the interim insights from this approach.

Brokering safe spaces and places to examine big issues

Council members have felt hugely humbled by the people who have shared their backgrounds, life journeys and lived experiences with us. “What blew my mind was the direct impact of automated decision making algorithms on their lives in ways that I could not have ever imagined — and I have imagined many situations and lived through many of them first hand in my life journey so far. This is exactly why we are doing this research,” says Council Chair, Mitchell Pham.

The Council is honoured to have partnered with Toi Āria to create space and place for participatory design. A place to locate ourselves and our lived experience in safe, respectful ways alongside others. A place where we could be heard without judgement. Express vulnerability without shame. Examine our beliefs without influence.

If only the internet could be such a place. If only we could plug people into that kind of matrix…

 

*******

ACTIVITY SUMMARY

  • In line with having those impacted by policies at the centre of design, we commend Citizens Advice Bureaux for their Leave No One Behind Campaign to Address Digital Exclusion. CAB is an ally of the Digital Council, and we support this initiative to put people’s needs at the centre of public service design and policy decisions.
  • Anyone who’s interested can enrol to take part in an online participatory design workshop at NetHui on 12 October.
  • Last week VOYCE Whakarongo Mai, ethnic communities, Māori and youth were part of participatory design workshops in Christchurch and Palmerston North. This week we’re looking forward to workshops with BLENNZ and with taxi drivers.
  • We have summarised insights from two town hall events with Māori stakeholders and will be sending these to participants. We’re particularly interested in exploring ways of bringing rangatahi together with technology organisations and those working in the digital skills space. We heard that lots of great things are happening, but they often occur in pockets and in isolation. We’re interested in how we might partner to bring this mahi together and explore opportunities to scale it.
  • In between workshops we had an evening Zoom session with the Centre for Data Ethics and Innovation in the UK. Because the CDEI is a similar body to the Digital Council, we were interested in their experiences as an advisory body, their approach to providing advice, and the issues in AI they were grappling with. The CDEI has released an AI Barometer analysing the most pressing opportunities, risks and governance challenges associated with AI and data use across five sectors in the UK.