Dr Nejra Van Zalk is Head of the Design Psychology Lab at Imperial’s Dyson School of Design Engineering.
The link between social media and online harms for young people has been much debated, and the current pandemic has underlined the particular threat that online harms pose to vulnerable users. Contributing to a report on the COVID-19 ‘infodemic’ by the Digital, Culture, Media and Sport Committee, I provided evidence that the addictive features of social platforms are a concern with regard to the spread of misinformation to children.
Alarmingly, exploiting vulnerabilities in the human psyche is a common feature of the design process for many digital innovations. For example, addictive features, including harmful or factually inaccurate content, are often added by design rather than by accident, so as to increase usage.
Information generated by clicks or smart-device commands is now used as a proxy for how an individual is feeling, making users perfect targets for advertising and misinformation, a practice akin to emotional manipulation.
To make such innovation easier, tech companies have adopted a user experience research method called A/B testing, similar to Randomised Controlled Trials (RCTs), as a way to deliver continuous interventions that change the experience of platforms and increase use time. Unlike RCTs, however, these tests are conducted behind the scenes, without consent, and without the rigorous ethical considerations that form the cornerstone of research.
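In essence, an A/B test randomly splits users into a control group and a treatment group, then compares an engagement metric between the two. A minimal sketch in Python of that mechanism (the function names and the session-time metric here are hypothetical, for illustration only, not any platform's actual implementation):

```python
import random
import statistics

def assign_variant(user_id: int, experiment_seed: int = 42) -> str:
    """Deterministically assign a user to control ('A') or treatment ('B').

    Seeding with a string derived from the user and experiment keeps the
    assignment stable: the same user always sees the same variant.
    """
    rng = random.Random(f"{user_id}:{experiment_seed}")
    return "A" if rng.random() < 0.5 else "B"

def engagement_lift(session_minutes: dict[str, list[float]]) -> float:
    """Difference in mean session time: treatment minus control."""
    return (statistics.mean(session_minutes["B"])
            - statistics.mean(session_minutes["A"]))

# Example: treatment users averaged 14 minutes vs 11 for control.
lift = engagement_lift({"A": [10.0, 12.0], "B": [15.0, 13.0]})
```

The point of the sketch is how little machinery is needed: a stable random split plus a metric comparison, run continuously on live users, is exactly the experiment structure that an RCT would subject to consent and ethical review.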
Despite these gloomy facts, there are positive developments. Recently, the Information Commissioner’s Office (ICO) released its Age-Appropriate Design Code, aimed at companies whose content is likely to be accessed by children and young people. It includes 15 standards aimed at increasing children’s online privacy, such as a ban on disclosing data to third parties, high privacy settings by default, and refraining from nudging techniques that increase usage.
Together with Ali Shah, ICO Head of Technology, I road-tested this code in my “Design Psychology” module at the Dyson School of Design Engineering. Third- and fourth-year design engineering students created browser add-ons that filtered out inappropriate material when accessed by children, as well as digital interventions focused on teaching digital privacy to parents and children built into phone apps. This exercise demonstrated that the oft-repeated maxim by tech companies that such regulations would inhibit growth or creativity does not hold true.
Policymakers must urgently address these issues, including:
- Holding companies accountable to the code
- Encouraging industry to work more closely with behavioural scientists
- Treating technological applications as planned behavioural interventions.
Moving forward, I plan to conduct further road-tests of the new design code together with the ICO and my Master’s and undergraduate students. This exercise, besides providing an opportunity for investigation, helps to reinforce for students the importance of considering children and young people in technological innovation. I am also conducting research in my lab on emotional privacy together with colleagues from Design Engineering, which will further our understanding of perceived privacy transgressions in people’s emotional lives.