
October meetup: The ethics of digital well-being

October 27, 2020 @ 6:50 pm – 8:05 pm

Digital technologies are changing the way we understand and manage our health and well-being.

Many of us already share health-related information on social media platforms or use smart devices and wearables to track details about our fitness and overall well-being. However, the use of digital technologies may also affect our well-being in more subtle ways, and whether specific technologies, such as our smartphones, are beneficial or detrimental to our well-being remains an open question. In this talk we will present several examples of what can be called ‘digital well-being technologies’.

The purpose of this meetup is to explore a series of ethical questions associated with these technologies, including ‘how private is your sensitive information?’, ‘do digital technologies undermine our autonomy and decisional freedom?’, and ‘who has the right to use digital technologies for the purpose of intervening in our health and well-being?’.

Over the course of this talk, you will learn about a) the techniques that designers use in the development of these technologies, b) the ethical issues the technologies give rise to, and c) how we should respond to these developments.

We will be using Zoom for the meetup. Upon registration, you will receive a confirmation email. Please scroll to the bottom of the email to the ‘Additional Information’ section to see instructions for joining the webinar.


6:50 – 6:55pm: Join us for a virtual drink and snacks while we wait for everyone to arrive

6:55 – 7:00pm: Welcome and introduction of speaker

7:00 – 7:20pm: Presentation of topic

7:20 – 7:40pm: Discussion in breakout rooms with feedback

7:40 – 7:50pm: Report back by breakout groups and further discussion

7:50 – 8:00pm: Wrap-up and mentions of relevant upcoming events, etc.

About the speaker: Dr Christopher Burr

Dr Christopher Burr is an Ethics Fellow within the Public Policy Programme at the Alan Turing Institute—the UK’s national research institute for data science and artificial intelligence (AI). His research focuses on the ethical design, governance, and use of data-driven technologies and the interaction between humans, technology, and society. He works closely with policy-makers and regularly advises public sector organisations about their use of data and AI.

Christopher is the editor of a recent collection titled ‘Ethics of Digital Well-Being: A Multidisciplinary Approach’, and has also published articles on topics in bioethics (e.g. how AI technologies should be used to support or deliver mental healthcare), cognitive science (e.g. how to understand the risks of intelligent systems influencing and shaping human judgement and choice behaviour), the study of well-being (e.g. how we can use digital technologies to measure and promote individual and social well-being), and human-computer interaction (e.g. how to design intelligent systems that promote intentional use and self-determination).

Prior to starting at The Alan Turing Institute, Christopher held postdoctoral research posts at the Oxford Internet Institute, University of Oxford (2018–19) and the Department of Computer Science, University of Bristol (2017–18). He completed his PhD in Philosophy of Cognitive Science at the University of Bristol in 2017.

Join us

If you haven’t already, please join the Ethical Reading movement! Individual membership is free. Simply click on the ‘Login / Register’ link at the top of any page of the website. You can sign up to join our mailing list by filling in the form on the homepage.

We also welcome organisations to join us as Partners. Please contact us at [email protected] to find out more.
