3 December 2019
6.00 – 7.30 pm
Admission is free and all are welcome to attend. Refreshments will be available.
Data collection is part of everyday life, but are we really aware of how our personal data may be used, and have we truly given our consent to this process? What are the implications of data-driven technologies potentially influencing all areas of our lives, from the standard of healthcare we receive to the news we view online and even how we decide to vote in elections?
Follow us on Twitter @HrbdtNews
We need to think about how Big Data and Artificial Intelligence affect our human rights. The digital age has brought about a global paradigm shift in how we communicate. Nearly everything you do through your smartphone, your computer or your home assistant produces data, e.g. who you talk to, where you go and what you like. All this data can be scrutinised by complex forms of analysis using algorithms, artificial intelligence and other digital tools. This in turn yields highly personalised insights about ourselves, our habits, our desires and our role in society. It can help you decide what to buy online and recommend places for you to visit. However, data collected in the past can also be used to make predictions about your future. One of the big problems is that you don’t always know what data is being collected, who is collecting it, or how it is being used or shared.
Being connected and digital is practically indispensable today. However, living in a digitally connected society also means the risks are magnified and reverberate through many areas of our lives. Speakers from The Human Rights and Big Data Project will explore the challenges and opportunities that big data and artificial intelligence are bringing to human rights. We hope you can join us as we chat about these issues.
Clicking Those ‘I agree’ Buttons – The Challenges and Implications of Giving Consent Online
Sabrina Rau (Senior Research Officer, HRBDT Project)
The ‘datafication’ of life means individuals continuously generate data through their online and offline activities, often without being fully aware of the kind of data they generate, how that data is collected, retained or processed, or what the implications of such uses may be. Consent plays a vital role in enabling various actors to collect this information and use it for a variety of purposes, and new regulations heavily govern its use. To what extent, however, is the agreement we give really freely given, specific, informed and unambiguous? Developments in online consent seem to be at odds with the norms underpinning the central role assigned to individual consent in data gathering and use. In this talk we will explore why consent plays such a central role in data processing, what control we really have when it comes to our data, and how it is used.
Google: Making Us Healthier or Exploiting Our Data?
Amy Dickens (Doctoral Student, HRBDT Project)
Data-driven technologies like artificial intelligence hold the promise of revolutionising the NHS, providing more efficient and better-quality care. With their vast wealth and access to huge datasets, Google and the other technology giants are at the forefront of this technological revolution. Increasingly, the resource-stretched NHS is sharing data with these companies in a bid to gain access to the latest innovations in data-driven technology. However, as the collaboration between DeepMind Health and the NHS illustrates, public-private partnerships of this kind can exploit patient data in ways which not only threaten patient privacy but pose broader risks to the sustainability and accountability of the health system. In this talk, Amy explores whether Google is making us healthier or exploiting our data.
‘Living In a Bubble’: Disinformation and Online Targeting During Elections
Dr Elena Abrusci (Senior Research Officer, HRBDT Project)
Every time we open our social media or browse the web, we are surrounded by misinformation and disinformation, commonly known as ‘fake news’. Catchy and shocking headlines are everywhere on our Facebook and Twitter feeds, and it is often extremely difficult to distinguish a reliable news source from a false or altered one. Thanks to the huge amount of data that tech companies, ad agencies and other private actors hold about us, the way we receive this information can also be biased, and we can end up in so-called ‘echo chambers’ and ‘filter bubbles’, where we always see the same kind of information on the basis of our profile and behaviour online. With general elections coming up, this selective funnelling of information could have a significant impact on our human rights, potentially threatening our right to participate in public affairs and ultimately our democracy.