UN austerity and human rights report highlights Big Data risks in the UK

Big Data and artificial intelligence are, perhaps surprisingly, a featured highlight in the UN expert’s report on extreme poverty in the UK. Technology is frequently proffered as a solution to poverty and other social rights failings (see for example the recent Astana Declaration on Primary Health Care). However, Philip Alston, the UN Special Rapporteur on extreme poverty and human rights, focuses instead on its potential to erode democracy.

Alston, an independent UN expert, examines the impact of the UK's commitment to transform the public service sector through full automation, “aided by data science and artificial intelligence”. He warns that while the government talks about a system that is ‘digital by default’, in reality a digital welfare state is emerging. “The impact on the human rights of the most vulnerable in the UK will be immense,” he warns, and the report documents “the gradual disappearance of the postwar British welfare state behind a webpage and an algorithm.”

Even in a modern, high-income state such as the UK, a digital divide exists. Alston presents alarming statistics about people who cannot easily participate in a digital system, despite government assurances that “most people are at ease and competent online”:

  • 47% of those on low income use broadband internet at home
  • 42% of those who are unemployed do their banking online
  • 43% of those on low income do their banking online
  • 21% of the UK population do not have five basic digital skills
  • 16% of the population is not able to fill out an online application form
  • 52% of Citizens Advice clients found the Universal Credit online application process difficult
  • A Department for Work and Pensions (DWP) survey in June 2018 found that only 54% of claimants were able to apply online without assistance

The digital barrier to welfare support (Universal Credit) was found by Alston to be particularly hard on women, older people, people who do not speak English, and people with disabilities. At the same time as people are turning to public libraries for support with online applications, the libraries are experiencing severe budget cuts. The report states, “In Newcastle alone, the first city where ‘full service’ Universal Credit was rolled out in May 2016, the City Library has digitally assisted nearly 2,000 customers between August 2017 and September 2018.”

The digital divide is not the only problem posed by the move to an automated government. There is also a lack of transparency in automated decision-making processes. The Real Time Information system integrates data from various UK government agencies, including the tax-on-earnings information submitted by employers. When errors are made in the data, resulting in inconsistent information about a claimant, the programmed assumption is that the automated system is correct and the claimant is wrong. According to the DWP, a team of 50 civil servants works full-time resolving the 2% of millions of monthly transactions that are incorrect. Claimants told Alston that they often have to wait weeks to be paid the correct amount, even when they have written proof, in the form of payslips, that the IT system is wrong.

Similarly, the investigation of fraud is controlled by private sector IT vendors who have built “risk-based verification systems which flag claimants for low, medium or high risk of fraud and error, thus allowing local authorities to investigate high risk cases more closely.” This system is soon to be extended into a “fully automated risk analysis and intelligence system for fraud and error” that will use artificial intelligence to prevent fraud and error. But Alston finds an absence of transparency, stating, “The existence, purpose and basic functioning of these automated government systems remains a mystery in many cases, fueling misconceptions and anxiety about them.” Public awareness and understanding of these systems is imperative. He writes:

There is nothing inherent in Artificial Intelligence and other technologies that enable automation that threatens human rights and the rule of law. The reality is that governments simply seek to operationalize their political preferences through technology; the outcomes may be good or bad. But without more transparency about the development and use of automated systems, it is impossible to make such an assessment. And by excluding citizens from decision-making in this area we may set the stage for a future based on an artificial democracy.

Alston also raises the limitations of newly established institutions in the UK that use ethics, rather than human rights, to address the challenges of an automated government. “Ethical concepts such as fairness are without agreed upon definitions, unlike human rights which are law. Government use of automation, with its potential to severely restrict the rights of individuals, needs to be bound by the rule of law and not just an ethical code.”

Paul Hunt, in his analysis of the UN report, praises Alston for ‘exposing the gaping hole at the heart of our national human rights system: its failure to explicitly recognise social rights’, and endorses the report's insistence on the need for legislative recognition of social rights. Another social rights risk posed by big data and artificial intelligence has been identified by the Human Rights, Big Data and Technology project: the use of data-sharing partnerships in healthcare. Such risks could, however, be mitigated by conducting human rights impact assessments before embarking on artificial intelligence projects. The project has also commented on the global digital divide and the likelihood of increasing inequities if digital technologies are employed to address poverty and other Sustainable Development Goals.

Alston states that a machine learning system may be able to beat a human at chess, but it may be less adept at solving complicated social ills such as poverty.

Carmel Williams is the executive editor of Health and Human Rights Journal, and a senior research officer in the Human Rights, Big Data and Technology research project at the Human Rights Centre, Law School, University of Essex, UK.

Disclaimer: The views expressed herein are the author(s) alone.