UN Forum on Business and Human Rights 2017 – Addressing Access to Remedy in the Digital Age: Corporate Misconduct in Sharing and Processing Personal Data

15.00 – 16.25, 27 November 2017

Palais des Nations, Room XXII

Summary Report

On 27 November 2017, the Human Rights, Big Data and Technology Project and the Permanent Mission of the Federal Republic of Germany to the United Nations organised a parallel session to the United Nations Forum on Business and Human Rights on ‘Addressing Access to Remedy in the Digital Age: Corporate Misconduct in Sharing and Processing Personal Data’.


Panellists:

  • Ms Nighat Dad, Founder and Executive Director, Digital Rights Foundation
  • Dr Krisztina Huszti-Orban, Senior Research Officer, The Human Rights, Big Data and Technology Project, University of Essex
  • Professor Sheldon Leader, Co-Investigator, The Human Rights, Big Data and Technology Project and Director, Essex Business and Human Rights Project, University of Essex (Moderator)
  • Mr Bernard Shen, Assistant General Counsel, Microsoft
  • Professor Changrok Soh, Director of the Human Rights Centre, Korea University and Member of the Advisory Committee to the Human Rights Council

Professor Sheldon Leader, the moderator, outlined the aims of the panel and introduced the panellists. He explained that the session would focus on the problems of transfer of data in two directions: first, from one business to another (horizontal transfer) and, second, between government and business (vertical transfer). Through a series of questions directed to the panellists, the session examined these two sets of relationships, underlining their common features, their differences and the human rights challenges surrounding them. The discussion was structured around two rounds of questions posed by the moderator: the first addressed data-sharing among businesses and the second data-sharing with governments. The discussion was followed by a Q&A session with the audience.

Data-sharing between businesses

Professor Changrok Soh reflected on the broader human rights implications of (unlawful) data-sharing, including but going beyond the right to privacy. First, he highlighted that new technologies that observe and analyse our behaviour in a specific environment (such as consumers' behaviour in supermarkets), and rapidly developing technologies such as machine emotion-awareness, may be designed to safeguard our privacy by concealing our identity, yet they can still manipulate our decision-making processes, threatening human agency itself. Second, he warned that data-sharing technologies are reshaping social interactions and structures by disrupting the dynamics of information sharing. He argued that these developments will eventually force the reconceptualisation of certain human rights, or even the creation of new ones.

Responding to a question on the challenges that States face when regulating data-sharing by businesses, Dr Krisztina Huszti-Orban warned that current regulation may not be equipped to deal with today's challenges, potentially leaving a protection gap. She pointed out that the established human rights framework may be inadequate to capture the human rights implications of the digital age. Even recent attempts to update data protection regulation, such as the EU data protection package comprising the General Data Protection Regulation (GDPR) and the Directive on data processing with regard to criminal offences, which give individuals more control over their data, might not be sufficient. She added that data protection has so far rested on the premise that each individual is not only responsible for their data but also has the requisite control over it (the idea of data self-management). Today, however, it is difficult or next to impossible for any individual to track their data: none of us is fully aware of the data collected, retained and used about us, directly or indirectly. Additionally, the traditional distinction between personal and sensitive data, on the one hand, and other (often anonymised) forms of data, on the other, is now challenged and should be re-evaluated; for instance, it has been repeatedly shown that combining anonymised datasets may reveal personal information. The data chain also obscures causal links and potential harm. Finally, cross-border data-sharing adds a further layer of difficulty.

Following up, Mr Bernard Shen focused on measures taken by companies like Microsoft to ensure the protection of users' privacy. He underlined that a core principle for Microsoft is to earn and maintain customers' trust. First, with regard to control and consent, companies have been taking measures to provide meaningful and understandable privacy information that allows users to exercise control over their data. For example, Microsoft provides easy-to-use tools such as the privacy dashboard, where users can view, copy and request correction of their data. In some circumstances, further use of data may be appropriate without the need for further consent (e.g., where there is a reasonable expectation of such use, minimal or no impact on users' privacy, or sufficient safeguards are in place). Furthermore, he added that Microsoft has committed to respond within 30 days to any questions or concerns that users may have. With regard to cross-border transfers in the EU context, he explained that Microsoft supports the Privacy Shield and the standard contractual clauses, and programmatically flows data protection requirements down to its suppliers. Microsoft also works with the applicable data protection authorities, as needed, to address any complaints. Outside the EU context, Mr Shen stated that Microsoft strives to engage with the relevant privacy regulator in order to address any concern.

Finally, Ms Nighat Dad pointed out that companies do not uphold the same privacy standards in developing countries and the Global South as they do in developed countries. One of the reasons for this divide, she explained, is that developed countries have stricter legal standards than Global South countries. She cited the example of Pakistan where, despite the constitutional guarantee of the right to privacy, there are no data protection laws. Indeed, she highlighted that there is no culture of respect for privacy, even at home; as a result, there is also no debate on companies' practices and abuses when it comes to privacy. She recognised that governments need to put in place relevant legal frameworks, but also highlighted that governments often lack knowledge and understanding of the data mining practices of big tech companies. She therefore concluded that companies should apply the same standards in Northern and Southern countries, specify where data is stored and who has access to it, and put in place an accountability process. Furthermore, she added that it is important to ensure that developing countries and the Global South become part of the ongoing discussion on the privacy and data protection standards that are gradually developing.

Data-sharing with governments

Dr Krisztina Huszti-Orban addressed a question on the need to introduce an adequate protection framework that would also guarantee the right to a remedy in case of unjustified requests. She underlined that the basic and most important requirement is a legal framework that complies with international human rights norms and standards: if the State's legal framework is not fully compliant with international standards, then companies cannot ensure that they comply with those standards either. Data-sharing requests must have a clear legal basis in law that is sufficiently accessible, clear and precise for the public and the implementing authorities, and the law should establish adequate safeguards.

Furthermore, Dr Krisztina Huszti-Orban maintained that the same protection standards should apply to cross-border sharing of data. Currently, data-sharing between countries is generally covered by mutual legal assistance arrangements, which have been criticised for providing lengthy, inefficient and over-bureaucratised processes. There have been attempts to reform these arrangements or to find more time- and resource-efficient alternatives. One approach side-steps these agreements altogether, with authorities asking companies directly for information or requiring companies to store all data within the respective State's jurisdiction. Such practices, she underlined, are worrisome as they tend to remove protections. More recently, States have been negotiating agreements, such as the US-UK bilateral arrangement, that allow one country's authorities to request information directly from a company in the other country without going through that country's authorities. These may be helpful in speeding up processes, but they should not eliminate protection standards, and it is crucial that such arrangements are sufficiently transparent.

Lastly, Dr Krisztina Huszti-Orban mentioned two additional challenges. First, different kinds of data enjoy different levels of protection, and she maintained that these discrepancies should be remedied. For instance, in most jurisdictions the collection of metadata alone requires no judicial authorisation or oversight, whereas such authorisation is more likely to be required for access to content data; yet today individuals can be identified through their metadata alone. Second, when data collectors are based outside the jurisdiction of the State where the data is collected, that State has limited power to regulate such practices. These gaps have very real human rights implications.

Responding to a question regarding the measures companies can take to verify the lawfulness of government requests, Mr Bernard Shen referred first to the process followed to handle such requests. He underlined that there is no unhindered government access to user data: there are processes governments need to follow, each request must comply with the rule of law, and there is no blanket access to data; the company provides only the data requested. Another way to protect data from potential snooping is encryption. Finally, to ensure transparency, Microsoft publishes information and statistics on government requests every six months. However, he pointed out that requests for user data may come with secrecy orders (or gag orders), prohibiting the company from publicly disclosing information about the order. There are legitimate circumstances where such secrecy is justified (e.g., risk of harm to witnesses or other people, undermining legitimate law enforcement investigations, or destruction of evidence), but the concern is when secrecy orders are overused. For example, Microsoft examined an 18-month period and found 2,576 requests from the U.S. government accompanied by a secrecy order; of these, 68% came with no expiration date. Microsoft filed a lawsuit in 2016 against the U.S. government, arguing that, except in limited circumstances, consumers and businesses have a right to know when the government accesses their emails or records. In October 2017, the U.S. Department of Justice announced a new policy on secrecy orders to reduce their number, end the practice of indefinite secrecy orders, and ensure that every secrecy order is carefully and specifically tailored to the facts of the case. Another example of Microsoft taking legal action is the 'US search warrant' case, in which the company received a request to search and hand over email data stored outside the U.S. The company filed a complaint that it won before the Court of Appeals for the Second Circuit (the case is currently pending before the Supreme Court). This case highlights the urgency of governments and other stakeholders working together to find constructive solutions that protect both public safety and human rights.

To the question regarding the possibility of future-proofing regulation and relevant safeguards so as to ensure that human rights are adequately protected against the adverse effects of technological developments, Professor Changrok Soh underlined that the recent phenomenon of governments demanding data from companies indicates that private companies increasingly collect and hold more data than State agencies. While there are often legitimate reasons for such government requests, the privacy of individuals should still be protected; encryption, for instance, helps safeguard it. At the same time, however, companies cannot be trusted to self-regulate. Efforts should focus on establishing global standards and codes of conduct outlining the responsibilities of companies in the digital age. The harmonisation of standards and pressure on States to increase transparency can protect individuals against future abuses. In addition, the best tool to ensure the protection of privacy is a vibrant and healthy civil society that pressures companies to comply with human rights standards.

Ms Nighat Dad addressed the question of whether a balance can be achieved between the right of access to remedy for government and corporate wrongdoing and the necessity of confidentiality, particularly in a national security context. She underlined that constant requests made in the name of national security are the ones that put this balance most at risk. She again cited the example of Pakistan, where measures taken in the name of counter-terrorism and national security are no longer questioned by the public. She warned that national security should be narrowly defined. The Pakistani government uses various laws invoking national security to request information from companies, and companies comply under the threat of being asked to leave the country, which has happened in the past. A balance needs to be found.

Public discussion

  • The issue of possible contradictions between national and international standards was raised with regard to government practices that violate human rights standards, either by imposing national legislation that does not comply with international standards or by pressuring companies to disregard those standards. Questions were asked about possible solutions to overcome such challenges.

Dr Krisztina Huszti-Orban argued that there is no one-size-fits-all solution. In certain countries, there is a robust system through which companies can challenge such requests, and they should use this possibility. In countries where safeguards are unavailable or ineffective, companies should look for other ways to put pressure on governments. As a last resort, if there are serious human rights concerns and the company has no way to protect the privacy and interlinked human rights of its customers, then the company should consider disengaging. Another aspect to take into account is that, when faced with requests made in the name of national security, companies rarely have sufficient information to assess the legality of such requests. Professor Sheldon Leader underlined the emergence of a duty of care, even for parent companies, when it comes to protection.

  • Another question raised the matter of data ownership and whether consumers can recall their data. It was also asked whether companies can operate as custodians of these data. A follow-up question asked whether the UN Guiding Principles are the only source of international obligations for companies, or whether we must always rely on national legislation to identify companies' obligations.

Mr Bernard Shen pointed out that the question expressed concern over users' ability to control their data, and noted that Microsoft has been taking steps to ensure that users can access and control their data. He then noted that legal orders from government (e.g., warrants or other court orders) generally do not contain details of the law enforcement investigation, and that it is the role of the government (e.g., the judiciary) to ensure that users' rights are protected under the rule of law when issuing such orders for user data. Professor Sheldon Leader closed the response by underlining that the answer to the question of who owns the data will keep changing as we move forward.

  • The final group of questions focused on the direct obligations of companies. Concerns were raised regarding the risks of companies being targeted by hackers who acquire large amounts of users' personal data. Lastly, a commentator suggested that the discussion made companies appear as trustees of data who are forced by States to disclose their users' data, but that this is misleading, as many data-processing companies are themselves involved in the commission of human rights violations.

Professor Changrok Soh underlined that there are international mechanisms in place that could be used, but called for stronger inter-State cooperation. Dr Krisztina Huszti-Orban added that when privacy violations result in other infringements of human rights (for instance, when a company shares information about human rights defenders that leads to their arrest, enforced disappearance or other violations or abuses), it is often difficult to draw the causal link between the infringement of privacy and the other violations. However, she argued that where there is clear information that human rights defenders, political opposition figures and journalists are being persecuted, the company should refrain from sharing such information. She also underlined the usefulness of binding regional human rights mechanisms, which could be used alongside the universal monitoring mechanisms to ensure the right to privacy. Mr Bernard Shen highlighted the importance of cybersecurity and emphasised that many companies, including Microsoft, are committed to respecting human rights in their products and services. Ms Nighat Dad underlined the need to include the Global South and developing countries in these discussions: in those countries, laws aimed at tackling blasphemy and similar 'offences' have multiplied alongside technological developments, governments are using such laws to violate privacy, and it is impossible to talk of remedies if we do not know how this information is shared and with whom.

Disclaimer: The views expressed herein are those of the author(s) alone.