Smart meter data: the Government’s at it again
https://www.openrightsgroup.org/blog/smart-meter-data-the-government-at-it-again/ (8 May 2024)

Back in October 2022, ORG exposed Government plans to snoop on UK residents’ smart meters and energy consumption data. In July 2023, the Government backtracked on these plans following ORG’s campaigning, in particular by reducing the frequency with which smart meter and energy consumption data would be collected.

All’s well that ends well, except that we were all deceived: on 17 August 2023, the Government updated their privacy policy to reflect that “energy consumption data is being collected more frequently than monthly”, reverting the improvements they had previously made. By collecting energy consumption data at this granularity, the Government makes it possible to work out which home appliances are being used and at what time, revealing lifestyle choices and potentially sensitive information about targeted households.

There is much to learn from this: the Government is also planning to introduce the Data Protection and Digital Information (DPDI) Bill, which deregulates data protection in the UK and makes it easy for the Government to collect information about you held by private sector entities, whether it’s your energy provider, your local supermarket or your General Practitioner. This story also discredits the Information Commissioner’s Office approach to public sector enforcement: naming and shaming obviously fails to bring any results when the Government can count on a weak ICO that refuses to take legal action against public sector bodies.

The problem with smart meters

When are you usually away from home? Did you get a good night’s sleep, or did you drive while sleep deprived? What time did you leave your home? Did the time it took you to get from home to your workplace mean that you broke the speed limit? Or, in a custody battle: have you ever left your 11 year old child home alone? How often, and for how long? If you are claiming benefits while looking for work, can you explain why you have been away for a couple of days? These are just some of the questions that energy consumption data can answer, or at least guess at: maybe you just fell asleep with the lights on, but the data is against you and it’s now up to you to demonstrate that you didn’t drive while sleep deprived.
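As a minimal sketch of how such guessing works (ours, for illustration only: the readings, the threshold and the slot labels are invented, and this is not the Government’s or any supplier’s actual analysis), consider how even one day of half-hourly readings supports occupancy inferences:

```python
# Hypothetical half-hourly consumption readings (kWh) for one day.
# All values, including the baseload threshold, are invented for this sketch.
half_hourly_kwh = [0.08, 0.07, 0.08, 0.09, 0.35, 0.90,   # early morning: kettle, shower
                   0.10, 0.08, 0.07, 0.08, 0.09, 0.08,   # daytime: house apparently empty
                   0.75, 1.40, 0.60, 0.30, 0.25, 0.12]   # evening: cooking, TV, lights

BASELOAD = 0.15  # kWh per half hour from the fridge, standby devices, etc.

# Any slot well above baseload suggests someone was home and active.
occupied = [i for i, kwh in enumerate(half_hourly_kwh) if kwh > BASELOAD]
print("Apparently occupied half-hour slots:", occupied)

# Months of such data support far richer guesses: usual wake-up times,
# days spent away from home, and which appliances run and when.
```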

In other words, smart meter data can reveal your lifestyle habits and choices, and be used by third parties to make inferences whose validity or fairness may be doubtful. This is why we previously complained about Government plans to collect this information for “fraud detection” purposes under the Energy Price Guarantee. The Government initially planned to use smart meters to collect this data granularly, store it for ten years, and share it with credit reference agencies, local authorities and debt collectors. In our legal correspondence with them, the Government also failed to address concerns around the lack of documentation concerning this data collection and the lack of transparency about the amount of data they wanted to collect.

These risks and shortcomings are not a given or unavoidable, but a political choice: smart meters can deliver their benefits and functionality without collecting granular and revealing data about their users, nor does our Government need to know how many times you use your microwave or when you turn off your bedroom lights in order to combat fraud.

The Data Protection and Digital Information Bill

Speaking of political choices, the Government is advancing legislation that would make it easier for Ministers to turn any private company into an informant: the DPDI Bill will make it always legal to share data for law enforcement, national security or fraud detection purposes, or to answer a request made by a public sector body. This is a significant departure from the legal safeguards provided by the UK GDPR.

Data sharing today can happen only if the potential impact of this information on individuals has been considered beforehand, and if doing so does not expose individuals to unjustified risks. Would you expect data given to a sexual health clinic to be shared with the Government for immigration control purposes, or with private insurers to assess your lifestyle risks? Would sharing this data in this manner affect your right to access healthcare services? The answers to these questions would determine the lawfulness of data sharing under the UK GDPR, but not under the DPDI Bill, where data sharing is always legitimate insofar as the goal being pursued (for instance, immigration control) has been sanctioned by the Government.

Also, the DPDI Bill would give law-making powers to UK Ministers, allowing them to introduce new purposes that make data collection and sharing always legitimate via Statutory Instruments (SIs), which are never properly scrutinised by Parliament: indeed, the last time the House of Commons voted against the Government to reject an SI was 1979, almost 45 years ago.

The ICO approach to public sector enforcement

New technologies inherently have the potential to be weaponised and to exacerbate power imbalances, raising delicate and complex questions about their impact on, and interference with, human rights and societal expectations. This is why, in the UK, we have the ICO: a public watchdog, independent from the Government, with powers that allow it to ask these questions and to enforce against public and private entities that abuse personal data.

However, the decline in public sector standards and the institutional failures that have characterised the past five years of UK political history are biting at the ICO as well: following the highly politicised appointment of the new Information Commissioner, the ICO adopted a new strategy for public sector enforcement that relies on public shaming and “very angry letters” rather than legally binding enforcement actions and penalty fines. The results of this approach are rather obvious: the watchdog that should enforce the law is letting the Government off the hook, and public sector bodies can simply ignore the law and get away with it.

Indeed, the smart meter saga is a good example of the consequences of this approach: the Government first rolled out smart meters and pledged they would never share this information without the consent of their users. Then they started collecting this data for fraud detection purposes. Finally, they were pressured by ORG to reduce the amount of data being collected, only to revert this decision as soon as we turned our heads away.

Learning from mistakes

Smart meters are just one of the many new devices and appliances that can be connected to the internet and collect data about us. They raise issues and questions that are valid elsewhere: we don’t need to sacrifice our privacy in order to reap the benefits of innovation, nor expose ourselves to patronising guessing games by powerful entities, nor should we trust the Government to do the right thing.

As technology progresses and its risks multiply, we need stronger protections against abuse, more accountability from our decision-makers, and stronger oversight and guardrails. These lessons need to be brought into the current debate around the DPDI Bill: against a Government that wants to undermine our rights on its way out, the House of Lords has the opportunity to save the day, oppose the Government’s dangerous plans, and restore the functioning of the ICO.

Online Safety proposals could cause new harms, warns Open Rights Group
https://www.openrightsgroup.org/press-releases/online-safety-proposals-could-cause-new-harms/ (8 May 2024)

Ofcom’s consultation on safeguarding children online exposes significant problems regarding the proposed implementation of age-gating measures. While aimed at protecting children from digital harms, the proposed measures introduce risks to cybersecurity, privacy and freedom of expression.

Ofcom’s proposals outline the implementation of age assurance systems, including photo-ID matching, facial age estimation, and reusable digital identity services, to restrict access to popular platforms like Twitter, Reddit, YouTube, and Google that might contain content deemed harmful to children.

Open Rights Group warns that these measures could inadvertently curtail individuals’ freedom of expression while simultaneously exposing them to heightened cybersecurity risks.

Jim Killock, Executive Director of Open Rights Group, said:

“Adults will be faced with a choice: either limit their freedom of expression by not accessing content, or expose themselves to increased security risks that will arise from data breaches and phishing sites.

“Some overseas providers may block access to their platforms from the UK rather than comply with these stringent measures.

“We are also concerned that educational and help material, especially where it relates to sexuality, gender identity, drugs and other sensitive topics, may be denied to young people by moderation systems.

“Risks to children will continue with these measures. Regulators need to shift their approach to one that empowers children to understand the risks they may face, especially where young people may look for content, whether it is meant to be available to them or not.”

Open Rights Group underscores the necessity for privacy-friendly standards in the development and deployment of age-assurance systems mandated by the Online Safety Act. Killock notes, “Current data protection laws lack the framework to pre-emptively address the specific and novel cybersecurity risks posed by these proposals.”

Open Rights Group urges the government to prioritise comprehensive solutions that incorporate parental guidance and education rather than relying largely on technical measures.

ORG Response to the ICO “consent or pay” consultation
https://www.openrightsgroup.org/publications/org-response-to-the-ico-consent-or-pay-consultation/ (30 April 2024)

Open Rights Group welcomes the opportunity to respond to the Information Commissioner’s Office consultation on the “consent or pay” model, which is or could be relied upon for processing personal data on the basis of consent.

‘Consent or pay’ consultation

ORG’s response to the ICO consultation on processing personal data on the basis of consent.


‘Consent or Pay’ and Data Protection

Interferences with our right to privacy and data protection are admissible only insofar as they represent a necessary and proportionate means to achieve another, worthy objective. This principle originates not only from the rights-based approach of the UK GDPR, but from the broader obligations that stem from the European Convention on Human Rights and the western legal tradition the UK adheres to, which is rooted in the rule of law and in the protection that individuals, their subjectivities and their personalities enjoy.

In practice, this principle becomes material in the UK GDPR through the requirement that data processing rest upon a legal basis that authorises such processing a) for the pursuit of a legitimate aim, b) within given boundaries that ensure necessity and proportionality, and c) upon conditions that protect affected individuals from abuse.

A “consent or pay” model, as a means for online service providers to obtain consent for behavioural profiling and personalised advertising, manifestly fails to meet this test. In summary:

  • Data processing for behavioural profiling is not necessary to protect a vital interest or pursue a legitimate aim. Relying on behavioural profiling to fund an online platform via advertising is a deliberate decision to adopt a funding model that interferes with users’ privacy and right to data protection in the absence of a justifiable reason.
  • As such, subjecting individuals to behavioural profiling is an unjustified interference with one’s right to privacy and data protection, unless the individual freely exercises their agency to accept such interference. It is thus unlawful unless individuals freely provide consent, and the conditions laid out by the UK GDPR for consent to be valid are met.
  • The adoption of the “consent or pay” model for the practical purpose of enabling behavioural profiling and funding an online service via advertising violates individuals’ agency by forcing them to consent to such processing, and by denying them the opportunity to withdraw their consent without detriment. Thus, it cannot constitute valid and freely given consent.
  • Indeed, reliance on the “consent or pay” model reveals that online service providers themselves do not believe users would provide their consent if they were able to refuse to be subject to behavioural profiling free of charge. The objective of circumventing the law is neither legal nor legitimate, and does not deserve protection.
  • Behavioural profiling inherently exposes individuals to predatory advertising, discrimination, and potential violations of their habeas corpus and fundamental rights—such as in the case of women’s reproductive freedom and health. Forcing individuals into accepting these risks is immoral.
  • Finally, condoning the “consent or pay” model for behavioural profiling would not only violate individuals’ fundamental rights, but would expose legitimate advertisers to unfair competition. Consent or pay would allow ad providers and online platforms to force-feed advertising to their users and siphon advertising revenues from other legitimate and law-abiding businesses that respect individuals’ dignity and autonomy.
  • It follows that the ICO’s proposed regulatory approach fails to recognise the inherently abusive nature of “consent or pay” in the context of behavioural profiling and personalised advertising. The ICO should not condone such practices, and should instead enforce against unfair, deceptive, predatory or otherwise illegal adtech practices.

The problem with ‘Consent or Pay’

We substantiate our reasoning in the paragraphs below:

The UK GDPR already allows interferences with one’s right to data protection in order to protect their vital interests, in the pursuit of a legitimate aim enshrined in a legal obligation or a public task, or where the existence of a legitimate interest that overrides the interests of the individual can be demonstrated.

Each of the legal bases discussed above provides conditions to ensure that data processing does not unduly interfere with one’s fundamental rights. It follows that processing that cannot be justified under these conditions is unnecessary and unjustified for achieving any of these aims, be it the protection of vital interests, a public task, a legal obligation, or legitimate interest (henceforth a “qualified legitimate aim”).

The UK GDPR also allows interferences with one’s right to privacy or data protection that are not necessary to protect a vital interest or to pursue a qualified legitimate aim. This is the case for data processing based upon contractual necessity or the consent of the individual. Absent a substantial justification authorising such interference with one’s fundamental rights, its legitimacy rests in the agency and free will of the individual: in other words, in their willingness to be subject to such interference. In the UK GDPR, this is reflected in the conditions that require the existence of a valid contractual obligation, or in the conditions that allow consent to be relied upon as a legal basis. In both cases, reliance on these legal bases is constrained to data processing that is necessary to achieve what the individual has consented to, or to perform the contract they willingly entered into. For the legal basis of consent, consent must also be freely given, specific, informed, and as easy to withdraw as it is to give. Individuals cannot be subject to negative consequences for withdrawing their consent, but they can be given incentives to provide it.

While it is true that the Court of Justice of the European Union has recognised that “consent or pay” may in principle be admissible to legitimise data processing, this is a statement of principle that reflects the horizontal applicability and the broad range of use cases the GDPR applies to. Principles must be translated into practice by checking their application against the conditions set forth by Article 6(1)(a) and Article 7, interpreted in line with Recital 42.

In Case C-252/21 Meta Platforms and Others (General terms of use of a social network), the Court of Justice has already ruled that data processing for behavioural profiling and personalised advertising does not constitute a legitimate interest, as it fails the balancing test against the interests of the individuals affected. It has also ruled that behavioural profiling and personalised advertising are not necessary for fulfilling a contract between an individual and an online service provider. It is also undisputed that behavioural profiling does not protect a vital interest, does not constitute a legal obligation, and is not necessary for the performance of a public task. Thus, behavioural profiling and personalised advertising constitute an interference with one’s right to data protection which is not justifiable as the pursuit of a qualified legitimate aim, nor is it an expression of one’s decision to enter into a contractual obligation.

Based on the above, a service provider that decides to fund itself via behavioural profiling and personalised advertising takes a deliberate business decision to interfere with one’s right to data protection in a manner that is justified by neither the pursuit of a legitimate objective nor contractual will: the online service provider could have chosen to fund itself by means that are respectful of and compatible with the rights and freedoms of the individual, such as contextual advertising, but deliberately chose not to do so. The deliberate business decision to override an individual’s fundamental right is inadmissible and illegal, unless it can be demonstrated that the individual consented to such interference, and that the conditions for consent to be valid are met. Accepting otherwise would mean accepting that individuals could legally be coerced into relinquishing their rights in the absence of a valid justification.

Article 7 of the UK GDPR provides the conditions upon which an interference with one’s right to data protection can be considered legitimate. However, the practical implementation of the “consent or pay” model, as adopted by Meta and other online service providers, obviously fails to meet these requirements: individuals who want to enjoy, and not relinquish, their right to data protection must face adverse financial consequences to do so, and are “unable to refuse or withdraw consent without detriment”.

The absence of free agency and free will in such an arrangement is not only revealed by its incompatibility with GDPR legal standards, but is embodied in “consent or pay” as a business model for personalised advertising. Despite claiming that personalised advertising is useful and benefits users, service providers are trying to force the provision of consent via “consent or pay” because they know their users will not consent to it, and thus they need to bypass individual agency and free will in order to make their business decision profitable. Indeed, individuals do not want to consent to online profiling and personalised advertising: they refuse to provide such consent where possible, and they boycott such attempts to interfere with their privacy when they cannot refuse.

It is also worth mentioning that individuals have very good reasons to want to avoid behavioural profiling and personalised advertising. Individuals’ behaviour inherently reflects their addictions or vulnerabilities (in other words, their compulsive behaviours), sensitive traits (such as the places where they worship or engage in political activities) and other very personal and sensitive choices (such as sexual health, medical conditions, or reproductive behaviours). Behavioural profiling exposes such information to commercial exploitation as well as to interference and abuse from the State: for instance, it allows gambling companies to prey on problem gamblers, or State authorities to persecute women for exercising reproductive rights such as the right to abortion. Agreeing to subject oneself to the risks posed by behavioural profiling is a highly sensitive and consequential choice: restricting an individual’s agency and trying to coerce them into making this choice is immoral even before any consideration of its legality.

Finally, it is worth emphasising that allowing behavioural advertising providers to serve ads to individuals against their will violates not only the rights of those individuals, but also the rights of legitimate advertisers and service providers who compete in the market and comply with the UK GDPR. By not honouring an individual’s choice not to be behaviourally profiled, adtech companies that illegally target users with personalised advertising are allowed to siphon advertising revenue from legitimate and law-abiding advertising channels such as contextual advertising.

Recommendations

On these grounds, the ICO’s regulatory approach to “consent or pay” risks legitimising reliance on “consent or pay” in an area where such reliance is an obvious attempt to violate human rights, coerce individuals and override their agency and autonomy. We therefore recommend the ICO:

  • To clarify that “consent or pay” cannot be relied upon to justify behavioural profiling or personalised advertising.
  • To enforce against unfair, deceptive, predatory or otherwise illegal adtech practices, with the aim of removing unsafe products from the market, protecting the public and their right to privacy and data protection, and restoring fair competition in the online advertising space.

Open Rights Group remains available for further comments or clarifications at the contact below.

Mariano delli Santi, Legal and Policy Officer: mariano@openrightsgroup.org

ORG’s Submission to the House of Commons Public Bill Committee: Investigatory Powers (Amendment) Act 2024
https://www.openrightsgroup.org/publications/orgs-submission-to-the-house-of-commons-public-bill-committee-investigatory-powers-amendment-act-2024/ (30 April 2024)

ORG’s Submission to the House of Commons Public Bill Committee


1. Executive Summary

· Open Rights Group, the UK’s largest grassroots digital rights organisation, is submitting evidence about its human rights concerns relating to the IPA Bill.

· The requirement that operators must notify the Secretary of State before making security changes to their products raises serious privacy, security and free expression concerns.

· This new requirement would allow the UK to prevent secure services from launching in the country without clear independent oversight measures in place.

· Provisions expanding access to Internet Connection Records are not compatible with Article 8 of the European Convention on Human Rights (ECHR).

· ORG recommends the following:

◦ Remove provisions that allow the government to prevent companies from adding or improving security features;

◦ Notify those who are involved in data surveillance once there is no longer a need for operational secrecy; and

◦ Ensure any new powers are overseen by an independent, impartial body to ensure proportionality and necessity of decision-making.

2. Introduction

Open Rights Group (ORG) is the UK’s largest digital rights campaigning organisation, working to protect people’s right to privacy and free speech online. We have over 40,000 supporters across the UK and active member chapters in ten cities. Our work includes policy research and analysis, legal challenges, and public campaigning, all in the defence and promotion of digital rights. 

ORG has been following the UK government’s proposed changes to the Investigatory Powers Act since their inception. In July 2023, we wrote a thorough response to the Home Office’s consultation on the amendments. We also made a submission of evidence to the Joint Committee on Human Rights. We are submitting evidence to the Public Bill Committee because we continue to have serious concerns about the impact of the proposals on privacy, security, and free expression.

3. Evidence

A. User security and rights can be weakened in the name of disproportionate surveillance

The introduction of a notification requirement for operators to inform the Secretary of State if they propose to make changes to their products or services that would negatively impact existing lawful access capabilities raises serious privacy, security, and free expression concerns.

This notification requirement appears designed to impose a “freeze” on changes to the service while consultations are taking place. The intention appears to be to stop user security improvements from being rolled out.

While this objective may appear reasonable, it would allow the Secretary of State to prevent secure services from launching in the UK, even where they are deployed elsewhere. This provision would allow the Secretary of State and the Home Office to place themselves in a position of power over the provider as soon as they hear about updates that might make data less accessible than it is currently. This would happen without reference to an independent authority to assess the rationale or proportionality of the decision. Such a move might not be proportionate, for instance, if the security technology had already been introduced safely and with demonstrable benefits to users in other parts of the world.

Open Rights Group is concerned these powers could deny people access to technological developments upon which their rights to free expression and privacy rely. For example, major tech providers such as Apple have stated that they would pull certain services from the UK rather than compromise their security if this power were used to prevent them from rolling out security updates. [1]

Our main concerns with the proposed revisions relate to weakened privacy and expanded government surveillance. Under the new proposals, the UK government could prevent a communications service provider from fixing software vulnerabilities through essential security updates [2] or from applying advanced protections such as end-to-end encryption to their services at a global level. Requiring prior approval before rolling out a security patch is not a proportionate response.

B. Secure communication methods are being undermined

ORG is also concerned that the changes are intended to reduce the possibility of introducing encryption to protect user data from unwanted access.

The Home Office appears to regard encryption, especially “end-to-end encryption” (E2EE), where a service provider is unable to see the contents of communications they facilitate, as a threat to its capabilities and, by extension, to national security. It appears to be seeking to extend its powers to prevent E2EE from being used at scale, despite its benefits to users and vendors.
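To make concrete what is at stake, here is a toy sketch of E2EE using the widely available Python cryptography library (our illustration, not any provider’s actual protocol: real messengers add identity verification, forward-secrecy ratchets and much more). The point it demonstrates is structural: the provider relays only public keys and ciphertext, and never holds the material needed to read the message.

```python
# Toy end-to-end encryption sketch; assumes the "cryptography" package.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a key pair on their own device.
# Private keys never leave the device.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Only public keys cross the provider's servers; each side derives
# the same shared secret via Diffie-Hellman key agreement.
alice_shared = alice_private.exchange(bob_private.public_key())
bob_shared = bob_private.exchange(alice_private.public_key())
assert alice_shared == bob_shared

# Turn the shared secret into a symmetric message key.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"e2ee-demo").derive(alice_shared)

# Alice encrypts; the provider relays only (nonce, ciphertext),
# which it cannot read because it never had the key.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"meet at noon", None)

# Bob decrypts on his device with the same derived key.
assert AESGCM(key).decrypt(nonce, ciphertext, None) == b"meet at noon"
```

Because the private keys never leave the users’ devices, any guaranteed “lawful access” to message content would mean changing this design for every user, not just for the targets of an investigation.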

E2EE is a significant protection for the right to privacy against everyday criminality, abuse and intrusion. Encrypted messaging apps are routinely used by politicians, doctors, business leaders, lawyers, and others who need to exchange large amounts of personal data securely while complying with UK data protection legislation, and protecting themselves from acts of cybercrime.

Journalists rely on E2EE and access to secure global technologies to communicate with their sources and exercise freedom of expression rights. E2EE protects vendors from being a vector for potentially massive data loss.

British businesses rely on E2EE to protect against acts of corporate espionage, and to provide secure communication methods for conducting international business in a secure and private environment. Preventing tech companies from rolling out security improvements or updates without prior authorisation could put British companies and industry at a disadvantage compared to countries that allow the free use of communication technology. This measure would inhibit rather than promote ambitions for the UK to become a world leader in the tech industry.

In addition, the proposed measures in the Investigatory Powers (Amendment) Bill are poised to profoundly impact political dissidents and opposition figures residing in the UK. Refugees, political exiles, and human rights advocates who have sought refuge within the UK deserve the assurance of digital safety and security.

LGBTQ+ individuals from refugee and migrant backgrounds who have fled to the UK heavily rely on digital tools to maintain connections with their families, friends, and social networks. These individuals will face heightened vulnerabilities to hacking and privacy breaches if the proposed changes are implemented.

Undermining security updates and patches is especially concerning for exiled activists who have been compelled to leave their home countries and now reside in the UK. These individuals may become susceptible to digital transnational repression attacks from their authoritarian regimes. Such attacks, coupled with a sense of deprivation of digital safety and security, will inevitably have severe consequences for their freedom of expression, potentially resulting in the silencing of their voices.

The exiled diaspora from countries such as Iran, Saudi Arabia, and Hong Kong has historically faced harassment and digital threats from their authoritarian regimes, even beyond their national borders. The proposed measures in the Bill could significantly exacerbate this situation, providing authoritarian governments with unprecedented opportunities to control, silence, and punish dissent across borders. [3]

Forensic Architecture, a London-based research agency, has documented 326 incidents of digital transnational repression between 2019 and 2021. [4] This number is likely to rise, especially where security updates for technology are undermined. Numerous refugees and diaspora from Hong Kong, along with prominent activists, have expressed feelings of insecurity following online threats and harassment from their government while in the UK. [5]

Additionally, three UK-based civil society leaders and human rights activists have reported their mobile devices being infiltrated with spyware by their regimes. [6] These examples underscore the urgent need to reconsider the potential ramifications of the proposed amendments and their impact on the safety and security of those who seek refuge and advocate for justice within the UK.

Recently, in the landmark ruling Podchasov v. Russia, the European Court of Human Rights (ECtHR) ruled that the weakening of encryption “can lead to general and indiscriminate surveillance of the communications of all users and violates the human right to privacy.” [7] The ruling stemmed from an incident in 2017, when the Russian government required “internet communication” providers to store all communication data and content for specific durations and to supply law enforcement authorities with users’ data and communication content, as well as all relevant decryption tools. Telegram opposed the order, and as a result, Russian courts fined the company and ruled that the app should be blocked within the country. A Russian citizen then brought the issue to the ECtHR, arguing that the forced decryption of user communications would infringe on the right to private life under Article 8 ECHR. The court agreed, finding that encryption is important for protecting the right to private life and other fundamental rights, like freedom of expression. The judgment also noted that encryption acts as a shield against abuse and that there are alternative methods to forced decryption.

While the government may have particular reasons to seek access to data and systems in certain limited circumstances, it should neither assume that all data should be easily accessible nor seek legal regimes to ensure that data is kept easily accessible.

We reiterate that encryption does not prevent lawful access per se. It may require law enforcement to access a device covertly or to seize it and demand passwords; however, these approaches are likely to be more proportionate than simply preventing security measures from evolving for the population at large. 

C. Impacts of the bill on journalists, journalistic sources, and journalistic material

Information Security is essential for journalists who need to protect their work and sources. [8] Any attempts to delay or halt security updates or improvements to software could have an adverse impact on journalists by making them more susceptible to attacks from hostile actors or foreign states, as we saw occur with the Pegasus scandal. [9]

This could have a particular impact on journalists in the UK, who could find their communications with sources restricted if tech providers were prevented from deploying software already adopted by journalistic colleagues and sources in other countries.

D. Data surveillance is widely expanded without proper safeguards

Provisions expanding access to Internet Connection Records are not compatible with Article 8 of the ECHR. Our own legal challenge at the Court of Justice of the European Union (CJEU), as a party to a case brought by former MP Tom Watson, [10] showed that the court understood the sensitivity of this data. We are particularly concerned that Clause 14 expands the use of Internet Connection Records, essentially for pre-crime target detection, or “network analysis”. There is considerable scope for fishing expeditions, targeting of people for their associations, and other practices which are neither wise nor proportionate.

The appropriate safeguard with most data surveillance is to notify those involved where it is safe to do so. The UK government was asked for this change in the Watson case, but it was not implemented. Such a change would allow people who had been surveilled to challenge the abuse of their privacy. Without the knowledge that surveillance has taken place, anticipating the need to challenge is extremely hard. Notification is an evolving requirement but an obvious one where capabilities inevitably grow with technology.

4. Recommendations

· Remove provisions that allow the government to prevent companies from adding or improving security features. Instead of jeopardising the security, free expression and privacy of the population at large, the Home Office should seek to use more proportionate investigative methods that do not infringe upon people’s human rights.

· Notify those who are involved in data surveillance once there is no longer an operational need for secrecy. This safeguard would be the strongest and most effective option to ensure data surveillance is compatible with Article 8 ECHR.

· Ensure any new powers are overseen by an independent, impartial body to ensure the proportionality and necessity of decision-making.

March 2024


[1] Apple slams UK surveillance-bill proposals https://www.bbc.co.uk/news/technology-66256081

[2] Changes to UK Surveillance Regime May Violate International Law https://www.justsecurity.org/87615/changes-to-uk-surveillance-regime-may-violate-international-law/

[3] The Digital Transnational Repression Toolkit, and Its Silencing Effects – https://freedomhouse.org/report/special-report/2020/digital-transnational-repression-toolkit-and-its-silencing-effects

[4] Digital repression across borders is on the rise – https://www.technologyreview.com/2022/07/08/1055582/digital-repression-across-borders-is-on-the-rise/

[5] ‘We don’t feel safe here’: Hongkongers in UK fear long reach of Chinese government – https://www.theguardian.com/global-development/2023/oct/17/we-dont-feel-safe-here-hongkongers-in-uk-fear-long-reach-of-chinese-government

[6] Bindmans launches legal action in the United Kingdom on misuse of Pegasus spyware – https://www.bindmans.com/knowledge-hub/news/bindmans-launches-legal-action-in-the-united-kingdom-on-misuse-of-pegasus-spyware/

[7] European Court of Human Rights Confirms: Weakening Encryption Violates Fundamental Rights – https://www.eff.org/deeplinks/2024/03/european-court-human-rights-confirms-undermining-encryption-violates-fundamental

[8] Why journalism needs information security https://reutersinstitute.politics.ox.ac.uk/calendar/why-journalism-needs-information-security

[9] Pegasus scandal: Are we all becoming unknowing spies? https://www.bbc.co.uk/news/technology-57910355

[10] Tele2 Sverige AB v Post- och telestyrelsen and Secretary of State for the Home Department v Tom Watson and Others. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:62015CJ0203

Europe warns of threat to adequacy agreement
https://www.openrightsgroup.org/press-releases/europe-warns-of-threat-to-adequacy-agreement/ (25 April 2024)

LIBE committee raises concerns about the Data Protection and Digital Information Bill to the UK government and European Commission.

The Committee on Civil Liberties, Justice and Home Affairs (LIBE Committee) has written to the Chair of the European Affairs Committee in the House of Lords to warn that “UK divergence from EU data standards, [is] putting the validity of the adequacy findings into question”. The Committee notes that it has been closely following the progress of data reform in the UK, as both adequacy agreements with the EU “contain sunset clauses, and thus will automatically expire four years after their entry into force”.

The key areas of concern noted by the LIBE Committee are:

  • Change to the definition of personal data
    The Committee notes that: “the Bill modifies the concept of “personal data” that is at the heart of the EU data protection regime”.
  • Threats to the independence of the Information Commissioner’s Office
    The letter notes that, “the Bill appears to further undermine, not merely the effectiveness, but beyond that the independence of the ICO”.
  • Data transfers
    The committee is “strongly concerned” that the Bill would lead to the “bypassing of EU rules on international transfers to countries or international organisations not deemed adequate under EU law”.

The letter also reminds the UK parliament of its human rights obligations and notes that, “any possible changes to the UK’s Human Rights Act or the UK’s departure from ECHR jurisprudence would, in the opinion of the LIBE Committee, have a negative impact on the UK adequacy”.

Mariano delli Santi, Legal and Policy Officer for Open Rights Group, said:

“Losing the adequacy agreement would be a disaster for the UK economy. As the Bill enters its final stages, the Government needs to act urgently to protect our data protection rights.

The concerns of the LIBE committee highlight how the data rights of people in the UK will be reduced compared to those of people living in Europe. This should not be acceptable to our parliamentarians.”

Cost of breaching the adequacy agreement

After Brexit, the European Commission adopted two adequacy decisions, which allowed for the free flow of personal data between the UK and the EU without requiring additional safeguards. Losing this status would add significant red tape for UK businesses, disrupt trade, and risk undermining important cooperation initiatives between the EU and the UK, such as data sharing for research (Horizon), law enforcement (Prüm) or immigration control purposes (Frontex). The cost to UK businesses has been estimated at £1-1.6 billion in legal fees alone.

Warning to the European Commission

The LIBE Committee previously wrote to the European Commissioner, Didier Reynders, in March, noting that the DPDI Bill still had “many controversial provisions unchanged (such as lack of independence of supervisory authority or the removal of the Biometrics and Surveillance Camera Commissioner, which oversees the sharing of DNA and fingerprint data under Prüm cooperation with the EU)”. Another MEP tabled a written question only some weeks before, asking if the Commission intends to revoke the UK adequacy decision. The European Commission has already confirmed, in a response to a written question and to the LIBE committee last year, that some changes in the DPDI Bill would raise concerns.

Civil society concerns

In July 2023, 28 civil society organisations wrote to the European Commission to raise concerns that the UK’s data protection reform would undermine European citizens’ data protection rights, thereby threatening the adequacy agreement.

Why Migrants Need Digital Sanctuary
https://www.openrightsgroup.org/blog/why-migrants-need-digital-sanctuary/ (23 April 2024)

When individuals migrate, their data migrates with them. When people leave their countries to travel and live in different places, whether as migrants, refugees, or asylum seekers, they are not only seeking physical safety: they also need to be sure that their digital identity and information will be safe.

However, if they are not careful or protected, their data could leave a trail of their movements, potentially exposing them to various threats. Whether they are fleeing war, authoritarian regimes, or other adversaries, this data could inadvertently connect their identities to their pursuers.

Migrants should enjoy the same human rights as everyone else, including digital rights protected by similar principles. This is what we call Digital Sanctuary.

What do migrants think?

Last year, Positive Action in Housing (PAIH) and Open Rights Group collaborated to assess the fears, needs and knowledge of migrants, refugees and asylum seekers regarding their digital rights through a survey within the PAIH community.

The survey revealed that 66% of the sample group expressed concerns that the UK government would share their data with third parties. Meanwhile, 68% were unaware of the Data Protection and Digital Information Bill, currently in Parliament, which will impact their data rights.

The survey numbers reflect the distrust between migrants and the government, particularly over how the government handles their data. They raise questions about why migrants’ information is being shared and whether this is done properly and securely.

“[Sharing data online] is not safe for us, because we are not safe from where we came from.”

Respondent to the Migrant Digital Rights Survey

The survey also revealed a gap in migrants’ information and knowledge about keeping their data safe, such as choosing strong passwords to protect their online accounts, or using secure browsers while browsing the Internet.

Migrants, refugees and asylum seekers who participated in this survey explicitly expressed their concerns regarding data sharing without consent. They emphasised the need for strong safeguards for their personal information.

Many respondents highlighted their lack of control or understanding about how their data is used or shared. This lack of transparency has led to apprehension, fear of misuse, and distrust. Some participants were also aware of the risk of digital identity theft. Additionally, they expressed concern that sharing their information without consent or proper safeguarding could compromise their safety by disclosing their whereabouts.

“Asylum Seekers must lay themselves bare, providing data that indirectly strip them off any right to privacy, just for them to stand the slightest chance of seeming credible.”

Respondent to the Migrant Digital Rights Survey

Migrants are often deprived of rights

While the principles of Digital Sanctuary seem straightforward and direct, many, including migrants themselves, may not be aware that they do not have the same rights protecting their data as others.

In theory, migrants have the same rights as everyone else in the UK. In practice, however, they are often deprived of these rights, especially undocumented individuals or people in immigration detention. For example, the UK subjected asylum seekers to GPS ankle tagging as a condition of release from detention centres, a policy which was found to be unlawful.

Despite this, the Home Office still subjects asylum seekers to these practices.

Another example of migrants not enjoying the same digital rights as others is the barriers they face in accessing online services due to digital or language barriers and the lack of required documentation online.

Many migrants and asylum seekers cannot fully exercise their rights, as it depends on their individual circumstances, skills, and legal status.

Migrant Digital Justice newsletter

Keep up to date with the latest developments affecting the digital rights of migrants, refugees and asylum seekers.

The DPDI Bill Threatens the Integrity of the UK General Election
https://www.openrightsgroup.org/blog/dpdi-bill-threatens-integrity-of-general-election/ (22 April 2024)

In our latest briefing, Open Rights Group raises the alarm over changes in the Data Protection and Digital Information Bill (DPDI Bill) that could open the floodgates to the abuse of data analytics and new technologies for electoral manipulation.

Information is power, and new technologies amplify governments’ and corporations’ ability to collect, use and weaponise personal data. Thus, data protection is an important first line of defence against practices that could mislead voters, manipulate public opinion, and threaten the integrity of the electoral process. However, the DPDI Bill would change UK data protection laws, making it easier for the government, private companies and political parties to spy on you and to collect and use information about you, but more difficult to investigate, unveil and hold to account those who may try to weaponise data analytics for political purposes.

The Bill has the potential to undermine legal standards that protect all of us from the ever-growing threats to the integrity of elections arising from technological developments — including the use of data analytics and micro-targeting. With a general election on the horizon, this poses an immediate risk. We are calling on the House of Lords to oppose dangerous proposals in the DPDI Bill that would:

  • Reduce transparency over how personal data is used
  • Multiply avenues and caveats political parties can rely on to profile individuals against their will
  • Make it more difficult to scrutinise data uses and hold law-breakers to account
  • Reduce access to redress and the independence of the Information Commissioner’s Office (ICO)
  • Increase risks that political parties engage in novel and exploitative methods to obtain your personal data.

Why data protection matters in politics

Modern technologies allow the collection of vast amounts of personal data that can be used to guess or infer individuals’ personal opinions. Since 2016, the use of data analytics for electoral campaigning has steadily increased. For instance, in our 2020 “Who do they think we are” report, we found that political parties were combining commercial data and information bought from data brokers with electoral rolls to build detailed profiles of individuals and target them with “tailored” political messaging, also known as “micro-targeting”.

Our report unveiled that these profiles are usually wrong and factually inaccurate, but this does not make the practice any less dangerous: in the Cambridge Analytica scandal, political actors exploited this system to target individuals with different electoral messages, raising concerns over the ability of such systems to manipulate public opinion. The investigation that followed this scandal unveiled that political parties in the UK had been using electoral register data to draw detailed profiles of UK voters’ lifestyle, ethnicity and age, raising several concerns around the legality, transparency, fairness and accuracy of these depictions.

More recently, it has been suggested that in the Rochdale by-election, George Galloway MP could have targeted Muslim voters with different canvassing information: something that, if confirmed, would have given his constituents only a partial view of the topics and issues he campaigned on. It has also been revealed that Chinese-backed actors breached the UK electoral register, raising questions about foreign interference in the upcoming UK electoral process. In the US, this kind of political targeting has been used to target Black voters with robocalls providing false information about how to register to vote, or about the voting process itself, such as suggesting the wrong date.

Micro-targeting threatens the integrity of the electoral process in at least two ways. On the one hand, it enables political candidates to be two-faced and make different electoral promises to different political audiences, in a way that lacks transparency and leaves the public unaware of this double game. On the other hand, it allows bad-faith actors to target constituents with wrong information, which can trick voters into making mistakes during the electoral process, such as missing the registration deadline or showing up at the polling station on the wrong day.

The DPDI Bill: wrong answers only

The UK Government are proposing changes to UK data protection laws that would lower legal standards, weaken data protection rights, reduce accountability, and undermine the independence of the ICO, the UK data protection authority that oversees, among other things, the use of personal data by political parties. These proposals have made a lot of people very angry and have been widely regarded as a bad move: as our briefing shows, this is no less true in the context of data analytics for political purposes.

Indeed, the Government is proposing to lower accountability standards and record-keeping requirements, as well as to make it more difficult for UK voters to exercise their right to access their personal data held by political parties and any other organisation. None of these changes will help provide a defence against the growing weaponisation of data analytics for political purposes. Records and other accountability documentation have proven to be an invaluable tool for journalists and public interest organisations who investigate and unveil malpractice. Likewise, the Cambridge Analytica investigations, or our own report on political profiling, would never have been possible without the right to access personal data, which enabled people to ask political parties or consultancy firms what data they held about UK voters and for what purposes.

Furthermore, the DPDI Bill would also introduce a very wide, vague and unconditional basis for data processing for “democratic engagement” purposes, and allow online tracking and other high-risk profiling of our browsing habits without our prior consent. It would also give Ministers the power to appoint loyalists of their choice to the board of the ICO, or to dissuade it from investigating matters that the Government does not want to be investigated.

We need another Bill

The UK data protection reform is currently being scrutinised by the Lords, whose Constitution Committee has raised an eyebrow, among other things, at the new “democratic engagement” legal basis. At the same time, Open Rights Group has supported peers in tabling amendments that would foster the independence of the ICO, and several other civil society players have helped in drafting amendments that would restore high legal standards, protect the right to access personal data, and retain effective accountability requirements.

All this underscores a deeper issue: the DPDI Bill is a collection of bad proposals the Government formulated after a lopsided consultation process, a stubborn determination to ignore the opinions of independent experts and public interest organisations and, lastly, the decision to table 150 pages of amendments a few days before their discussion in the House of Commons. In turn, Members of Parliament have been denied a fair chance to scrutinise Government proposals, and the Lords are now left with the unpleasant job of cleaning up this mess.

Extremism Redefined: Caught in a Mouth Trap
https://www.openrightsgroup.org/blog/extremism-redefined-caught-in-a-mouth-trap/ (17 April 2024)

Last month, the Secretary of State for Levelling Up, Housing and Communities, Michael Gove MP, announced a new and expanded definition of extremism as part of the Government’s Counter Terrorism Strategy.

The announcement came after months of speculative media reports about government plans to tackle extremism and, as Gove told parliament, “grave concerns that the conflict in the Middle East is driving further polarisation”. Over the last six months, there has been a deeply worrying increase in antisemitic and Islamophobic attacks in the UK, but there are also fears that this new definition of extremism, like the previous one, is vague and broad, raising concerns about the potential abuse of power and infringement of civil liberties.

The new definition of extremism has been introduced against a backdrop of widespread protests in solidarity with people in Palestine and public anger at the government’s complicity with Israel’s actions in Gaza, which to date have killed over 33,000 civilians, including thousands of children. Despite the fact that these protests have been largely peaceful with few arrests, they have been described as “hate marches” by the former Home Secretary Suella Braverman MP.

What’s changed?

Extremism is now defined as: “the promotion or advancement of an ideology based on violence, hatred or intolerance, that aims to:

  1. Negate or destroy the fundamental rights and freedoms of others; or
  2. Undermine, overturn or replace the UK’s system of liberal parliamentary democracy and democratic rights; or
  3. Intentionally create a permissive environment for others to achieve the results in (1) or (2).”

While to some this may seem like a reasonable measure to protect against threats to democracy, the imprecise language leaves too much room for interpretation and potential misuse. The third limb, aiming to “intentionally create a permissive environment for others”, is especially subjective and problematic: merely stating that legitimate grievances or drivers of extremism need to be tackled could be interpreted as falling within this definition.

It is crucial that we ask who gets to define extremism, and what this means for human rights and free expression.

The definition also reinforces the age-old idea of a liberal nation state, which must be securitised against a threat. This threat is constructed as foreign and antithetical to “the UK’s system of liberal parliamentary democracy.” Muslims have long been scapegoated as this outside threat to the nation state.

The scope of the guidance

The new guidance is non-statutory, meaning it will not be enshrined in law and will only affect parliamentarians and civil servants, who will no longer be allowed to engage with groups that supposedly meet the new definition. The government’s independent reviewer of state threat legislation, Jonathan Hall KC, has warned that this defines people as extremists by “ministerial decree”.

It is important to ask why the government has renewed its focus on extremism since 7 October without proposing legislative changes, thereby denying parliamentary scrutiny. This may be because a previous attempt by the Cameron government to redefine extremism failed to find a “legally robust” definition. However, regardless of how the new definition of extremism is applied, there will no doubt be a wider chilling effect on free speech.

What does the new definition of extremism mean for organisations that fall under its scope?

Gove went on to name groups which will likely fall under the new definition of extremism for holding ‘neo-nazi ideology’ or being of ‘Islamist orientation’. The impact of this public censure is that they will be banned from meeting with parliamentarians, from receiving government funding, and from having people appointed to government boards.

It is unclear what evidence is being used to determine whether a group is “extremist”, and there will be no appeals process if a group is labelled as such. Instead, groups will have to challenge such decisions in the courts; two of the named groups have already challenged Gove to repeat the allegations without parliamentary privilege so that they can take legal action. The lack of a right of reply for groups that fall under the new definition shows the government is using it for political point scoring and playing to the populist gallery, rather than making any meaningful attempt to protect public safety.

Extremism in the context of Prevent

A key concern is that while the earlier definition of extremism focussed on action, the new definition focusses on ideology. As the previous definition was already used to surveil thousands of people, the new and expanded definition may sweep even more people into its scope. First introduced in 2005, Prevent is the Government’s flagship counter-terrorism programme, with a stated aim “to identify and intervene with people at risk of committing terrorist acts.” The Prevent Duty imposes requirements on public bodies, including schools and healthcare providers, to refer individuals whom they suspect are “susceptible to radicalisation”. Rather than functioning as a safeguarding tool, Prevent has facilitated mass data collection and the wholesale surveillance of those referred to it.

A wide cross-section of civil society organisations have raised concerns over the human and digital rights implications of Prevent. In Open Rights Group’s report, ‘Prevent and the Pre-Crime State: How unaccountable data practices are harming a generation’, we found that the data of people who are referred to Prevent is being widely shared and retained for years, even when referrals are marked ‘no further action’. Other civil society organisations, such as Prevent Watch and Amnesty UK, have highlighted how Prevent disproportionately harms marginalised communities, including neurodiverse people and Muslims.

With the Government’s renewed focus on so-called “extremist beliefs,” we must be wary of how many more people may face surveillance under Prevent, despite posing no terrorist risk. Liberty has warned that “It is unclear at this stage how this new definition will be adapted and applied in the context of identifying individuals for Prevent; however… it will create confusion among public bodies who may use the new and broader definition.”

It has been reported that schools and other educational institutions are discouraging free expression for fear of falling foul of the new guidance. What’s more, some people may be inclined to self-censor the expression of their beliefs to avoid the risk of state intervention. Just last week the National Education Union (NEU) reported that the number of Prevent referrals for pupils “showing solidarity with Palestine” has risen. This again raises the question of who gets to define extremism. Does it make one an extremist to express solidarity with people who are being killed and to call for an end to the humanitarian catastrophe in Gaza? In making it difficult for people to speak freely about the shocking events and images reaching us, government policy and practice are suppressing legitimate dissent.

Sowing division at a time of crisis

Ultimately, the Government’s expanded definition of extremism must be understood as part of a wider move to narrow the parameters of political acceptability. In recent years, we have seen new restrictions on the right to protest, sweeping new powers for the police, and new censorship of online expression. It is necessary to ask: whose free speech is protected, and who is obliged to keep quiet under the threat of being labelled an extremist? In exploring this question, we can see the government’s new and expanded definition of extremism in a wider context of successive lurches towards authoritarianism. Paradoxically, in its supposed attempt to bring about social cohesion, the Government is weaponising the new definition for political point scoring, sowing division at a time of crisis.

Briefing: The impact of the DPDI Bill on data use for political purposes https://www.openrightsgroup.org/publications/briefing-the-impact-of-the-dpdi-bill-on-data-use-for-political-purposes/ Thu, 11 Apr 2024 14:06:53 +0000 https://www.openrightsgroup.org/__preview__/publication/20852

In this briefing, we explain how the proposed changes to the UK data protection framework that would be introduced by the Data Protection and Digital Information Bill would weaken legal safeguards around the use of personal data for political purposes, in particular by:

  • Hindering individuals’ right to access their personal data, thus reducing transparency and scrutiny over how political parties use personal data;
  • Reducing legal safeguards around online tracking and profiling, thus making it easier to use personal data for political purposes against voters’ consent or legitimate expectations;
  • Watering down accountability requirements, thus making it more difficult for journalists, civil society and regulators to scrutinise political parties’ uses of personal data;
  • Undermining the right to lodge a complaint and independent supervision, thus making it more difficult for individuals to react to an infringement of their rights, and for the ICO to investigate without interferences from the Government.

READ OUR BRIEFING

The impact of the UK data protection reform on the misuse of data for political purposes.

Download now

WHY DATA PROTECTION MATTERS

The use of personal data and data analytics for electoral campaigning has risen to prominence in recent years. Modern technologies allow the collection of vast amounts of personal data, which can be used to guess or infer individuals’ personal opinions. These systems, also known as surveillance advertising, were exploited by Cambridge Analytica to target individuals with different electoral messages, raising concerns over the ability of such systems to manipulate public opinion and affect the integrity of the electoral process.

Open Rights Group have long investigated the risks of data collection and political profiling. With the “Who do they think you are” project,1 we relied upon the rights provided by the UK GDPR to shed light on what information political parties store and use to target people in the UK during election campaigns. The findings of our report were damning: political profiling routinely leads to targeting based on wrong information or inferences, in ways people are largely uncomfortable with.2

Both the Cambridge Analytica scandal and ORG’s “Who do they think you are” project reveal a state of play that requires regulatory intervention, and a change of paradigm in how political parties use personal data about UK voters. However, the UK Government are proposing changes to the UK data protection framework that would reduce legal safeguards and entrench the current state of affairs. The Bill has undergone committee stage at the House of Commons and is waiting to be rescheduled for report stage. The following sections are based on Open Rights Group’s full analysis of this Bill.3

THE RIGHT OF ACCESS

Under the UK GDPR, individuals enjoy the right to obtain a copy of the personal data an organisation holds about them, as well as other information such as the reasons the organisation stores this data and who it received the data from. This is known as the right of access.

The right of access played a significant role in the year-long investigation that revealed how the data analytics firm Cambridge Analytica used data harvested from 87 million Facebook users without their consent. Likewise, the right of access allowed members of the public in the UK to obtain a copy of the profiles political parties had gathered about them during ORG’s “Who do they think you are” campaign.

HOW THE BILL HINDERS THE RIGHT OF ACCESS

Clause 9 of the Data Protection and Digital Information Bill would lower the threshold that allows organisations to refuse to act upon a data rights request from “manifestly unfounded” to “vexatious”. This could:

  • Permit organisations to intimidate individuals by inquiring or making assumptions about the reasons for their request. “Vexatious” has been interpreted as requiring a request to have a “reasonable foundation” or “value to the requester”.4 Further, organisations could refuse to act upon requests that “are intended to cause distress” or “are not made in good faith”. Organisations could frustrate the exercise of the right of access by engaging in lengthy correspondence, or by making unreasonable assumptions about the intentions behind a request.
  • Exacerbate a sense of powerlessness amongst individuals and hinder their ability to exercise their rights. The Bill provides a non-exhaustive list of circumstances to determine whether a request is vexatious, including “the resources available to the controller” and “the extent to which the request repeats a previous request”. However, a lack of resources or organisational preparedness to deal with a request does not indicate inappropriate use of data protection rights. Likewise, individuals may legitimately repeat a request, for instance to react to a further violation of their rights or to compare the two responses. Yet an organisation could use these grounds as a loophole to refuse a request to its advantage.

ONLINE TRACKING AND PROFILING

Profiling consists of the collection and analysis of personal data for the purpose of classifying a given individual into a certain category or group. Profiling is also at the core of “micro-targeting”, the practice of tailoring messages and political adverts to an individual’s characteristics, such as political opinions, demographics and other personal beliefs, that was exploited by Cambridge Analytica.

Under existing rules, individuals have the right not to be tracked or profiled with the use of cookies or other similar technologies unless they provide their free and informed consent. Further, the Information Commissioner’s Office has issued regulatory guidance that clarifies the interpretation of existing data protection rules around the use of personal data for political profiling.5 In summary, a consent-based opt-in model appears to be the only viable option for carrying out political profiling legally. In detail:

  • Political parties should obtain valid consent before engaging in profiling that involves the use or inference of special category data, such as political opinions, religious beliefs or ethnic origin.
  • Profiling that does not involve the use of special category data would still need to be based on consent, unless political parties can demonstrate that the impact of such processing does not override the rights and freedoms of the individuals concerned. However, the ICO emphasises that “profiling is often invisible to individuals” and that people may not expect, understand, or trust that their data is used in such a manner. Thus, profiling will likely override the rights and freedoms of the individuals concerned.
  • The Data Protection Act 2018 provides the lawful basis of “public task – democratic engagement”, which legitimises the use of personal data for the exercise of a task “laid down in domestic law”. The ICO clarifies that such law exists for the use of electoral register data, and that campaigners have a responsibility to demonstrate the existence of a legal basis to use “non-electoral register” data. Further, this exemption cannot be relied upon “if you can reasonably achieve your purpose by some other less privacy intrusive means”, thus ruling out political profiling.

HOW THE BILL REDUCES SAFEGUARDS AROUND ONLINE TRACKING AND PROFILING

Clause 5 of the Data Protection and Digital Information Bill introduces a new power for the Secretary of State to designate “recognised legitimate interests”, namely data uses that are always considered legitimate. Schedule 1 of the DPDI Bill designates the recognised legitimate interest of “democratic engagement”.

Further, Clause 114 of the DPDI Bill would empower the Secretary of State to provide an exemption from direct marketing provisions where communications are linked to the purposes of democratic engagement, whose definition broadly mirrors that of Clause 5.

Finally, Clause 109 of the DPDI Bill would remove cookie consent requirements, thus switching to an opt-out model of online tracking and profiling, in place of the existing opt-in model.

These changes would significantly expand the avenues for using personal data for political purposes without the need to obtain the consent of the individuals affected, including in the context of profiling. Unlike the existing “public task – democratic engagement” exemption, a legitimate interest does not need to be enshrined in domestic law and could be relied upon well beyond the use of electoral register data. Further, the definition of “democratic engagement” provided by the Bill is extremely vague, and encompasses “the person’s or organisation’s democratic engagement activities” such as “assisting with a candidate’s campaign for election as an elected representative”.

It is also worth stressing that these regulation-making powers would be Henry VIII clauses: they would allow an Act of Parliament to be amended or repealed by secondary legislation. In practice, the use of these powers would lack meaningful democratic or Parliamentary scrutiny. Indeed:

  • Only 17 statutory instruments (SIs)—the most common form of secondary legislation—have been voted down in the last 65 years.
  • The House of Commons has not rejected an SI since 1979.
  • Not a single SI was defeated during the process of legislating for Brexit and Covid-19.

STOP THE DPDI BILL

Your data rights are under attack. Tell the government to get their hands off your data.

Email your MP

ACCOUNTABILITY

The UK GDPR establishes a duty for organisations, including political parties, to ensure and be able to demonstrate that they are using personal data in accordance with the law. This ensures that the principles and obligations enshrined in legislation apply in practice. Further, carrying out assessments helps organisations to anticipate and prevent harmful or discriminatory outcomes.

Notably, accountability requirements mandate the production of internal documentation, such as “Records of Processing Activities”, “Data Protection Impact Assessments” and “Legitimate Interest Assessments”. These are important documents that can be used by journalists, civil society and regulators to hold organisations to account, such as by investigating and revealing malpractice.

HOW THE BILL WATERS DOWN ACCOUNTABILITY REQUIREMENTS

The Data Protection and Digital Information Bill would reduce the availability of comprehensive records and other documentation concerning data uses, thus reducing transparency and scrutiny over how political parties use personal data. In detail:

  • Clauses 5 and 6 of the DPDI Bill would designate data uses and reuses that do not need a legitimate interest assessment or compatibility test in order to legitimise data processing. In turn, this would reduce the documentation that can shed light on political parties’ decisions to override the rights and freedoms of individuals when using their personal data.
  • Clause 18 of the DPDI Bill would remove the need to keep records unless the processing is likely to result in high risk. The clause also replaces the existing requirement for a comprehensive record of processing activities with less extensive “appropriate records”. This will lead to fewer and less comprehensive records of what political parties are doing with personal data.
  • Clause 20 of the DPDI Bill would remove the requirement to carry out Data Protection Impact Assessments and require an “Assessment of high-risk processing” instead. The new assessment would drop the need for a systematic description of the envisaged data uses and the need to consult those affected by high-risk data processing, and would remove existing prescriptive requirements as to when an assessment must be conducted.

OVERSIGHT AND REDRESS

The UK GDPR ensures that individuals have access to judicial and administrative remedies, including the right to lodge a complaint with the Information Commissioner’s Office when their rights are breached. Formal complaints are an essential avenue to enable data subjects to challenge violations of their rights, and hold organisations to account.

Further, the independence of the Commissioner from the Government and other political actors is pivotal to ensuring that investigations and complaints are handled with integrity and impartiality, all the more so in the context of electoral campaigning.

HOW THE BILL UNDERMINES OVERSIGHT AND REDRESS

The Data Protection and Digital Information Bill would introduce a new requirement for complainants to try to resolve their complaint with the organisation responsible for the breach before contacting the ICO. At the same time, the Bill would empower Ministers to interfere with the objective and impartial functioning of the ICO, such as by issuing instructions to the Commissioner. In detail:

  • Clause 32 of the DPDI Bill would insert new sections into Part 5 of the Data Protection Act 2018, empowering the Secretary of State to introduce a “Statement of Strategic Priorities” to which the Commissioner must have regard when discharging their functions. Further, the ICO would be obliged to publish a response explaining how they will have regard to this statement.
  • Clauses 44 and 45 of the DPDI Bill would introduce a requirement for the complainant to attempt to resolve their complaint directly with the relevant organisation before lodging a complaint with the ICO.

These changes would reduce individuals’ access to effective redress when their rights are infringed: the data rights agency AWO estimates that complaints could take 20 months or more to resolve under the newly proposed framework.6

Further, Ministerial powers to issue orders and set priorities for the Information Commissioner’s Office would inherently undermine its independence as a watchdog, and give the party in Government significant power to interfere with the objective functioning of the ICO.

CONCLUSION

Data-driven technologies have already revealed their potential to undermine confidence and trust in the democratic process. With the next UK general election scheduled to be held no later than 24 January 2025, the Data Protection and Digital Information Bill takes several steps in the wrong direction:

  • Individuals need more transparency and greater control over how their personal data is used, but the DPDI Bill would diminish existing rights;
  • Political profiling has become more sophisticated and invisible, but the DPDI Bill would multiply the avenues and caveats political parties can rely on to profile individuals against their will;
  • Accountability and public scrutiny are needed more than ever, but the DPDI Bill would make it more difficult to scrutinise data uses and hold law-breakers to account;
  • The right to an effective redress and the existence of an independent watchdog are cornerstones of a democratic society, but the DPDI Bill would reduce access to redress and the independence of the ICO.

1 See: https://www.openrightsgroup.org/campaign/who-do-they-think-you-are/

2 See: https://www.openrightsgroup.org/publications/who-do-they-think-we-are-report/

3 See: https://www.openrightsgroup.org/publications/analysis-the-uk-data-protection-and-digital-information-bill/

4 See: https://ico.org.uk/for-organisations/guidance-index/freedom-of-information-and-environmental-information-regulations/dealing-with-vexatious-requests-section-14/what-does-vexatious-mean/

5 See: https://ico.org.uk/for-organisations/guide-to-data-protection/key-dp-themes/guidance-for-the-use-of-personal-data-in-political-campaigning-1/profiling-in-political-campaigning/

6 See: https://www.awo.agency/files/Briefing-Paper-3-Impact-on-Data-Rights.pdf

Home Office CCTV: free mass surveillance? https://www.openrightsgroup.org/blog/home-office-surveillance-cctv/ Mon, 08 Apr 2024 11:11:02 +0000 https://www.openrightsgroup.org/__preview__/blog/20715

The Home Office has for several years run a programme to supply mosques, temples and synagogues with security equipment, including CCTV cameras. This has recently been extended, with more than £117m allocated for mosques, Muslim faith schools and community centres. This follows a £70m package for Jewish groups. The CCTV camera systems are installed through a single vendor chosen by the Home Office, Esotec Ltd.

The problem with cloud-based storage

While the details of the Esotec systems are not known, it is common for CCTV systems to include cloud-based storage of footage. The reasons are obvious, not least that the files are large, and that management can be more easily done by a third party, for instance deleting files after a fixed period.
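To make the trade-off concrete, the sketch below shows what such third-party “management” amounts to at its simplest: a scheduled job that deletes recordings past a cut-off. It is a minimal sketch only; the directory, filenames and retention period are all hypothetical, and we do not know how Esotec systems are actually configured.

```python
# A minimal sketch of a local retention policy: delete recordings older
# than a fixed period. Paths, filenames and the retention period are
# hypothetical; real CCTV/NVR systems will differ.
import time
from pathlib import Path

FOOTAGE_DIR = Path("/var/cctv/footage")  # hypothetical local storage location
RETENTION_DAYS = 31                      # hypothetical retention period

def purge_expired(footage_dir: Path, retention_days: int) -> int:
    """Delete footage files older than the retention period; return how many."""
    if not footage_dir.is_dir():
        return 0  # nothing to do if the storage directory is missing
    cutoff = time.time() - retention_days * 86_400  # retention window in seconds
    removed = 0
    for recording in footage_dir.glob("*.mp4"):
        if recording.stat().st_mtime < cutoff:
            recording.unlink()  # permanently remove the expired recording
            removed += 1
    return removed

if __name__ == "__main__":
    print(f"Deleted {purge_expired(FOOTAGE_DIR, RETENTION_DAYS)} expired recordings")
```

The point of the sketch is that nothing in it requires a cloud: run locally, for instance from a scheduled task, the system owner keeps sight and control of the footage and its deletion; run by a vendor’s cloud service, the same housekeeping, and the footage itself, sit on infrastructure the owner cannot inspect.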

Cloud-based storage opens up the potential for secretive surveillance powers to be used without the end user’s knowledge. In short, it creates the possibility for security services such as MI5 to bulk seize camera footage across all sites without the knowledge of the CCTV system owners.

There are safeguards provided in the Investigatory Powers Act 2016 (IPA): bulk supply of data to the security agencies would require a warrant, and the purposes would be limited, for instance, to matters of serious crime and national security. However, whether these safeguards would prevent the seizure, or the abuse of any data that is seized, is a matter for debate.

At this point, we want to draw attention to the possibility that cloud-stored CCTV data could be seized, and to ask for greater transparency over the operation of this Home Office scheme. In particular, potential participants in the scheme should consider whether they would want this possibility to exist in systems they may install, and whether they should ensure local rather than cloud storage of any footage collected.

These concerns are exacerbated by the acquisition of Esotec in 2021 by a US company, Johnson Controls Inc. Esotec’s privacy policy is now that of Johnson Controls and can be found here; it notes the possibility of data seizures under section 702 of the US Foreign Intelligence Surveillance Act (FISA). Among other things, we would want to know how data is retained and for how long, information which the policy explains is contained in “product specific” data sheets.

Legal powers for secret surveillance

When CCTV footage is stored and contains personal data, it must be stored securely and can only be used for specific purposes. These are normally defined in the UK GDPR and the Data Protection Act. Warrants and production orders for footage would be available to the police, subject to conditions.

Bulk dataset powers, however, are only available to the intelligence agencies, and not to the police. The arrangements and conditions are described in Part 6 of the IPA. Such warrants are subject to approval by judicial commissioners. Some bulk datasets can be acquired commercially or supplied voluntarily, but conditions nevertheless apply. (Some revisions are being considered in the Investigatory Powers Amendment Bill currently in the House of Lords.)

Obtaining this type of data without the knowledge of the owner or vendor would require an Equipment Interference warrant under Part 5 of the IPA or Part 6 Chapter 3. Here, too, double-lock approval by judicial commissioners is required.

The criteria for obtaining bulk datasets are the prevention or detection of serious crime, or the protection of the economic well-being of the UK in so far as these are relevant to the interests of national security.

How bulk CCTV footage may be used

Analysis of CCTV footage is becoming easier over time, thanks to pattern recognition and learning technologies, often called ‘Artificial Intelligence’. It is possible to try to recognise individuals from the way they walk, facial recognition, clothing, and so on. While these techniques have limitations, the limitations shrink as image definition increases. The likely purposes of analysing CCTV in this way include identifying which places persons of interest attend, and establishing social maps of contacts and other people who appear to be associated with a person of interest. Having large amounts of data available therefore increases the value of the material to the intelligence agencies.
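To give a concrete, deliberately crude sense of how little effort automated analysis takes, here is a minimal sketch that scans a recording for faces using OpenCV’s bundled Haar-cascade detector. It is a toy stand-in for the far more capable gait, face and clothing recognition systems described above, and the input filename is hypothetical.

```python
# Illustrative only: scan a recording for faces with OpenCV's bundled
# Haar cascade. Agency-grade systems are far more capable; this toy
# merely shows that automated scanning of footage is trivial to set up.
import cv2  # requires the opencv-python package

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
video = cv2.VideoCapture("footage.mp4")  # hypothetical recording

frame_no = 0
while True:
    ok, frame = video.read()
    if not ok:
        break  # end of recording (or file failed to open)
    if frame_no % 25 == 0:  # sample roughly one frame per second at 25 fps
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            print(f"frame {frame_no}: face at ({x}, {y}), {w}x{h} px")
    frame_no += 1
video.release()
```

Scaled across every camera in a scheme, and paired with a face-matching model rather than a mere detector, the same loop becomes a log of who attended which building and when: exactly the attendance and association data described above.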

While the criteria for using bulk datasets should in theory offer safeguards, agencies can find themselves under political and institutional pressure to produce results, which can lead to mistakes.

A trust risk

The seizure of CCTV data from this Home Office scheme would constitute the mass surveillance of people at a place of worship, merely for attending that place of worship. We may hope that this would be taken into account when a warrant allowing the data to be seized is considered, but it is not clear that it would prevent a warrant from being issued. It is also not possible for the government to credibly rule out such seizures, as the powers are exercised in secret, and governments change quite regularly. The possibility of a deeply damaging breach of trust is therefore significant. This increases the need for transparency from the Home Office, but also highlights the dangers of the bulk data powers that the government has awarded itself, and the potential limitations of oversight.

Retention and oversight

If data were seized by the security agencies, it would be important to know how long it was kept and on what justification. This would be part of the discussion with the judges issuing warrants. It should also be discussed with parliamentary oversight bodies. A further concern is that the retention periods requested by agencies are likely to be long or indefinite, which changes the nature of the data’s use. A Technical Advisory Board assesses the risks when warrants are requested. As time goes on and machine learning (artificial intelligence) techniques improve, the intrusiveness of the uses to which retained data can be put increases.

Oversight of agencies’ and police use of CCTV is becoming more difficult, which would be a problem if, for instance, police were to collect any data from Esotec. The current DPDI Bill will allow police databases containing CCTV images collected for ANPR or other general purposes to be simply repurposed for national security purposes and redeployed, beyond the sight of the ICO. Meanwhile, the Biometrics and Surveillance Camera Commissioner is also to be abolished. It is unclear whether any oversight of such repurposed data will exist in practice.

What you can do

  • If you already have one of the Esotec systems installed, you can get in touch and send us details of the system. In particular, we would like to know whether footage is always cloud-based or stored locally, or whether there are options that can be configured by the system owner.
  • If you are considering using the Home Office scheme with Esotec, please ask for details of their data storage options, and if you are able, supply details of these to us at ORG.
  • If you have been considering using the Home Office scheme, or attend a place of worship where Home Office funded CCTV has been installed, please raise these concerns with the relevant decision makers.
  • Ask for clarity on this scheme from the Home Office by writing to your MP. Your MP can ask the Home Office to explain how the system works, whether it is cloud-based or can store footage locally, and in what circumstances powers under the IP Act might be exercised.