Ctrl + Shift + Delete: The GDPR’s Influence on National Security Posture

While the GDPR's "right to be forgotten" expands personal data privacy rights, it could complicate intelligence agencies' data collection efforts by allowing terrorists to request that social media platforms delete identifiable data.
A 3D-printed Twitter logo is seen in front of a computer screen on which an Islamic State flag is displayed. REUTERS/Dado Ruvic

Lexie N. Johnson is a Cyber Officer in the U.S. Army. 

The views expressed in this article are solely those of the author and do not necessarily reflect the perspectives of the United States Army Cyber Command.

The European Union (EU) shook the information technology and business worlds when it implemented the General Data Protection Regulation (GDPR) in 2018. In an effort to extend data privacy rights to the internet, the GDPR enacts several provisions that return the powers of data discretion and consent to EU citizens. One of its most notable provisions is Article 17, more commonly known as “the right to be forgotten,” a modern fundamental right that highlights the legal tension between human rights and security measures. In the case of Article 17, the balance falls in favor of human rights.

While human rights activists rejoice at the expansion of data privacy rights, Article 17 complicates intelligence agencies’ efforts by allowing the erasure of data relating to an identifiable person, meaning someone who can be discerned by reference to data such as a name, identification number, location data, an online identifier, or the specifics of a person’s physical, physiological, genetic, mental, economic, social, or cultural identity. This information, collected by private companies in social media, telecommunications, medicine, banking, and academia, was previously accessible to government agencies either without restriction or with a warrant, subpoena, or court order. Because of this data access, and under pressure to be more transparent about their collection methods, European intelligence agencies like Europol and INTCEN have further funded and bolstered their open-source intelligence (OSINT) capabilities. Yet these agencies may find their OSINT efforts legally encumbered as EU data subjects exercise their newfound Article 17 rights.

For example, the threat of homegrown terrorism throughout Europe has increased as groups suffering territorial losses, like Al-Qaeda and the Islamic State, ramp up media operations to keep their causes alive on the internet, exposing EU citizens to more Salafi-jihadi propaganda and radicalization operations online. As terrorist organizations face continued resistance from Western security forces, jihadi operations will likely rely more heavily on the anonymity and ambiguity of cyberspace for information, fundraising, radicalization, and planning. By covering personal data footprints, data privacy legislation like the GDPR’s Article 17 will join the ranks of encrypted messaging services, the Dark Net, and password-protected forums in helping terrorists electronically evade intelligence services.

Take Twitter as an example. An EU citizen with terrorist aspirations can erase identifiable information tied to their Twitter account by requesting that Twitter delete the data it has previously collected and inferred. Data that a terrorist would be interested in erasing, and that would be of value to intelligence communities, may include metadata, conversations and media shared in direct messages between users, device tokens, email and phone number contact lists, IP address audits, and Twitter’s inferred data such as user age, gender, languages spoken, and interests.

For an EU citizen undergoing Islamic radicalization via social media and considering devotion to jihad, Twitter’s collected and inferred data, based on likes, tweets, and ad engagement, may be unsettling. The aggregate of collected and inferred data reveals enough about the terrorist that an agency with sophisticated OSINT capabilities can identify the terrorist physically, physiologically, economically, socially, or culturally, even if the terrorist maintains an anonymous account. To avoid identification, the terrorist may exercise their Article 17 rights by withdrawing consent to Twitter’s data collection and requesting that the data already collected be permanently erased. Once Twitter processes the request and erases the data, intelligence agencies can no longer use Twitter’s existing data or its flow of incoming data to follow the terrorist, possibly hindering EU security agencies’ efforts to identify, monitor, and prevent cases of homegrown terrorism before an attack occurs.

One of the GDPR’s central problems is its lack of specificity on how collectors should process Article 17 requests. Article 17 provides six legitimate conditions under which an EU citizen’s personal data must, upon request, be erased from all of a collector’s storage media.

The only exception to this obligation is the national security exemption housed in the GDPR’s Article 23. It provides that collectors subject to EU law can disregard an EU citizen’s right to data erasure on grounds of national security, defense, and public security. However, Article 23 fails to outline exactly how, to what extent, and for how long collectors can set aside Article 17 rights under such conditions. This lack of specificity leaves collectors in the dark when carrying out their due diligence in vetting Article 17 erasure requests against the Article 23 national security exemption. Moreover, companies are incentivized to fulfill Article 17 requests quickly. It takes more labor to withhold erasure on the basis of Article 23, and companies face stiff financial penalties for refusing a legitimate request. It is less time-consuming, less risky, and less costly for private entities to fulfill data erasure requests as they receive them than to vet each request for the national security exemption.
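To make that incentive structure concrete, the sketch below, written in Python purely for illustration, shows how a collector might triage an incoming erasure request. The names used here (ErasureRequest, has_security_hold, erase_all_records) are hypothetical and do not correspond to any platform’s actual workflow or to text in the GDPR; the point is that the Article 23 check is an extra, discretionary step that a collector can rationally skip.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class ErasureRequest:
    """A hypothetical record of an Article 17 erasure request as a collector might log it."""
    subject_id: str   # the data subject making the request
    grounds: str      # e.g., "consent withdrawn"


def triage_request(
    request: ErasureRequest,
    has_security_hold: Callable[[str], bool],   # assumed hook to a vetting authority that does not exist today
    erase_all_records: Callable[[str], None],   # deletes the subject's data from every storage medium
) -> str:
    """Illustrative triage of an erasure request against the Article 23 exemption.

    The cheap, low-risk path is the final two lines; the conditional above them
    is the extra diligence the GDPR leaves unspecified.
    """
    if has_security_hold(request.subject_id):
        # Article 23 path: retain the data, document the exemption, and accept the
        # legal risk of refusing what may turn out to be a legitimate request.
        return "retained under the national security exemption"
    # Article 17 path: fulfill the request as received.
    erase_all_records(request.subject_id)
    return "erased"


# Example usage with stub hooks standing in for real systems:
if __name__ == "__main__":
    req = ErasureRequest(subject_id="user-123", grounds="consent withdrawn")
    print(triage_request(req, has_security_hold=lambda _id: False,
                         erase_all_records=lambda _id: None))
```

Because nothing outside the collector supplies the security-hold check in this sketch, each platform effectively decides for itself when the exemption applies, which is the power-sharing problem discussed below.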

With Article 23 underdeveloped, Article 17's power remains largely unchecked, forfeiting significant data erasure power to data subjects and private platforms. There is no larger governmental or EU executive body to assist collectors in determining the legitimacy of an erasure request, recording requests, or, most importantly, vetting the application of Article 23’s national security exemption. Applying Article 17 as it stands today further normalizes the growing power-sharing arrangement between the public and private sectors. It decentralizes the responsibility of personal data protection so that independent collectors make decisions that affect national security. Examples of this power-sharing model between government and industry include Facebook’s expanding duty to detect and remove extremist content from its platform and Google’s obligation to remove illegal and unsavory web content.

As the GDPR shock settles, states must decide how to apply Article 17 with security in mind. They must either reject the current Article 17 procedure before it normalizes private-sector responsibility for such data decisions, or prepare their national security agencies and international agencies like Europol and INTCEN for more complicated OSINT collection procedures, increased information sharing with commercial entities, and fewer meaningful search results.

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) License.