The GDPR and (not) Regulating the Internet of Things


The European Union’s General Data Protection Regulation (GDPR) has been described as a “gold standard” for protecting personal privacy in the Internet age. Among its core principles is a requirement for the consent of individuals to the collection and processing of their personal data. Consent must be freely given, specific, informed, and unambiguous. I argue here that the possibility of such consent is undermined by increasingly ubiquitous Internet of Things (IoT) devices that collect a vast array of personal data, and by the use of automated data processing that can produce significant social and legal impacts on individuals and groups. I outline the requirements of consent under the GDPR and describe the challenges to the GDPR’s privacy protection principles in a world of rapidly evolving IoT technologies.


The General Data Protection Regulation (GDPR) was approved by the European Parliament in 2016 following four years of deliberation, and enacted in 2018. It supersedes the European Union’s 1995 Data Protection Directive, which was written to address privacy concerns in electronic communications (InfoLawGroup, n.d.). The GDPR establishes important new privacy rights for citizens of the EU in the context of digital data systems and the Internet: the rights to be informed of personal data collection, to access the data so collected, to rectify inaccurate or incomplete personal data, and to request its erasure (the “right to be forgotten”). Importantly in today’s context of the Internet of Things (IoT) and artificial intelligence (AI), it also establishes the right not to be subject to automated data processing and decision-making, such as the use of complex and opaque algorithms for profiling that may shape life opportunities and limits for individuals and groups (EU Intersoft Consulting, n.d.; European Union, n.d.). Since personal data collection and processing are common practices across the globe, the GDPR applies to non-governmental organizations from any nation whose operations involve the personal data of EU citizens (European Union, 2018). As a result, the GDPR has been setting standards for Internet-based business practices affecting privacy around the world. Its provisions have had substantial influence on privacy legislation passed in several U.S. states, including the California Consumer Privacy Act of 2018 (State of California, 2018), Virginia’s Consumer Data Protection Act (Commonwealth of Virginia, 2021), and the Colorado Privacy Act (State of Colorado, 2021).

The Internet of Things (IoT) is a general term for a set of technologies promising an abundance of network-connected services and conveniences, such as smart homes, smart energy management, smart speakers, smart watches, and smart technologies for shopping, health care, transportation, traffic control, logistics, utility management, agriculture, and environmental monitoring. Law enforcement agencies make growing use of IoT to monitor public spaces, such as CCTV cameras linked with facial recognition software, and militaries deploy smart drones and robots. IoT devices use sensors to collect data about their environments and the people and activities in them. Personal IoT devices collect the user’s identity, location, behaviors, habits, preferences, and biometrics, often without the individual’s awareness, and share data with network systems and services that aggregate and process it.
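To make the scope of this collection concrete, the following sketch shows the kinds of fields a single telemetry record from a personal IoT device might contain. The field names and values are hypothetical, for illustration only, and are not drawn from any real vendor’s API:

```python
# Hypothetical sketch of the kinds of personal data a consumer IoT
# device (e.g., a smart speaker paired with a fitness tracker) might
# report to a cloud service. All field names are illustrative.

telemetry_record = {
    "device_id": "spkr-0042",             # persistent device identifier
    "user_id": "u-981724",                # links the data to an individual
    "timestamp": "2021-11-27T08:14:03Z",  # when the reading was taken
    "location": {"lat": 50.85, "lon": 4.35},  # geolocation
    "audio_event": "wake_word_detected",  # behavioral data
    "heart_rate_bpm": 72,                 # biometric data
}

# Several of these fields count as "personal data" under GDPR Article 4(1):
# any information relating to an identified or identifiable natural person.
personal_fields = {"user_id", "location", "audio_event", "heart_rate_bpm"}
```

Even this minimal record ties identity, location, behavior, and biometrics together in a single transmission, which is what makes downstream aggregation so consequential.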

The GDPR does not mention the IoT in its text. It is technology-agnostic, as it was intended to regulate privacy-affecting technologies now and in the future. But since the GDPR was passed, the number of IoT devices worldwide has grown from about 4.6 billion in 2016 to about 13.8 billion today, and is projected to total more than 30 billion by 2025 (Statista, n.d.). As there is no international cybersecurity standard for the design, security features, and manufacture of IoT devices, they introduce potentially unknowable personal data privacy and security risks.

But as Sandra Wachter (2017) explains, collection of personal data is only one part of the regulatory challenge. Data from IoT systems is often combined with many other datasets, and the combined data is subject to analysis by artificial intelligence (AI) applications that generate inferential profiles of individuals, which form the basis for automated decisions about access to social and economic resources and opportunities. “In other words, profiling creates and modifies information about the user, which becomes part of the individual’s identity and shapes her future treatment within IoT systems, and more broadly within the infosphere” (Wachter, 2017).

IoT devices are part of a large array of systems that capture personal data, including web browsers, social media, store loyalty cards, credit cards, and law enforcement, judicial, and financial systems. As we interact with these systems, we unknowingly develop what Luciano Floridi (2011) has termed our “informational identity,” composed of all the information that exists about us. Different aspects of our informational identity are accessed by different systems to automate decisions about our privileges and limits. Many such systems use “black box” AI models, “meaning that humans, even those who design them, cannot understand how variables are being combined to make predictions” (Rudin & Radin, 2019). As Stoica, Han, and Chaintreau (2020) and others have shown, black box models can mirror and even amplify bias in high-stakes decisions, inflicting harm on already disadvantaged groups.
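A minimal sketch, using entirely hypothetical data and identifiers, shows the basic mechanism: records from unrelated systems are joined on a shared identifier into a profile that no single system collected on its own.

```python
# Hypothetical sketch of cross-system profile aggregation. The datasets,
# user IDs, and attributes are invented for illustration.

iot_sleep_data = {"u-981724": {"avg_sleep_hours": 4.9}}
loyalty_card_data = {"u-981724": {"frequent_purchases": ["energy drinks"]}}

def build_profile(user_id):
    """Merge records from independent data sources into one profile."""
    profile = {}
    for source in (iot_sleep_data, loyalty_card_data):
        profile.update(source.get(user_id, {}))
    # A downstream model might infer, say, a health-risk score from this
    # combined profile and feed it into an automated decision -- an
    # inference neither source system could support alone.
    return profile

profile = build_profile("u-981724")
```

The join itself is trivial; the regulatory difficulty is that the data subject consented (at most) to each collection separately, not to the inferences the merged profile makes possible.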

Scott R. Peppet (2014) identifies four primary concerns for effective regulation within this rapidly evolving scenario: “discrimination, privacy, security, and consent.” While these issues are closely related, I focus here on the question of consent as a priority for regulatory intervention.

Article 7 of the GDPR describes the “conditions of consent” for the processing of personal data, which are further specified in its Recital 32. To be legally valid, consent must be given by “a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject’s agreement to the processing of personal data relating to him or her” (EU Intersoft Consulting, n.d., Recital 32). It is the responsibility of the data controller, defined as the entity that determines how personal data is processed, to show that the data subject has expressed valid consent. Further, a written request for consent “shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language” (EU Intersoft Consulting, n.d., Art. 7 GDPR).
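The four conditions are cumulative: consent that fails any one of them is invalid. A schematic sketch of that logic (a deliberate simplification of the underlying legal tests, not a substitute for them) might look like this:

```python
# Schematic model of the four cumulative consent conditions in GDPR
# Recital 32. The boolean fields stand in for legal determinations that
# are, in practice, far harder to establish.

from dataclasses import dataclass

@dataclass
class ConsentRecord:
    freely_given: bool  # no coercion, not bundled with unrelated terms
    specific: bool      # tied to a named processing purpose
    informed: bool      # subject told who processes which data, and why
    unambiguous: bool   # a clear affirmative act, not silence or pre-ticked boxes

def is_valid_consent(c: ConsentRecord) -> bool:
    """All four conditions must hold; failing any one invalidates consent."""
    return c.freely_given and c.specific and c.informed and c.unambiguous
```

The conjunction is the point: a screenless IoT device that cannot even present a consent request in “clear and plain language” fails the “informed” condition before the others are reached.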

These requirements are problematic at the most basic level for IoT technologies, which are often small, screenless devices without manual input mechanisms. Printed labeling on consumer IoT devices and packaging generally does not include any language at all concerning data collection or processing associated with the device (Peppet, 2014). Among consumers interested in purchasing IoT devices, Emami-Naeini et al. (2019) found that “(t)hose who sought privacy and security information before purchase, reported that it was difficult or impossible to find.” Here it is important to consider the language of GDPR Article 22:

“The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her” (EU Intersoft Consulting, n.d., Art. 22 GDPR).

The data controller is responsible for respecting this right. Any valid consent request by the controller would need to provide both an ex ante explanation of the automated data processing (before it occurs), and an ex post explanation of the outcome of data processing (after it occurs), “including both system functionality and the rationale for a specific decision” (Wachter, Mittelstadt & Floridi, 2016). Given the complex nature of AI-driven data processing and the inscrutable logic of its outcome, any IoT consent notice lacking such explanations would represent what Cory Doctorow refers to as “consent-theatre” (Doctorow, 2021).


The GDPR lays the burden of proof for compliance on the data controller, who is placed in a position of responsibility for protecting the rights of individuals in a digital environment with inherent privacy risks. In principle, the controller is obligated to inform data subjects of critical issues that may affect not only the privacy of their data, but also the potential impact of profiling and automated decision-making. In reality, data technology has moved at the speed of innovation, while regulations continue to move at the much slower speed of legislation.

The GDPR established new and important rights for individuals in relation to practices that mine and use their personal data for commercial purposes, and remains a gold standard for privacy protection in the digital age. It was intended to meet the cyber privacy challenges of the present and future. But with the rapid development and adoption of IoT technologies, Big Data, and Artificial Intelligence, it has already fallen behind in important ways identified here.

A number of regulatory steps could be taken to begin addressing some of the known privacy risks from IoT systems and data practices. The GDPR can play an important role in this, but based on the analysis presented here, the GDPR cannot do it alone.

Regulation of IoT devices is a starting point: establishing cybersecurity standards and requirements for comprehensive and truly informed consent. The EU Cybersecurity Act begins to address this by supporting security-by-design and privacy-by-design principles through the European Union Agency for Cybersecurity (ENISA) (European Union, 2019). The act should be amended to charge ENISA with developing standards for user notification and comprehensive consent concerning the data collected by IoT devices and how it is aggregated and processed by downstream data systems. Enforcement of these standards must be given real teeth, such as the fines levied against companies that violate requirements of the GDPR.

Article 35 of the GDPR requires data protection impact assessments (DPIAs) for processing that “is likely to result in a high risk to the rights and freedoms of natural persons” (EU Intersoft Consulting, n.d., Art. 35 GDPR). But as the analysis presented here suggests, the risks of automated processing may not be knowable in advance. More to the point, a DPIA, once conducted, is not shared with individual users of the technology, who may have consented to its use without this knowledge. Article 35 should be amended to require user notification of DPIA results, with additional opportunities to accept or decline consent.

These are small steps, and not simple to implement. They are also inadequate to the challenge, as IoT technologies, the data streams they facilitate, and the unpredictable outcomes of processing and decision-making by AI applications continue to gain leverage in our lives.


Commonwealth of Virginia. (2021). Consumer Data Protection Act.

Doctorow, C. (2021, November 26). UK ICO: surveillance advertising is dead.

Emami-Naeini, P., Dixon, H., Agarwal, Y., & Cranor, L. F. (2019). Exploring How Privacy and Security Factor into IoT Device Purchase Behavior. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1–12.

European Data Protection Supervisor. (n.d.). The History of the General Data Protection Regulation. Retrieved November 27, 2021.

European Union. (n.d.). General Data Protection Regulation (GDPR) Compliance Guidelines. GDPR.Eu. Retrieved September 27, 2021.

European Union. (2018, November 18). Does the GDPR apply to companies outside of the EU? GDPR.Eu.

European Union. (2019, April 17). Cybersecurity Act.

EU Intersoft Consulting. (n.d.). Art. 7 GDPR – Conditions for consent. General Data Protection Regulation (GDPR). Retrieved November 28, 2021.

EU Intersoft Consulting. (n.d.). Art. 22 GDPR – Automated individual decision-making, including profiling. General Data Protection Regulation (GDPR). Retrieved November 28, 2021.

EU Intersoft Consulting. (n.d.). Art. 35 GDPR – Data protection impact assessment. General Data Protection Regulation (GDPR). Retrieved November 29, 2021.

EU Intersoft Consulting. (n.d.). General Data Protection Regulation (GDPR) – Official Legal Text. General Data Protection Regulation (GDPR). Retrieved September 27, 2021.

EU Intersoft Consulting. (n.d.). Recital 32 – Conditions for Consent. General Data Protection Regulation (GDPR). Retrieved November 28, 2021.

Floridi, L. (2011). The Informational Nature of Personal Identity (SSRN Scholarly Paper ID 3840132). Social Science Research Network.

InfoLawGroup. (n.d.). GDPR: Getting Ready for the New EU General Data Protection Regulation. InfoLawGroup LLP. Retrieved November 28, 2021.

Peppet, S. R. (2014). Regulating the Internet of Things: First Steps Toward Managing Discrimination, Privacy, Security & Consent (SSRN Scholarly Paper ID 2409074). Social Science Research Network.

Rudin, C., & Radin, J. (2019). Why Are We Using Black Box Models in AI When We Don’t Need To? A Lesson From An Explainable AI Competition. Harvard Data Science Review, 1(2).

State of California. (2018, October 15). California Consumer Privacy Act (CCPA). State of California – Department of Justice – Office of the Attorney General.

State of Colorado. (2021). Colorado Privacy Act.

Statista. (n.d.). Global IoT and non-IoT connections 2010-2025. Statista. Retrieved November 27, 2021.

Stoica, A.-A., Han, J. X., & Chaintreau, A. (2020). Seeding Network Influence in Biased Networks and the Benefits of Diversity. Proceedings of The Web Conference 2020, 2089–2098.

Wachter, S. (2017). Normative Challenges of Identification in the Internet of Things: Privacy, Profiling, Discrimination, and the GDPR (SSRN Scholarly Paper ID 3083554). Social Science Research Network.

Wachter, S., Mittelstadt, B., & Floridi, L. (2016). Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation (SSRN Scholarly Paper ID 2903469). Social Science Research Network.