I'm going to start making a habit of posting bits and pieces I do here that should be public, in case they are of interest to anyone. I wrote a short, partial response to the recent guidelines from the European Data Protection Board on video surveillance. It doesn't cover all the issues, but replying to consultations is an important part of counterbalancing industry lobbies and voices, which are typically resourced to reply to these en masse and can drown out overwhelmed academic and civil society voices.

Introductory remarks

This document comments on the draft EDPB Video Surveillance Guidelines, an interpretative document concerning the implementation of the General Data Protection Regulation by regulatory bodies.[1]

Special category data

It is welcome that the EDPB has remained consistent with its previous guidelines in treating the inference of special category data as equivalent to collecting it.[2] I have one significant concern, however: in practice it is important to elaborate on the term 'deduce' that is used in this context. The guidelines note that:

if the video footage is processed to deduce special categories of data Article 9 applies.

One persistent challenge here is that the notion of inference or deduction in the digital economy is flawed. This guideline assumes some explicit intermediate step, where a data controller transforms a video into, say, information about ethnicity or health, and then from there uses that generated data to make a decision (potentially within the remit of Article 22, GDPR) or apply some other measure.

In practice, there has been a large rise in what can be called end-to-end prediction, where the input data consists of the 'raw' pixels and the output data is a label, such as whether somebody should be stopped and searched for acting suspiciously or in a way that would be considered 'deviant'.

In these cases, inference is a metaphor or frame of thinking that fails. The correct frame of thinking is an optimisation system.[3] An optimisation system does not explicitly infer the intermediate steps, and is as a result amenable to drawing on information that is highly correlated with special category data without explicitly attempting to generate it.[4] Consequently, the EDPB's approach of focusing on intent is not sufficient on its own.
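
To illustrate the distinction, here is a minimal, hypothetical sketch (in PyTorch; the architecture and all names are my own assumptions, not drawn from any deployed system) of an end-to-end classifier that maps raw video frames straight to a 'stop and search' score. No intermediate variable for ethnicity, health or any other special category is ever explicitly computed, yet the optimiser remains free to exploit any pixel patterns correlated with those categories if doing so reduces the training loss.

```python
# Illustrative sketch only: an 'end-to-end' classifier from raw pixels to a
# single label, with no explicit intermediate 'deduction' step to point to.
import torch
import torch.nn as nn

end_to_end_model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=5, stride=2),  # raw RGB pixels in
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=5, stride=2),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 1),                           # single 'deviance' score out
)

frame = torch.rand(1, 3, 224, 224)              # one video frame
suspicion_score = torch.sigmoid(end_to_end_model(frame))
# Contrast this with a pipeline that explicitly generates an intermediate
# attribute (e.g. ethnicity) and then decides: here there is no such step,
# which is why a test based on intent or explicit deduction is hard to apply.
```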

The EDPB should clarify in section 5 that processing which relies on informational signals substantially correlated with special categories of data will also trigger a need to seek a condition in Article 9, and that, in accordance with the accountability principle, it is the responsibility of the data controller to evidence that this is not occurring when such profiling is possible through optimisation systems drawing on automated video analysis.

Recital 51 supports this analysis, as an optimisation system can be considered a 'specific technical means' allowing the information in special categories of data to be operationalised.

A further problem is that paragraph 79 of the report could be read incorrectly as meaning that those types of systems, which include 'affective computing' systems and other facial analysis systems, do not in general require explicit consent. This may be true by virtue of the lack of biometric data processing, but insofar as those systems attempt to infer or draw upon other special categories, such as health (eg gait analysis), ethnicity (eg facial analysis), political opinions or sexuality (eg potentially by analysing clothing, or through lip-reading), then this would trigger Article 9.

The EDPB should clarify around paragraph 79 that advertisements based on visual customer analyses may trigger Article 9 on the basis of special categories of data other than biometric data.

Excessive requests

Paragraph 97 should be changed to reflect the wording of Article 12, which is 'manifestly excessive' rather than just excessive. It must additionally highlight that this relates 'in particular because of their repetitive character'. These provisions refer not to the content of the request (they do not replicate a 'disproportionate effort' exemption) but instead to the character of the request itself, prima facie.

It should, for example, be explicitly stated that controllers cannot deny a request for data because they do not possess an easy way to send the data to the data subject, such as over the internet or on a USB drive. It is the data controller's responsibility to be technically able to respond to access requests.

As the Article 29 Working Party has already noted in the context of the right to data portability:

the overall system implementation costs should neither be charged to the data subjects, nor be used to justify a refusal to answer portability requests[5]

The above clarifications should be emphasised again in the guidelines for consistency.

Data Protection by Design and by Default

The Guidelines omit an important factor of Data Protection by Design: that such systems must be designed not only with confidentiality and privacy in mind, but with data subject rights in mind. This is what distinguishes 'data protection by design' from 'privacy by design'.[6]

The guidelines should mention that:

  • video systems should be designed for effective access, objection and erasure, including the ability to search through footage to find relevant data subjects (see the illustrative sketch after this list)
  • organisational systems should be designed to give on-the-ground employees the organisational competence to make real right-to-object balancing-test decisions on a case-by-case basis, and the authority and competence to carry through with the objection. This is very important; otherwise the right to object is illusory.
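
On the first point, a minimal, hypothetical sketch follows (the index structure and all names are my own assumptions, not anything prescribed by the guidelines) of how a video system might keep a searchable index of detected appearances, so that a controller can locate, disclose or erase the footage segments relevant to a particular data subject when responding to a rights request.

```python
# Hypothetical sketch: an appearance index supporting access and erasure
# requests against stored video. The person-detection step itself is
# abstracted away; the point is that rights handling is designed in.
from dataclasses import dataclass, field


@dataclass
class Appearance:
    camera_id: str
    start_s: float   # offset into the recording, in seconds
    end_s: float
    clip_path: str   # where the corresponding footage segment is stored


@dataclass
class AppearanceIndex:
    # Maps an internal subject reference (e.g. a pseudonymous ID assigned
    # once the subject has identified themselves) to their appearances.
    _index: dict[str, list[Appearance]] = field(default_factory=dict)

    def record(self, subject_ref: str, appearance: Appearance) -> None:
        self._index.setdefault(subject_ref, []).append(appearance)

    def access_request(self, subject_ref: str) -> list[Appearance]:
        """Return every indexed segment in which the subject appears."""
        return list(self._index.get(subject_ref, []))

    def erasure_request(self, subject_ref: str) -> list[str]:
        """Drop the subject from the index; return clips to delete or redact."""
        return [a.clip_path for a in self._index.pop(subject_ref, [])]
```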

  1. European Data Protection Board, 'Guidelines 3/2019 on Processing of Personal Data through Video Devices (Version for Public Consultation)' (10 July 2019) link. ↩︎

  2. Article 29 Working Party, 'Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679 (WP251rev.01)' (6 February 2018). ↩︎

  3. Rebekah Overdorf and others, 'POTs: Protective Optimization Technologies' [2018] arXiv:1806.02711 [cs] link. ↩︎

  4. See generally Solon Barocas and Andrew D Selbst, 'Big Data's Disparate Impact' (2016) 104 Calif Law Rev 671. ↩︎

  5. Article 29 Working Party, 'Guidelines on the Right to Data Portability (WP 242)' (13 December 2016). ↩︎

  6. Michael Veale and others, 'When Data Protection by Design and Data Subject Rights Clash' (2018) 8 Int Data Priv Law 19. link ↩︎

Photo: author (released here under a CC-0 1.0 License).