Data Protection Is Not Enough


In the wake of the Snowden revelations, the European Union passed the most advanced and thorough data protection legislation to date, the General Data Protection Regulation (GDPR).

It will be enforced from 25 May 2018, at which point non-compliant organisations may face heavy fines. And yet the threat to our liberties does not end there, because there are several scenarios in which the GDPR isn't of much help. Ironically, it does nothing about the very problems Edward Snowden exposed.

Is the EU-GDPR providing sufficient protection against …

0.1. … individuals breaching the privacy of their peers by exposing social data such as contacts, address books and conversations?


No. Article 2.2c specifically exempts them from liability. In theory, companies should reject such data, but will they? What if they declare such data essential to their business model?

0.2. … companies luring individuals into breaching the privacy of their peers?


Apparently not. Companies are only obliged to tell individuals that their address books will be harvested, leaving it to the individuals to understand and care about the societal implications. At best one could try to enforce the principle of data parsimony ("data minimisation"), but social services would still argue that social data is essential to their business model. Would corporations be allowed to harvest address books so as to send unsolicited invitation mails to potential clients? Presumably yes, because the breach caused by peers is legal.

0.3. … government authorities exercising omniscient data collection and harvesting, thus posing a long-term threat to democracy?


According to Article 2.2b, government activities outside the scope of Union law are exempted from the provisions of the GDPR. Article 2.2d even specifies that law-enforcement and related authorities are exempted from GDPR obligations when fulfilling a duty of protecting "public security". Article 23 offers further ways to legislate national exemptions.

We can only hope that better regulation will be created to specifically address law enforcement, considering that its ability to access, falsify or remove digital evidence is already creating a serious problem of accountability of the executive branch with regard to both judicial and legislative oversight. It would be a dangerous further imbalance if law enforcement were allowed to accumulate data en masse.

There is a vague hope that the principle of moving data processing and storage into the EU will make abuse more difficult, but we are talking about data valuable enough to be worth acting outside the law for. Any systems administrator who can walk out of the building with a memory stick can make themselves a fortune in Bitcoin.

Paragraph 2.4 then refers to existing older regulation for EU institutions. There is no provision regarding foreign government operations, probably because there would be no way to enforce one unless the Internet were encrypted by design in advance, as we suggest in our YBTI legislation proposal. "Data protection by design" (Article 25) is not as effective as mandatory end-to-end encryption combined with a technical provision by which social data never ends up on anybody's servers: it simply stays in the hands of the people participating.

0.4. … companies secretly breaching GDPR law by passing data on to third parties and governments, as required for example by the US Patriot Act?


The main problem with digitalisation is that there is hardly ever evidence strong enough to trigger consequences and punishment. Therefore there is little incentive not to prioritise US law over EU law, or to otherwise engage in criminal behaviour such as black-market wholesale of data collections. If individuals "watermark" their data by, for example, giving out a unique e-mail address to each service, would any judge accept that as proof of illegal data trading? How can someone prove not to have exposed that e-mail address to anybody else? How can you prove that the company sold your data when any service provider operating your e-mail could have done so instead? How are citizens supposed to defend their rights when all evidence is volatile and falsifiable?
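The watermarking trick mentioned above can be sketched in a few lines. This is a minimal illustration, assuming a mail provider that supports plus-addressing; the names, services and domain are hypothetical. As the text argues, such a watermark can tell *you* which sign-up leaked an address, but it is not court-grade evidence.

```python
# Sketch: one unique e-mail address per service, derived from a
# personal secret, so that spam arriving at an address reveals
# which sign-up (or its data buyer) leaked it.
import hashlib

def watermarked_address(user: str, service: str, secret: str,
                        domain: str = "example.org") -> str:
    """Derive a per-service tag that outsiders cannot guess."""
    tag = hashlib.sha256(f"{secret}:{service}".encode()).hexdigest()[:8]
    return f"{user}+{tag}@{domain}"

def identify_leaker(leaked: str, user: str, services: list[str],
                    secret: str, domain: str = "example.org"):
    """Given an address that received spam, find the sign-up it belongs to."""
    for service in services:
        if watermarked_address(user, service, secret, domain) == leaked:
            return service
    return None

addr = watermarked_address("alice", "shinysocial.example", "s3cret")
print(addr)  # alice+<8 hex chars>@example.org
print(identify_leaker(addr, "alice",
                      ["shinysocial.example", "webshop.example"], "s3cret"))
# prints "shinysocial.example"
```

Note that this only narrows the suspects: as the questions above point out, anyone who handled the address in transit, including the mail provider itself, could equally have leaked it.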

0.5. … companies collecting sufficient profiling data suitable for placement of political advertising in such a strategic way as to influence the subject's democratic vote?


The business of winning elections is too big and too interesting not to simply migrate to the black data market. The data is there; companies are merely required not to do stupid things with it. Unlike the YBTI legislation proposal, the GDPR expects no technical provision to make mass abuse impossible.[1]

Some may argue that the GDPR has plenty of provisions to make the sort of data collection that threatens democracy illegal, but does it actually keep a company like Facebook from knowing people's psychological weaknesses and placing advertisements accordingly? Given the importance of elections, how much does it matter what is legal? How would any data protection officer be able to prove any wrongdoing? If we want to recover democracy, we believe the only long-term protection is never to let this kind of data be collected anywhere.

0.6. … companies or governments conducting mass surveillance and mass real-time prediction of future voting outcomes, empowering political actors to optimise their election results?


In the case of governments engaging in post-democratic or otherwise totalitarian practices, the GDPR has zero provisions to offer, as stated in Articles 2.2b, 2.2d and 23.

In regard to commercial actors, the lure of the data residing on their cloud storage is great and the chances of getting caught are small. Yes, the GDPR makes many abuses illegal, such as using the data in a different way than originally intended (Article 5.1b). There may thus be no reasonable business case for that kind of GDPR breach, but all of the world's governments will be interested in pressuring such corporations to give them invisible access to data storage following the PRISM model. Also, many governments can directly access cloud storage without ever needing to interact with the companies, so the ideal constellation for them is one in which neither surveillance partner legally infringes the GDPR; they just bypass it spectacularly.

As long as a company is allowed to collect people's "likes", that is probably enough to empower an observing nation state to influence its voters. We haven't seen a research study in that regard yet, but the things a person likes or dislikes may provide enough psychological knowledge to detect their biases and vulnerabilities. We shouldn't wait until we find out the hard way.

0.7. … companies using Google Docs, Apple OS or Microsoft Windows for their back-office accounting (employee data etc.)?


Given the track record of Microsoft, Google and others in regard to PRISM, and the European Court of Justice ruling against the "Safe Harbor" provisions, one should expect such practices to become obsolete and all companies to upgrade to Linux systems for their accounting.

Our guess, however, is that companies will most likely keep infringing GDPR requirements in this regard, since it is hard to prove PRISM ever really happened; or, even better, governments will access their data silently behind their backs.

Reckless companies may continue to process personal data on PRISM operating systems and platforms as before. Not because they believe the surveillance isn't happening, but because it is unlikely there will ever be evidence strong enough for a consequential punishment. Even data protection officers are so confused by the general sorry state of technology that they may not perceive these operating systems as breaching the law, although they of course do.

For all of the above reasons, the GDPR is not enough. For a long-term solution that doesn't put democracy at risk, read up on the YBTI legislation proposal, which makes mass surveillance impossible by design.

[1] This paragraph was written before the Cambridge Analytica and AIQ scandals unfolded…
Copyright 2018 by the individual members of the YBTI collective. Released under CC-BY-NC-ND Creative Commons NonCommercial NoDerivatives License 4.0.

See also what Richard Stallman has to say on the subject.

First Version: 2018-02-14. Last Change: 2018-09-05

No advertising, no tracking, no profiling, no data mining, no fancy website.
Viewable as youbroketheinternet.cheettyiapsyciew.onion as much as