Darren Bond

In a post-GDPR world, most businesses seem to be improving how they handle and process personal data: taking proactive steps to make users’ rights clearer and being upfront about how collected data will be used. Let’s face it, as difficult as it’s been to come to terms with, I think it’s made us all a bit more honest.

That said, the tech giants of the world – those these laws were arguably designed to impact the most – appear inactive at best, and to be actively breaking the new rules at worst.


A couple of months ago, Google was fined a record €50m by the French data regulator (CNIL) for breaches of the rules, specifically for:

  • Lack of transparency
  • Inadequate information
  • Lack of valid consent regarding ads personalisation

I thought to myself: surely it’s time for Google to take this a bit more seriously…

Yet just a few weeks ago we found that Google ‘forgot’ to tell people that they had fitted microphones in their Nest security devices. *Face Palm Emoji*

Congress has asked for a face-to-face meeting, stating that:

“Google’s failure to disclose a microphone within its Nest Secure product raises serious questions about its commitment to consumer transparency and disclosure […]”

But unfortunately this doesn’t stop with Google…

None of your Business!

NOYB (None of Your Business) – a privacy activist group – found that Amazon, Apple, Spotify and YouTube all allow people to download a copy of their personal information quickly and easily – a step in the right direction – but that only some of the data was “intelligible”. The rest was supplied in a format that people could not understand – effectively breaching GDPR.


Facebook have made all kinds of promises since the Cambridge Analytica scandal and the various data breaches of last year, including expanding their data and security teams, giving users more visibility, and starting to control the spread of fake news more effectively.

However, this hasn’t stopped the investigators, and it’s now estimated that there are 10 major GDPR investigations into Facebook and its subsidiaries.

This year, Facebook will roll out its ‘clear history’ tool, which will obviously have an impact on advertising. Selfishly, my first thoughts were about our targeting pools becoming smaller and the data we can use for advertising purposes becoming less effective. But from a privacy perspective, is this really enough?

The real problem

The real problem with privacy is that people aren’t educated (even now) about how their data is being used. I ran a lecture at the University of Suffolk this week with Masters-level business students. While demoing the Facebook ad platform and showing them how easy it is to target users by demographics, location, and interests or behaviour, their gasps were enough to tell me that they had no idea how this stuff works. And these are well-educated, driven, experienced professionals, taking an MBA in their own time.

Providing a tool to ‘clear history’ is not going to have an impact on people who don’t understand what that history is, what it means to them, or what it allows Facebook – or us as advertisers – to do.

The positive angle on data sharing

My angle on privacy is that I’m happy to give it up if I’m provided with a great tool or service that improves my user experience. I’m fully in bed with Google… I know they have tonnes of my data, but I love that they wake me up earlier if there’s a problem with my commute, alert me to new restaurants I might be interested in, and turn my living room lights on and off at my voice command – all for free!

When I’m presented with ads online, I’d rather they were relevant to me and for things I’m interested in, not just spammy, irrelevant rubbish!

But I’m in an advantageous position: I work with ads and data, and I know what I’m giving up to get those benefits. Most ‘normal’ people don’t.

In my view, Google, Facebook and the other tech giants need to be 100% transparent with their users on this stuff – show them examples of what happens to their data and explain why – so that they can make an informed decision on whether to ‘clear history’. Otherwise, what’s the point? After all, if people were concerned enough, they wouldn’t use the platforms in the first place.
