It doesn’t matter that you’re careful
Just because you aren’t on Facebook, Instagram or WhatsApp doesn’t mean Facebook doesn’t know a lot about you – it does. You may not have consented to Facebook collecting data about you, but the actions of your friends, colleagues and other contacts mean your data is probably being collected anyway. Has your photo ever appeared on one of these platforms? Has your name? Your email address? If the answer to any of these questions is ‘Yes’, then you are being profiled. This is what’s known as a “shadow profile”.
Sounds creepy? You’re right, it is.
Let’s now turn our attention to Clubhouse, the audio-led social networking app du jour. This article by Will Oremus, writing for OneZero, describes how, when you sign up for Clubhouse, the app persuades you to let it upload all the contacts on your phone. One way it does this is by refusing to let you invite anyone else to the app unless you hand over those contacts. Nice!
Once you succumb and share your contacts (‘share’ is such an innocuous word, isn’t it?) the app will push you to invite them as well. It will tell you how many people on Clubhouse you already know and how many other people on Clubhouse know your contacts. Those contacts may include your partner or ex-partner, your children, your clients, or a health professional you once consulted. Do they know their details are being shared in this way?
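To make that concrete, here is a minimal sketch, in Python, of how a service could match an uploaded address book against its existing users and quietly keep a record of everyone else. It is purely illustrative: the function names and data structures are my own assumptions, not Clubhouse’s (or anyone else’s) actual implementation.

```python
# Illustrative sketch only: how a service *could* merge uploaded address books
# into one contact graph, including entries for people who never signed up
# ("shadow profiles"). Hypothetical code, not any real company's implementation.

import hashlib
from collections import defaultdict

def normalise(phone: str) -> str:
    """Strip formatting so the same number always looks the same."""
    return "".join(ch for ch in phone if ch.isdigit())

def hash_contact(phone: str) -> str:
    """Hashing hides the raw number in transit, but the service can still
    join records on it, so it offers little real privacy."""
    return hashlib.sha256(normalise(phone).encode()).hexdigest()

contact_graph = defaultdict(set)   # hashed number -> set of users who uploaded it
registered_users = {}              # hashed number -> user id of an existing member

def upload_address_book(uploader_id: str, phones: list[str]) -> dict:
    """Record every uploaded contact and report who is already on the service."""
    already_on_service, invite_candidates = [], []
    for phone in phones:
        h = hash_contact(phone)
        contact_graph[h].add(uploader_id)          # the shadow profile grows here
        if h in registered_users:
            already_on_service.append(registered_users[h])   # "people you know"
        else:
            invite_candidates.append(h)            # future invite targets
    return {"already_on_service": already_on_service,
            "invite_candidates": invite_candidates}

# A person who never installed the app still ends up in contact_graph as soon
# as one acquaintance uploads their number.
registered_users[hash_contact("+44 7700 900123")] = "user_42"
print(upload_address_book("user_99", ["+44 7700 900123", "+44 7700 900456"]))
```

The detail worth noticing is the last step: whether or not a number belongs to an existing user, it stays in the graph.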
And of course Facebook and other companies are known to have shared personal data more widely. Would you know if they did? Can you trust them if they say they won’t? Does Cambridge Analytica ring any bells? I could go on, but hopefully I’ve convinced you that privacy is about more than Instagram knowing that you liked a particular pair of shoes. It’s about much more than just you.
It’s not just about you
You might say that you don’t care about privacy because you’ve got nothing to hide. Fine, but it’s not all about you. Let’s take advertising, local journalism and democracy as a suitably weighty example.
Advertisers used to put their ads where they thought their customers were. Local advertisers would place their ads in local (paid-for) newspapers. Now, 80% of digital advertising spend in the UK goes to Google and Facebook. That means your local newspaper makes its journalists redundant, serves you more and more crappy ads on its website, lets the quality and variety of its journalism slide, and eventually goes out of business. Then who will hold the local politicians to account?
That is advertising spend drained away from quality journalism and redirected towards a site claiming to review the “10 best monitors for home working”, propped up by “Promoted content” inviting you to “Find out what Shannen Doherty looks like now”.
But the impact of the micro-targeted behavioural advertising world of Google and Facebook doesn’t stop there. Think of all those low-quality, bottom-feeding website publishers and content networks that scrape content from genuine publishers and repackage it, surrounded by ads supplied from who knows where, promoting extremist political views and conspiracy theories, or telling you what that star of Beverly Hills 90210 is up to now.
It’s way bigger than you think
All this talk of 1990s actresses risks distracting you from the seriousness of the situation. Sorry. Let’s crank things up a bit.
The advertising ecosystem created by the likes of Google and Facebook is built on extracting our data and giving advertisers the opportunity to use that data to manipulate our behaviour. Companies collect and use our data without paying for it, build products with it and then sell those products back to us, or offer that data to other companies so they can sell things to us.
This is happening on an incredible scale. The more information about us they have, the more power they have to manipulate us. And remember, it’s not just your data; it’s everyone’s. And it isn’t just Google and Facebook using it. They allow other companies to use that data as well.
Data collection is growing
These companies aren’t content just to collect what we look at and click on. Amazon claims its facial recognition system can now recognise fear, together with eight other emotions. Companies now have the potential to micro-target our emotions to manipulate us. Are you happy for Amazon to have your face for free? And know your emotions? And use them or share them with others so they can sell things to you, be that a toaster, a way of life or a political outcome?
As Francis Bacon famously said, “Knowledge is power”. What is most dangerous is that much of this information gathering goes unnoticed. Until recently, nobody was storming the seat of government with guns. This power is more sinister than that. It is subliminal. Often we don’t know it is happening until it is too late, when we look back at elections or referendums and ask, “How did that happen?”
What kind of society do you want to live in?
A few questions for you.
Do you want to live in a society where black people are more likely than white people to be misidentified by facial recognition technology? Where a political party can target voters in specific streets, or in particular demographics, in an attempt to dissuade them from voting?
Do you want technology companies you’ve never heard of to know more about where your kids have been today than you do? Will you turn a blind eye to betting companies exploiting a customer by tempting them with offers of free bets, just when they’ve promised their partner that they are going to lay off the online slots?
Do you want to live in a world where a commercial enterprise knows exactly where you are and what you are interested in, and bombards you with advertising until you spend money that you don’t have on something that you don’t need?
As Alastair Mactaggart said in a recent Wired article, “This is a lot of weight for one word to bear. ‘Privacy’ is the wrong word for privacy”. Because, remember, it’s not just about you. It’s way bigger than that. Do you care?