When Tristan Harris worked at Google in Silicon Valley, he became disillusioned enough to write a 144-page manifesto: A Call to Minimize Distraction and Respect Users’ Attention. “Never before in history have the decisions of a handful of designers (mostly men, white, living in San Francisco, aged 25-35) working at three companies [Google, Apple and Facebook] had so much impact on how millions of people around the world spend their attention,” he wrote. “We should feel an enormous responsibility to get this right.”
Most entrepreneurs start with an idea or a problem they think they can solve. They draft in engineers to build a product, grow a significant base of customers and clients, and make plans for future growth, investment, and new features. Data is collected at every stage. But no one has sat down and had a conversation about privacy. And that conversation should be happening right at the very beginning.
Why? Because privacy really matters. It matters to each and every one of us.
Data flows like water
In her book Privacy Is Power, Oxford University professor Carissa Véliz explores the philosophical underpinnings of privacy and why it matters not only for individuals but also for society. She describes it as “absurd” that technology companies can know our political leanings, who our family and friends are, what we eat, what we drink, our sexual orientation, and more, and then sell that data to brokers or to companies that use it to decide whether we get a loan, a house, insurance, or a job.
The dystopian potential for this kind of data sprawl is very real. “It’s very hard to stay in control of data. Data flows like water,” she said at a recent Wired event. “We need to ban the sale and sharing of personal data by companies. The advantages are few and the disadvantages are huge.”
Recent scandals involving Cambridge Analytica, the A-level exam results debacle, and debates around contact tracing via the Covid-19 app have forced some of these conversations into the public domain. People are more aware than they’ve ever been of the importance of privacy, and they are increasingly looking to support companies that take the issue seriously.
It’s not about having something to hide, or about well-crafted terms and conditions promising that data will be used responsibly today. It’s about what that information could be used for in the future. You might be happy to share your DNA with 23andMe, for example, but in doing so you are also sharing information about your children. And the reality is that you have no idea how that information could be used in 10 years’ time.
The need to build well
So how can we ensure privacy is protected?
At a basic level, we need a framework, a checklist, and people from all departments who are focused on this issue. Someone needs to look ahead and ask: what are the implications of what we’re building in the short, medium, and long term? If Facebook had put the right people in a room to think this through, the risk of fake news and misinformation would arguably have been identified, and perhaps protections would have been put in place ahead of time.
Innovators have a responsibility to take these factors into consideration. To build well in the first instance, rather than reactively mend the dam after it has sprung a leak. To continuously incorporate ethics assessments into every strategy and ask: what is fair, and what is the right thing to do? Rather than proceed with a “collect first, ask questions later” approach whose consequences cannot be undone.
Privacy by design demands up-front rigour and regular reviews – from research and conception to design, development, testing, and implementation. Tools such as the Data Ethics Canvas from the Open Data Institute give us a place to start having this conversation.
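To make that concrete, here is a minimal sketch of what such a checklist might look like in code. It is purely illustrative: the lifecycle stages come from the paragraph above, while the questions, names, and sign-off rule are hypothetical placeholders rather than the ODI’s actual canvas.

```python
# Illustrative sketch only: recurring privacy-review questions are
# encoded per lifecycle stage, and a stage cannot be signed off while
# any question remains unanswered. All names and questions are
# hypothetical examples, not a real standard.
from dataclasses import dataclass, field

# Stages named in the text: research and conception through implementation.
QUESTIONS = {
    "research": ["What personal data do we plan to collect, and why?"],
    "conception": ["Who could be harmed if this data leaked or were sold?"],
    "design": ["Could we achieve the same goal with less data?"],
    "development": ["Is personal data encrypted at rest and in transit?"],
    "testing": ["Are we exposing real user data in test environments?"],
    "implementation": ["Who reviews data use as features evolve, and how often?"],
}

@dataclass
class PrivacyReview:
    stage: str
    answers: dict[str, str] = field(default_factory=dict)

    def outstanding(self) -> list[str]:
        """Questions for this stage that no one has answered yet."""
        return [q for q in QUESTIONS[self.stage] if q not in self.answers]

    def sign_off(self) -> bool:
        """A stage completes only once every question has an answer."""
        return not self.outstanding()

review = PrivacyReview("design")
print(review.sign_off())  # False: the design question is still open
review.answers["Could we achieve the same goal with less data?"] = (
    "Yes: aggregate usage counts instead of per-user event logs."
)
print(review.sign_off())  # True: the conversation has happened
```

The point is not the code itself but the discipline it represents: the privacy conversation happens at every stage, on the record, before anything ships.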