Humans have a natural desire to retain some privacy, although individuals differ in what they would rather keep private, and in how strongly they feel about privacy concerns.
Although we can in theory decide the extent to which we let Google, Facebook etc. have our personal information in exchange for their services, in practice we have little choice but to enter into such exchanges.
And although many of us feel that we have 'nothing to hide' from the government, that argument is valid only as long as our view of what is criminal, for instance, coincides with the view of the state. Such coincidence was notably lacking in the 2019 Hong Kong street protests. Many young people might also take a different view of 'soft' drug use from that of British politicians.
Against that background, new EU rules - the GDPR - came into force in May 2018.
Facebook was fined the pre-GDPR maximum of £500k in 2018 for allowing Cambridge Analytica to access users' personal data without their explicit consent.
But one wonders whether any regulations can adequately protect the interests of consumers faced with increasing monetisation of personal data. The following extracts from a letter to the FT summarised concerns very well:
... The consumer will never own the data or the algorithms. ... Every moment, your data relating to browsing, calling, online, social media, location tracking and so on is being churned through a multiverse of data warehouses. If you have been browsing about a certain medicine, correlating to a call to an oncologist and a search for a nearby pharmacy, this can consequently be packaged as a data intelligence report and sold to your medical insurance company. This is just one of the myriad ways monetisation is being unleashed on unsuspecting consumers across the world.
The data protection regulations, although a step in the right direction, are usually still heavily tilted in favour of the corporate giants and still focused more on cross-border transfers than on the real risks of monetisation. The fines imposed on the Silicon Valley giants are minuscule compared with the money they have made from data monetisation efforts. And this is all achieved in the age that is still a forerunner to the era of artificial intelligence and quantum computing.
The very concept of data privacy is archaic and academic. The tech giants are moving faster than this philosophical debate about data privacy. All the sound-bites from the tech giants are mere smoke and mirrors. Unless we revisit our concepts of what is data privacy for this new age of data monetisation, we will never really grapple with the real challenges and how to enforce meaningful regulation that really sets out to protect the consumer.
Syed Wajahat Ali
Zeynep Tufekci drew attention to one interesting example in 2018. One medical company was buying individuals' temperature data that had been uploaded to another company which had supplied Internet-connected thermometers. It all sounded very benign but Ms Tufekci was concerned that the customers hadn't thought through the implications of having health data sold to anyone at all, including advertisers, and/or integrated into countless databases.
A Safety Code?
One possible way forward would be to recognise that many companies' privacy and data-handling rules are far too complex for most of us to understand. It might therefore make sense to develop government-mandated 'technology safety codes', just as governments enforce food safety and building regulations, recognising that most of us do not know, when we enter a restaurant, whether we are in danger of getting food poisoning or having the ceiling fall down on us.
Can data be owned?
It is interesting, by the way, to consider whether data can be (or is) owned like other property. It is hard to see that anyone can own the fact that something happened (Ms X bought item Y). And an electronic or other record of that fact can easily be copied - many times. And yet data is sold, which suggests that it has indeed become property. But who does it belong to? Does anyone have the right to control its destination?
Some of us, but far from all, know that our browsing history, saved in the form of cookies, can significantly affect the way in which companies deal with us. The most obvious example is the cookies saved by travel companies, which reveal whether we have previously enquired about a particular flight or holiday. If we have, then the company is less likely to offer us their best deal, believing that our repeated searches suggest that we are very likely to become a customer. (See for instance Travel site cookies milk you for dough in the Sunday Times - 24 June 2018.)
Does it matter? I would argue that this is no more than an automated example of the behaviour of any salesperson who has the ability to price a product (such as a car) according to their estimate of the customer's keenness to buy. But it's a bit less obvious - sneaky even.
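The pricing behaviour described above can be sketched in a few lines of code. The following Python fragment is purely illustrative: the function name, the cookie layout, and the markup threshold are all invented assumptions for the sake of the example, not any real company's logic.

```python
# Hypothetical sketch of cookie-informed pricing: a search count stored
# in the visitor's cookie nudges the quoted fare upward once the same
# route has been searched repeatedly. All names and thresholds here are
# assumptions made for illustration.

BASE_PRICE = 200.0  # notional base fare in pounds

def quote_price(cookie: dict, route: str) -> float:
    """Return a fare, marked up when the cookie shows repeated searches."""
    searches = cookie.get("searches", {}).get(route, 0)
    if searches >= 3:
        # Repeated searches suggest a keen buyer, so withhold the best deal.
        return round(BASE_PRICE * 1.10, 2)
    return BASE_PRICE

# A first-time visitor (empty cookie) sees the base fare...
print(quote_price({}, "LHR-JFK"))
# ...while a visitor whose cookie records four searches sees a markup.
print(quote_price({"searches": {"LHR-JFK": 4}}, "LHR-JFK"))
```

The point of the sketch is only that the salesperson's judgement of 'keenness to buy' reduces to a trivial rule over stored browsing data.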
A March 2019 article in the New Scientist reminded us that we give away some pretty fundamental information when sharing our DNA with companies such as AncestryDNA and 23andMe.