Privacy and identity are two sides of the same coin, argues Luciano Floridi. And yet, paradoxically, western governments are now eroding privacy in the interests of their own self-preservation. However, collecting data first and asking questions later is not a policy, says Floridi; it’s an affront to one of the foundations of liberal democracy.
Wojciech Przybylski: What is information?
Luciano Floridi: It’s clearly one of those polymorphic concepts, which, because it’s so widespread and useful, we use in many ways. Basically, there are three main concepts, which are related but different. One is information about facts and reality – I like to call it the queen of all concepts of information, because it’s the most widely used; this concept relates to everything from train timetables telling me when the train leaves and from which platform, to what you find on Wikipedia.
So there is information “about.” Then there is information “as” something; instead of being “about” something, it’s really information “as” part of the world, out there. This relates to patterns in the environment, for example, when we talk about fingerprints, DNA or the tree rings that we count to establish the age of a tree. These are all structures that we describe “as” information. Because, if you think about it, DNA is information – it’s not “about” something, it’s information in and of itself, and we treat it as such. So that’s the second concept.
And you soon realize that there’s also a third concept, which is related. This is information “for” something, and this is what we mean when we talk about, say, an algorithm, a recipe, or a sheet of music – these are all instructions. If I say, “shut the window”, that’s not “about” anything, it’s so that something happens. It’s instructional information and we can say, for the sake of simplicity, that it’s “for” something.
So, the three concepts are information “about” something, “as” something, and “for” something. And if you consider these as three dimensions, rather than three separate categories, every piece of information we encounter in the world is always located in this 3D space.
WP: In the light of what you have just said, what is a human being? Are we also “information”?
LF: At this stage of human history, we conceptualize, we understand the world, we model the world in informational terms – and that’s a good thing; it makes a lot of sense. Instead of having a Newtonian conception of a human being, like something made of mass, in terms of physics, chemistry and biology, we describe human beings in terms of networks, informational properties, privacy, and so on.
This means that, when I suggest that we are informational entities, what I’m talking about is an epistemological way of doing metaphysics, pretty much in the Kantian tradition. We understand the world today in informational terms, and therefore we look at each other as informational entities in a network. That is why some of the issues that were not there in the past, because we didn’t focus on them, are now very evident, such as the importance of privacy, online security, or education as a way of shaping the informational self. Not that they were entirely absent in the past, but now they are much more salient issues.
WP: This must raise a whole series of questions relating to ethics and information. Is this an area on which you’re working?
LF: There is a tetralogy based on the work that I’ve been doing during the past twenty years or so. I’ve published the first two volumes and I’m working on the third – the fourth is some way off in the future. The first volume, with very little imagination, is called The Philosophy of Information, and is mainly a work of epistemology and metaphysics. The second volume is called The Ethics of Information, and addresses some of the questions you’ve just posed. The third volume, which I’m working on now, is called The Politics of Information, and it looks at the philosophy of law, the philosophy of economics, and the philosophy of the social sciences, from the perspective of the philosophy of information today. So, the short answer to your question is, yes!
WP: Could you expand on some examples to illustrate your work? Perhaps we could look at these from the perspective of two themes, about which there are currently many burning ethical and political questions: privacy and surveillance on the one hand, and the latest round of the propaganda and information wars that we’re experiencing. How would you comment on these themes?
LF: You can find quite a lot of material about the first topic of privacy, for example, in my book The Ethics of Information. I argued there that there should be a move away from an old-fashioned, Newtonian, Lockean view of privacy based on the theory of ownership. It is essentially an economic model: broadly understood, this view could be expressed as “I own my data, therefore I can sell it”, for example. Instead, we should shift toward a concept of privacy based on personal identity, which means that when I say “my data”, my information is no longer “my” information, as in “my car”, but it is “my” information, as in “my hand”, “my head”, “my heart”, “my feet”. “My body” is not a body that I own; it is the body that belongs to me, that constitutes me. The same holds true for some information. It is not just mine, it is me.
So, if you start thinking from that perspective, privacy becomes, to me, clearly crucial, much more so than the ownership perspective would suggest. We talk a lot about privacy in public spaces. It is really hard to make any sense of that if you take an economic approach, as in, “this is my space – do not trespass on it”. If I’m on a train or in a public square and someone takes a picture of me then, since I’m in a public space, I can’t possibly respond by saying “you’re trespassing on my space”, can I? But shift to personal identity, and all of a sudden you’re actually digitally cloning me, and instead of trespassing it becomes kidnapping. And kidnapping is wrong even in public spaces.
WP: So, those Native Americans who didn’t allow painters or photographers to create portraits of them were right…
LF: It’s a bit of a stretch, but you have a point. Of course, in that case there is animism and other issues at play. But if you take a picture of someone, digitally or informationally speaking you are reproducing that someone in a copy and fixing it. It’s like being duplicated and “mummified” without being asked whether you want to be duplicated and have your identity fixed in a snapshot. I’m exaggerating, of course, it’s only a picture – me having an ice cream in a square, for example – but that is the line of reasoning that starts making more sense once privacy and personal identity are seen as strictly related.
On the train, when I’m being bombarded by chitchat or someone next to me is shouting into a mobile phone, that is an intrusion of privacy, understood in terms of that person putting information in my brain, information that in some small but not zero degree shapes me and that I don’t want in my brain. The ownership approach does not help, but the personal identity approach does. The information I receive is modifying my self – it’s a small point in a way but, without wishing to exaggerate, it’s the beginning of a form of brainwashing. And as I didn’t authorize that particular intrusion, it’s not so much about me being deprived of my privacy in this case, but it’s about being fed information that then becomes me. All of a sudden, I know everything about this guy, without wanting to know anything about him. Roughly speaking, passive privacy in public becomes way more intelligible. But going back to your point, privacy is one of the major, classic issues that everyone understands.
WP: What if we move on directly from here to questions of violence, power and politics and perhaps… education?
LF: Education is a form of imposition. That’s why it has to be carefully regulated. In the United Kingdom, we speak a lot about some schools being the wrong places for young children, because they may be brainwashed there. Today, it’s extremism and fundamentalism of different kinds, but in the past it was something else, e.g. some kind of intolerant nationalism. It could be the worst kind of fascist, extreme rightwing or extreme leftwing beliefs, or some kind of racist position. School is when we basically impress upon young minds a particular set of information that constitutes those minds. And once those minds have been formatted in a particular way, and I use “formatted” intentionally in almost a technical sense, it’s very hard to undo the damage. Psychology tells us a similar story: kids who grew up in violent contexts or in the presence of abusive parents, are deeply affected and may become, in turn, people who abuse children. It just seems to be commonsensical. You just have to turn the page and once you start looking at privacy and identity as two sides of the same coin, everything makes a lot of sense.
WP: It’s a fascinating topic. There are obviously big debates going on right now; recently we heard Tim Cook from Apple defending privacy as a concept.1 Do you think governments are, in principle, on the wrong side right now as regards privacy laws and surveillance, be it the government of the United States or European agencies?
LF: If they are not yet on the wrong side, they are getting there. I think they are eroding one of the foundations of liberal democracy. Collecting data first and asking questions later: that is not a policy. What worries me is that democratic governments seem to be principally concerned about liability. They feel the need to ensure that they have done everything they possibly can so that nothing goes wrong; so that, when something does go wrong (and it will), they can show that they had done everything they could to prevent this from happening. Now, this is self-serving. Basically, governments are eroding privacy in the interests of their own self-preservation – and that is wrong. Clearly, it is not done for the good of society, it’s political, with a small “p”. A mature society knows that there are risks that are unavoidable. Anyone who lives in a big town knows that it’s impossible to be 100 per cent certain of stopping a terrorist attack. Any politician who says otherwise is lying. It’s just not the case. The reasonable course of action would be to explain to everybody that, in order to preserve certain values, some sacrifices have to be made, temporarily. And inevitably, one of the sacrifices is to run some risks. Risks that we saw, for example, in Paris. Those terrorists were known to the police, and yet they still carried out the attack on Charlie Hebdo. So massive data collection clearly doesn’t work. What does work is more tolerance, more openness and enhanced levels of dialogue, not becoming entrenched in a defensive position where you establish a surveillance society and every citizen already feels pre-judged, almost guilty by default. I find that very disappointing, but I think it’s a backlash, after so many disasters have happened all over the place. Insecurity generates a sort of hyper-reaction, which tends to reinforce measures, the real gain of which is exaggerated.2
WP: Clearly, these are some of the most important topics now being discussed in western civilization. Let’s turn to people like Edward Snowden, who have exposed corruption in the political system, and yet, at the same time, we know that Snowden was basically used by Moscow on the eve of the conflict with Ukraine, and is currently a tool for another kind of threat to western civilization: which is an infowar of huge proportions. Peter Pomerantsev’s latest book Nothing is True and Everything is Possible goes to the heart of today’s infowar. I don’t believe it’s only Russian propaganda he’s describing, he’s talking about the injection of doubt and the lies that are being repeated, not with a view to making people believe these lies, but to confusing people.
LF: I completely agree and that’s one reason for playing a major card that western democracies have and other places don’t, which is trust. Citizens have to trust their governments; otherwise you undermine one of the cornerstones of society as we know it. To some extent, of course, we’re all human, none of us is perfect, but you do not expect to see what Snowden has revealed. It seems to me that the real damage was caused not by Snowden but by the policies that Snowden exposed. This amounts to the destruction of that particular cornerstone (trust); and now, we no longer believe what politicians or what governments, or more or less accountable agencies, are going to tell us about what they’re really doing.
To me, the best thing to do in order to move forward is not to avoid spying altogether, which has been going on ever since antiquity and the conflict between Sparta and Athens. After all, there are corners of the world that are pretty dark, and someone has to take care of those corners. But you have to be able to trust the people doing that, they have to be able to say “Look. I’m doing this for the public good”. And therefore there has to be transparency at some level and accountability; of course, not for me or you or every single citizen who goes there and asks questions, but there cannot be total opacity between what we’re told is going on, and what is actually being done.
Thus, for example, one way of fighting against Russian propaganda is to rebuild trust and, on that basis, provide better information, open information, things that people can actually double check. And this is our strength. If democracy is not resilient, it is nothing; whenever something goes wrong, there has to be a high level of certainty that you can actually fix it. People know what is going wrong, and as regards the NSA scandal, the disaster was that western secret services and security agencies were caught red-handed in a way that was embarrassing. Now when they say we need more data from, for example, mobile phone companies, the reaction is “No, absolutely not. You’ve already made a mess, we don’t trust you anymore”. As to how you rebuild political trust, this seems to be more a philosophical question at base. But we should be fighting for politics with a capital “P”, the Good Politics that I would like to see implemented. This means considering long-term measures and implications and working for the common good of future generations as well.
WP: Here, I would give an example from Ivan Krastev’s book In Mistrust We Trust, a book about the failure of trust in democracies. Krastev’s approach would be that democracies have an inherent mistrust in the system of law because of the amount and availability of information that is gathered, which later gets exposed and we end up seeing kings naked more often than we used to; and when this happens, we see naked politics, naked power too. But you seem to be hopeful that trust can be regained by, what, more information?
LF: It’s not about information; more information may run the risk of generating more noise. I’m sure you remember the time when there were so many mobile phone contracts available that we couldn’t possibly decide which to choose; that was a strategy, it wasn’t accidental. It was meant to be confusing. More information, all reliable, all fair, all real, all true; and yet there was complete confusion.
It’s like being lost in a forest. Every tree is real, but you don’t know where to go. I don’t think that more information is the answer. The answer is empowering society’s regulatory agencies. So, for instance, the NSA could decide that one per cent of its budget may be spent on supporting an independent agency that checks what the NSA is doing, or on supporting an independent journalistic enquiry into its activities. Of course there should be plenty of safety measures, such as journalists not being able to inform the enemy of what they’ve discovered. Some people seem to confuse transparency with everybody knowing everything, but in epistemology that’s called something else, not transparency; it is called common knowledge. Transparency enables those in charge to check what’s going on. It doesn’t need to be you or me, but someone you and I can trust as a third party, an independent agency to establish whether what the NSA or anyone else is doing is acceptable.
So imagine continuing with what we have always done in a democracy, building institutions that can check other institutions – who controls the controllers, as the Romans used to say. Well, someone has to be there to do this. Then I would be a little bit more willing to trust the system. It’s like saying, “Look. I have nothing to hide”, so the agency that has actually been put in place can check and come and see whether anything illegal has taken place. That’s building trust, through the kind of institutional checks and balances that, at present, we don’t have. Many governments seem to be in the hands of the agencies that they have empowered, but which they don’t quite control as such. At least the American administration kept saying that they didn’t know what was going on at the NSA. So we shift from lack of trust to lack of confidence.
WP: What do you think about “memory”? Memory is also a function of politics – public memory, collective remembering – where there is much interest in information. Has technology and computerized data storage, which is not a category of memory in the original sense of the word, distorted collective memory and the way we do politics? Because computerized data storage never forgets data, while the human mind easily loses track of memories; moreover, forgetting is an important function in transnational processes.
LF: This is a huge topic. I’ve been involved in some of the discussions about digital libraries, for example, where the memory topic frequently pops up. Basically, there are at least two main problems; one you have already highlighted. Digital memory is very fragile; incredibly fragile. And we are blindly moving forward by relying on a misplaced faith in the robustness of digital memory. Just think that between 95 and 99 per cent of all the data we have today were created in the last ten years. It’s like a baby boom – they will all get old together and they will “die” at roughly the same time. Also, there’s malware all over the place. It only takes one big bug to destroy a whole database, something that doesn’t happen so quickly to traditional forms of memory.
The only equivalent here is the burning of the Library of Alexandria. Then there is hacking, a form of digital vandalism. And finally, there’s obsolete technology. As you know, after a few years, your DVDs become unreadable, as do your hard disks. I certainly can’t read those big floppy discs on which I wrote my undergraduate thesis when I was in Rome. All these factors make digital memory incredibly fragile.
On top of that, like you said, this form of memory is non-discriminatory. This is the second main problem. It’s a bit like dynamite fishing; that’s not the way to fish – killing everything in the water and collecting the fish one wants. Today’s search engines have this dragnet and never discriminate. It may be early days, who knows what fancy technologies we will have in 100 years, but the point I want to make is that human memory not only forgets, but it’s able to remember without recalling (this is what I define as closure) – and that’s the difference.
So we find ourselves in a totally new situation; never before have we been confronted with the constant presence of so much memory so constantly recalled. Today, if you read the newspaper digitally, it’s basically one single endless front-page. There is no first, second and third page any more. The truth is that we are adding to one, single document as if it were a single, immense front page. As to what catches your attention, that is another question. But it’s all flat; there’s no kind of depth. Digital information does not sediment. And that is troublesome, because over the millennia, humans developed an ability to put things into perspective. We’ve lost that ability within a generation. No wonder we find it traumatic.
1. See, for instance, Matthew Panzarino, "Apple's Tim Cook delivers blistering speech on encryption, privacy", techcrunch.com, 2 June 2015, techcrunch.com/2015/06/02/apples-tim-cook-delivers-blistering-speech-on-encryption-privacy/; compare Farhad Manjoo, "What Apple's Tim Cook overlooked in his defense of privacy", The New York Times, 10 June 2015, www.nytimes.com/2015/06/11/technology/what-apples-tim-cook-overlooked-in-his-defense-of-privacy.html
2. Cf. Simon Davies, "Freedom through surveillance", Eurozine, 17 April 2015, www.eurozine.com/articles/2015-04-17-davies-en.html
Published 29 June 2015
Original in English
First published by Eurozine (English version)
Contributed by Res Publica Nowa © Luciano Floridi, Wojciech Przybylski / Res Publica Nowa / Eurozine