Commonplaces of technology critique

What is it good for? A passing fad! It makes you stupid! Today’s technology critique is tomorrow’s embarrassing error of judgement, as Kathrin Passig shows. Her suggestion: one should try to avoid repeating the most commonplace critiques, particularly in public.

In a 1969 study on colour designations in different cultures, anthropologist Brent Berlin and linguist Paul Kay described how the observed progression always followed the same sequence of stages. Cultures with only two colour concepts distinguish between “light” and “dark” shades. If the culture recognizes three colours, the third will be red. If the language differentiates further, first come green and/or yellow, then blue. All languages with six colour designations distinguish between black, white, red, green, blue and yellow. The next level is brown, then, in varying sequences, orange, pink, purple and/or grey, with light blue appearing last of all.

The reaction to technical innovations, both in the media and in our private lives, follows similarly predetermined paths. The first, entirely knee-jerk dismissal is the “What the hell is it good for?” (Argument No.1) with which IBM engineer Robert Lloyd greeted the microchip in 1968. Even practices and techniques that merely constitute a variation on the familiar – the electric typewriter as successor to the mechanical version, for instance – are met with distaste by cultural critics. Inventions like the telephone or the Internet, which open up a whole new world, have it even tougher. If cultural critics had existed at the dawn of life itself, they would have written grumpily in their magazines: “Life – what is it good for? Things were just fine before.”

Because the new throws into confusion processes that people have got used to, it is often perceived not only as useless but as a downright nuisance. The student Friedrich August Köhler wrote in 1790 after a journey on foot from Tübingen to Ulm: “[Signposts] had been put up everywhere following an edict of the local prince, but their existence proved short-lived, since they tended to be destroyed by a boisterous rabble in most places. This was most often the case in areas where the country folk live scattered about on farms, and when going on business to the next city or village more often than not come home inebriated and, knowing the way as they do, consider signposts unnecessary.”

The Parisians seem to have greeted the introduction of street lighting in 1667 under Louis XIV with a similar lack of enthusiasm. Dietmar Kammerer conjectured in the Süddeutsche Zeitung that the regular destruction of these street lamps represented a protest on the part of the citizens against the loss of their private sphere, since it seemed clear to them that here was “a measure introduced by the king to bring the streets under his control”. A simpler explanation would be that citizens tend in the main to react aggressively to unsupervised innovations in their midst. Recently, Deutsche Bahn explained that the initial vandalism of their “bikes for hire” had died down, now that locals had “grown accustomed to the sight of the bicycles”.

When it turns out that the novelty is not as useless as initially assumed, there follows the brief interregnum of Argument No.2: “Who wants it anyway?” “That’s an amazing invention,” gushed US President Rutherford B. Hayes of the telephone, “but who would ever want to use one of them?” And the film studio boss Harry M. Warner is quoted as asking in 1927, “Who the hell wants to hear actors talk?”

In the light of the actual state of affairs – i.e. that someone wants to use the telephone after all – consensus is finally reached on Argument No.3: “The only people who want this innovation are dubious or privileged minorities.” In the Nineties the Internet was claimed to be the exclusive preserve of white men between 18 and 45. Nor was there any chance of its reaching broader sections of the population, because “women are less interested in computers and tend to avoid the impersonal barrenness of the Web. In the real, non-virtual world, women are more important purchasers than men – so the Internet lacks a crucial purchasing class.” This was the view of Hanno Kühnert in Merkur in 1997, under the provocative title “If the Internet does not change, it will die”.

Leisure expert Horst Opaschowski prophesied in 1994 that “The ‘multimedia ride’ into the twenty-first century will look more like a ghost train for a few lost Nintendo and Sega kids, while the mass of consumers will continue to be ‘TV crazy’. The multimedia surge isn’t happening. The movers and shakers have left the users out of their calculations.” As far back as the early Nineties it was regularly being pointed out that terrorists and Nazis, along with pornographers and their consumers, were the principal users of the Internet.

Some years later it becomes undeniable that the novelty enjoys a certain amount of acceptance, and not only among criminals and fringe groups. But perhaps it will just go away, if we screw our eyes shut tight enough. “The horse is here to stay, but the automobile is only a novelty – a fad,” was the advice given to Henry Ford’s lawyer Horace Rackham by the president of his bank, when he asked whether he should invest in the Ford Motor Company. Charlie Chaplin took the view in 1916 that cinema was “little more than a fad”, Thomas Alva Edison announced in 1922 that “The radio craze […] will die out in time”, and Ines Uusmann, Swedish minister for transport and communication, was still hoping in 1996 that “the Internet is a fashion that might pass”. So much for – the not very long-lived – Argument No.4.

Instead of denying the innovation’s existence, one can continue for a while to dispute its repercussions (Argument No.5). “Make no mistake,” the French chief of general staff reassured parliament in 1920, “the machine gun will make absolutely no difference.” Or “The Internet won’t change politics” (tageszeitung, 2000). Most probably the invention is only a fancy gadget (Argument No.5a) with no practical consequences: “a pretty mechanical toy” as Lord Kitchener described the first tanks in 1917. In particular, the new technology is no good for making money (Argument No.5b): “[Aeroplanes] will be used in sport, but they are not to be thought of as commercial carriers,” prophesied flying pioneer Octave Chanute in 1904. Der Spiegel, under the headline “The Myth of the Web”, reported in 1996 that “Josef Schäfer, branch director in charge of multimedia at the RWE enterprise in Essen, is sceptical about the development. Multimedia is indeed ‘an interesting market, which everyone wants to be part of […] But is the customer also prepared to pay for it?’”

A variation on the charge of uselessness aimed at communication technologies is Argument No.5c, namely that their users have nothing to say to each other. “We are in a great hurry to construct a magnetic telegraph between Maine and Texas, but it is possible that Maine and Texas have nothing important to discuss,” was Henry David Thoreau’s suspicion in 1854 in Walden. The telephone and the Internet have been subjected to the same accusations. “The much-vaunted Internet is a perfect example of how the unlimited opening of IT channels leads not only to an undisputed sum of high-quality information but also to a flood of empty words,” declared the Dortmund communications scientist Claus Eurich in 1998 in The Multimedia Myth.

In 2007, the author Andrew Keen wrote in The Cult of the Amateur of “Millions upon millions of excited apes (many no more talented than our primate cousins)” who create nothing other than “endless digital forests of the mediocre”. The same year, writing in the Tagesspiegel under the headline “The Internet makes you stupid”, Henryk M. Broder conjectured that the Internet was “a leading cause of the infantilization and stultification of our society”. “If the New York Times has the same access to the public sphere as a self-help group for cannibals, then over time the public sphere will even out not at the level of the New York Times, but at that of the self-help group for cannibals.” Probably all that is at work here, ultimately, is the time-honoured fear and criticism of the masses, which seems even more inappropriate given that the conventional charges directed at mass media – spreading a homogeneous culture, dumbing down, the encouragement of passive consumption, conservatism, etc. – are pretty hard to apply to the Internet.

A little later it can no longer be denied that the novelty continues to enjoy widespread acceptance, is not about to disappear, and is even commercially successful up to a point. In principle it is quite good, then, but – Argument No.6 – not good enough. For example, it costs money and will become more and more expensive. “Anyone who uses the Internet regularly has a noticeably higher phone bill, despite the good value of the connections. Costs for individual users will continue to rise” (Kühnert). It is slow and laborious and will only get slower: “Experts fear that the problem of overloading will reach a critical point in a few years, if a solution is not found first. Until then, Internet speeds will continue to decrease perceptibly,” announced Peter Gläser in 1996 in Der Spiegel, under the title “World Wide Wait”. (Then, as with Thomas Malthus, a solution was indeed found first.)

Most of these accusations have in common that their supporters see the problems as natural and inevitable and assume that the situation will get progressively worse, although from a historical point of view there is little to support such a hypothesis. Kühnert lamented in 1996 that, “One of these [search] engines answered the request for the word ‘Internet’ with 1881 responses. After the 120th entry, I had no desire to keep clicking.” Two years later Larry Page and Sergey Brin provided a remedy in the shape of the Google search algorithm. It was no longer necessary to click through all 1.5 billion (as of October 2009) search results for the word “Internet”, merely the first few. Which did not stop Der Spiegel declaring in 2008 that “the Internet’s biggest problem is the flipside of its greatest advantage – the surplus of information. Search engines indeed supply millions of hits to every possible question and sort them in a hierarchy according to their popularity in the Web – relevance by opinion poll, so to speak. Good critical sense has not yet, however, been introduced by Google in its algorithm.” There’s always something, isn’t there?

Besides, the innovation is over-complicated and temperamental: “The bow is a simple weapon, firearms are very complicated things which get out of order in many ways,” was Colonel Sir John Smyth’s justification to the English privy council in 1591 for why switching over from bows to muskets was inadvisable. The London Times, in a lead article in 1895, considered it “extremely doubtful” that the stethoscope would ever find widespread circulation, since it was time-consuming to use and caused “a good bit of trouble”.

Finally, the innovation is not 100 per cent reliable. In his book Signposts, the folklorist Martin Scharfe has collated reports and cartoons in which a key role is played by signposts with illegible, broken or missing arrows. The same distrust of new-fangled orientation aids, and the same Schadenfreude over those who think themselves particularly clever and well-equipped and still get it wrong, expresses itself in the reports popular since the late Nineties about motorists led astray by their satellite navigation devices. Belonging in the same category is the accusation that anyone can write anything on the Internet without it being checked, which in its day was also levelled at the printed, no-longer handwritten book.

It is high time at this point to think about what the innovation is doing to the heads of children, adolescents, women, the lower classes and other easily impressionable citizens. “Those weaker than I am can’t handle it!” is Argument No.7. The then 82-year-old computer pioneer Joseph Weizenbaum declared in 2005: “Computers for children turn brains into mush.” Medical or psychological studies are hauled in to prove a fall in standards of some kind or to posit a correlation with the technology currently causing a stir. In a study of 16 000 college students, psychologist Jean Twenge of San Diego State University discovered that “young people born after 1982 are the most narcissistic generation in recent history and have no concept of social orientation. Websites like MySpace and YouTube share responsibility for this, since they allow a level of self-exposure that goes far beyond what was possible in the traditional media.”

A forefather of these misgivings is, of course, the criticism of reading. “People read not to enrich themselves with knowledge, but merely to see, they read what is true and what is false mingled together, without examination, and they do this purely out of curiosity, with no real thirst for knowledge. People read and please themselves in this comfortable, busy idleness of the mind, as if in a dream state. The waste of time this causes is far from the only disadvantage that arises from reading too much. Idleness becomes a habit and creates, as does all idleness, a relaxation of the soul’s energy,” was the warning of the second edition of the Universal Lexicon of Upbringing and Teaching in 1844. With predictable logical consistency, this dangerous “bibliomania” reappears in the 1990s in the new guise of “Internet addiction” or “online addiction”. The “relaxation of the soul’s energy” did not escape the attention of Der Spiegel either, which in August 2008 complained that, “the mania for communication on the web has created individuals who are highly strung and conspicuous in their behaviour, who want to experience more and more and know less and less”.

Alongside the training of others in the proper use of new technology, we now have the resulting questions of etiquette (Argument No.8) – which strictly speaking are not questions at all, since they are not so much asked as answered without having been asked in the first place. In the early days of the printing press it was seen as bad manners to give a printed book as a gift; until the 1980s there was a stigma of rudeness attached to typed private letters. The criticism of the use of mobile phones in public deems a conversation with an invisible partner – as opposed to one with a third party who is physically present – to be an unacceptable lack of respect for the people in one’s vicinity. Sitting in cafés with one’s laptop open is something that restaurateurs do not like to see – it gives an antisocial impression and reduces takings – yet sitting around in public with a book or an open newspaper has not caused any offence for quite some time. The unspoken thrust of these complaints is ultimately that opponents of an innovation do not want to be confronted with it without their consent.

If the new technology has to do with thinking, writing or reading, then it will most certainly change our techniques of thinking, writing or reading for the worse (Argument No.9). For critics around 1870, the postcard sounded the death-knell for the culture of letter-writing, while in February 1897 the American Newspaper Publishers Association discussed whether “typewriters lower the literary grade of work done by reporters”.

In 2002, it was possible to read in the Neue Zürcher Zeitung how variations in the typewriter’s print quality and the noises it made embodied individuality and brought to mind the dynamics in music. “That is long gone. The computer has completely smoothed out any such inconsistencies in the particularities of one’s writing. It treats all thoughts the same; the picture it presents is uniform. Every kind of dirt or violent motion, the angle of the paper, the compression of the lines, a raised C – all that has vanished. Which, as we know, tempts us into carelessness: which of us has not thought at some point that they have composed a splendid piece of text purely because everything was so clean and pretty to read? Who has not been tempted to simply just start, and then amend this or shift that, delete this and save that?”

The NZZ is a latecomer on this score: in fact these accusations against the computer had already been exhaustively treated in the Eighties and early Nineties, for instance in Dieter E. Zimmer’s The Electrification of Language. Peter Härtling explained in the Marbacher Magazin in 1994: “Experts can spot prose written by a poet on a PC by the way it is imperceptibly shaped by the fear that the computer will crash.” A study published in the journal Academic Computing at the University of Delaware in 1990 claimed that the graphical user interface of the Apple Mac caused students working on it to make more spelling mistakes, write more carelessly, and use simpler sentence structures and a more childlike vocabulary than students using PCs. More recent variations are complaints about how the presentation software PowerPoint, with its “easily digestible diagrams and morsels of text”, leads to “shallowness of thought” (Der Spiegel, 2004) and to a diminishing capacity to follow longer texts at all.

On those rare occasions when the critic recognizes the lack of originality in his objections, he argues that this time things are very different – and much worse. The American essayist Sven Birkerts wrote in 1994 that “the difference between the early modern age and the present is – to simplify drastically – that previously the body had time to absorb the new transplanted organ, whereas now we hurtle forwards at breakneck speed.” A promising argument, given that it scarcely seems realistic to expect the pace of change to slow down. Quite the opposite: “The time to adapt and learn how to use the new technologies is getting shorter and shorter. There were 3600 years separating the first known written records of mankind and the invention of the manuscript, and another 1150 from then to Gutenberg’s movable type. Since then there has been no pause for breath,” reported Der Spiegel in August 2008.

The fact that every new technology must go through these stages explains the unexpected amount of Internet criticism in the last few years. Just at the moment that criticism of the World Wide Web, created in 1994, was coming to an end, various Internet-based innovations, for instance the microblogging service Twitter (2006), were entering the first stages. “What is unclear about it”, wrote Bernd Graff in 2008 in the Süddeutsche Zeitung, “is simply why: why people bother microblogging at all; why people – as they have now started saying – ‘twitter’” (Argument No.1). The “usual pitch of web-twittering” is “monotonous, of an almost touching simplicity” (Argument No.5c). Johannes B. Kerner asked in September 2009: “Who is interested in it? I can’t imagine that an election campaign would be influenced by it. It’s complete nonsense. Utterly shallow for journalistic work” (Argument No.5).

It is true that, in the absence of any reliable surveys, there is no precise idea at the moment of who exactly uses Twitter and who doesn’t. But the fact that it is probably not a representative cross-section of the population provides grounds for criticism, such as Christian Stöcker’s (Der Spiegel, 2008): “It is clear that Twitter is used more by presidential candidates, chubby Silicon Valley nerds in their late thirties and technology journalists trying to be hip than by young people themselves” (Argument No.3).

The iPhone (which first appeared in 2007) has gone through the critical stages familiar since the introduction of the mobile phone in the Nineties – “Who needs one?” – “Not me” – “It’s only for show-offs” – and has reached the stage of “I’ve got myself one of those iPhones – but the contract’s so expensive!” (Argument No.6). We have seen, in the cases of both the mobile phone and the smartphone, how at the moment of purchase the intention to use the new device just like its forerunner seems to predominate: “We only want to be contactable on holiday – not to make calls ourselves!” we reassure the salesman unprompted; or in the case of the smartphone, “we don’t want to go on the Internet – just use the phone!” It can take a while until the innovative capacities of the device are actually used.

Currently it seems to take between ten and fifteen years until an innovation gets past the predictable criticism. Nowadays only extremely bad-tempered letter-writers accuse the text message, which has existed since 1992, of being responsible for the death of language. Still, in 2007 the news came through from Ireland – a museum of the kind of cultural criticism that has died out elsewhere – that the writing of text messages had a coarsening effect on the language of the young. The standard of 15-year-olds’ final exams, according to a study by the Irish State Examination Commission, had fallen compared to the previous year. “Mobile telephones and the increasing popularity of text messages” have a clear influence on the writing capacities of young people (Arguments 8 and 9), stated the president of the Commission in an interview with the Irish Times.

What is really remarkable about the public discontent with new technologies is how much it depends on the critic’s age and how little on the actual object of criticism. The same people who welcomed the Internet in the Nineties dismiss its continuing evolution with the very same arguments that they once cheerfully brushed off. It is easy to use and to value technologies that, when one is 25 or 30, raise one’s status and allow one to know more than others. It gets harder when, a few years later, those same benefits need to be defended against progress.

There are two approaches to overcoming this problem: in the simpler version, one can at least try to avoid using the standard points of criticism, particularly when volunteering one’s opinion in public. The objections to new technologies gathered together here are not automatically unjustified – it’s just not very likely that any valid points of criticism will be discernible among them. If each of these steps did denote a real decline, the world would be a staircase in an M.C. Escher painting.

The harder therapy is called forgetting. Lower ambitions for status improvement are not the main reason for the generational difference in fondness for the new. Adult humans simply have too many solutions to problems that no longer exist. Consequently, there is a tendency towards excessive generalization on the basis of one’s own experiences. In a Spiegel interview from 1996 with the then 35-year-old Friedrich Küppersbusch, headlined “Nobodies can become virtual somebodies”, we see revealed the whole spectrum of the “been there, done that” problem: the Internet, according to Küppersbusch, is “not much more than the reinvention of the telephone, but with pictures and data connection”; “the chattering on the Internet is no different from the CB radio of the Seventies”. Interactivity is something people have been familiar with “on television since the Seventies under the motto ‘You can call us!’ In that sense the Internet is twenty years late.” People’s attitude to consumption is far too well-developed, according to Küppersbusch back then: the Internet is “a great toy, but one that like all developed mass media only serves to atomize itself”.

Asked at the end of the interview if this is the “German intellectual technophobe” speaking, Küppersbusch admits, with rare candour, the autobiographical roots of his discomfort: “No, I’m just not interested in rushing into the same disappointments that I had at the age of 15. I thought that if I set up a school paper, 1500 kids could take part. I tried the same thing later on a youth talk show. People had all the opportunity they needed and made nothing of it. I’m not up for that kind of frustration every week.”

Anyone who insists on sticking throughout their entire life to the view they formed of the world in their youth develops the intellectual equivalent of a comb-over: what looks to oneself almost the same as before is, to everyone else, just three hairs scraped across a bald head. As long as we cannot be “flashy-thinged”, like in the film Men in Black, we have to keep setting ourselves the tedious task of forgetting, of unlearning. With any luck the government will see sense and offer educational opportunities for adults in future – where we can discard knowledge that has become a nuisance, let’s say about libraries, typewriters, publishing houses or television.

Published 16 September 2010
Original in German
Translated by Saul Lipetz
First published by Merkur 12/2009 (German version); Eurozine (English version)

Contributed by Merkur © Kathrin Passig / Merkur / Eurozine
