Leila Brännström: In our societies, critical thinking is often seen as an ideal. Critical thinking and a critical perspective are, for example, seen as important skills at all levels of education. The question is, however, what critical thinking and a critical perspective mean. How is critical thinking related to the capacity to conduct critique of ideology? Ideology in the sense of conceptions which hold a hegemonic position in society, which are “engraved” into institutions, and which are upheld by people both in institutionalised contexts and in everyday life.
Anders Johansson: Unfortunately, the critical ability – both of ideology and literature – is rather limited today. There is a certain type of commentary, thinking and writing which both is perceived as and sees itself as critical thinking, without actually being either critical or thinking (literary critique in which the feelings of the reviewer are understood as the truth, academic research which merely restates an existing doxa, predictable moves in media debates which only confirm current positions).
Critical thinking is by definition dialectical: to critique an object is to split it, to show that what appears to be whole and self-identical is always compound and contradictory. But critical thinking isn’t only about seeing an object from a subjective yet critical perspective, but also vice versa: to separate the objective in oneself as well as the subjective in the object, to highlight how the one is always mediated by the other. In other words: on the one hand to realise that my preferences are not just mine, but a product of all possible more or less common circumstances and structures, and on the other hand to trust one’s subjective judgement because the “thing-in-itself” is not accessible in any other way. I would argue that this kind of thinking is currently lacking in Sweden. But it is difficult to resist the demand for simple answers, clear stances, confirmation of identities and solutions which correspond to the current order. And if you do resist the demand for simplicity, there is always someone else who delivers the preferred answers and acceptable opinion.
At the same time, critical thinking is by definition reactive, and therefore runs the risk of contributing to the sustenance of the old and undesirable which it is reacting against. Therefore it isn’t only critical thinking that’s missing, but also the capacity to move beyond negative, reactive thinking, and to stop being critical. There are contexts where this kind of reactive critical thinking is needed – for example, why are reactions to the changed terms for bonuses and increased salaries for executive elites not stronger? But there are also discourses where the critical, reactive perspective is the problem. I have in mind a certain kind of book contributing to current debates, articles and campaigns that attack various kinds of “evil”, but where the real purpose is to consolidate the moral opinions that the author and reader are presumed to share. Some reactions to the entry of the Sweden Democrats into the Swedish Parliament are open to this criticism. Critique lacks substance when it turns into a self-declaration of being “against” something that 95 per cent of the population is also against: violence, racism, etc.
Malin Rönnblom: Critical thinking and a critical perspective are crucial for me as a researcher. However, these concepts have lost their meaning. To some extent I agree with Ylva Hasselberg, who argues that the concept “critical thinking” has no function in public debate because it can be embraced by anyone and can take any meaning. Hasselberg writes that one “cannot separate different standpoints and interests in terms of their stance on critical thinking. Critical thinking is a buzzword without content”. Hasselberg’s solution is to use the term “judgement” rather than critical thinking. She argues that there is currently a struggle for the right to use the concept between various professions and various financial and political elites. The way I interpret Hasselberg, this conflict is about (maintaining) control over scientific judgement – in other words, reinstating and securing the role of the researcher as a critical examiner. However, against Hasselberg, I argue that it is important to give concepts such as “critical thinking” and “review” content and meaning, since this can counteract the tendency towards superficial concepts that Hasselberg takes issue with. And I think that one of the preconditions for researchers to be able to critically examine society is that we guard our independent “judgement” in the neo-liberal order that characterises our society, universities included.
I think that feminist research ought to be considered critical, in the sense that it originates in a questioning of the current political order and that it reflects on the political consequences of its research. Additionally, it is founded in the critique of the societal order that can be found in the women’s/feminist movement. The way I see it, a critical perspective in academia involves a reflection on the context in which we exist as researchers, on our own theoretical and analytical origins and our surroundings.
To answer the question, I think that a critical perspective involves critique of ideology, if by ideology we mean (political) ideas with hegemonic status in society. Critical review of the current political order could therefore be classified as ideological critique. However, I am not sure this is an appropriate understanding of ideology. It would mean that ideologies such as socialism and feminism would not qualify by virtue of not having hegemonic status. On the contrary, I think it is the “engraved institutions”, maintained and preserved through people’s everyday actions, which are at the centre of the critical mission of the researcher.
Sharon Rider: Commenting upon Kant’s essay, “An Answer to the Question: What is Enlightenment?” (1784), Michel Foucault suggests that we notice how Kant introduces a new question into philosophical reflection, namely, the question of the present, the now, in its contemporaneity. The question Kant is attempting to answer philosophically can be formulated as: “What is happening today? What is happening now? And what is this now which we all inhabit, and which defines the moment in which I am writing?” According to Foucault, Kant’s attempt at an answer indicates just how difficult it is to answer such a question, precisely because one is always part of one’s times, both as agent and element. Since Kant, the radicalness of philosophy as an activity has consisted precisely in this aim to problematize the present that is the condition of one’s own discourse. In other words, any philosophical critique of the present is always first and foremost a matter of self-criticism and self-awareness (implying no distinction here between the singular and the plural, the I and the we). In my view, this form of critique is deeper than what is usually called ideology critique; moreover, I would argue that the latter presupposes the former for its effectiveness. I’ll be saying more about this issue in my response to the last question, but here it should suffice to say that the crux of self-criticism consists in its direction: self-criticism is located here, in my (our) own thinking, rather than targeting some problem out there (in society, culture, politics). Self-criticism, in the sense I am using the term, is less something that you choose or do not choose to engage in or be committed to, than something that you find yourself doing, something with which you find yourself struggling.
Ideology critique assumes at the outset that one can in some sense distance oneself from the object of critique; self-criticism admits from the outset that any attempt at such distancing runs the risk of self-deception. We are, as Foucault reads Kant’s insight, always already both actor and element in our present culture, inextricably bound up with its language, ideas and ideals, and so forth. When we notice that we are discontented with our own culture, it is the discontent itself that poses the question to us: what is it in my way of thinking, in my way of life, that could or even perhaps should be different? What is it that is “given”, that we take for granted, that perhaps isn’t at all so self-evident or necessary as we assume in our everyday dealings and discourses (nor even in the intellectual life that is parasitic upon them)? Where do these received and recognized norms come from, and what purposes do they serve? What would it mean, what could it mean, to try to see them as not necessarily belonging to me? In short, the political project of ideology critique presupposes that the philosophical problem of self-critical self-awareness has been at least in part addressed, if not resolved.
Let me give an example. Nietzsche was a philologist by training and profession. In his first book, he attempted to understand the origins of western aesthetic and scientific culture on the basis of the established scholarly question concerning the development of Attic tragedy. Many years later, a scathingly self-critical Nietzsche acknowledged that he had misunderstood his own aim in writing the book, which actually had nothing to do with such academic questions, but was actually concerned with something entirely different, namely: “science itself, our science – indeed, what is the significance of all science, viewed as a symptom of life? For what – worse yet, whence – all science?” The philologist Nietzsche began doubting the ideals and norms of the very science and scholarship to which he had devoted his life, suspecting them to be symptoms of intellectual and moral vacuity, pessimism and cowardice.
Nietzsche’s suspicions regarding the real meaning and purpose of the prevailing scientific and scholarly ideals of his day were, of course, not something that could be formulated in terms of a field of research or area of scholarly expertise within philology as such. In this respect, Nietzsche simply found himself at a distance from the science and scholarship of the period. It was not a matter of choice nor a theoretical stance, but rather a problem for him. He recognized in himself as a thinker his own feelings of emptiness, his own inclination toward pessimism, his own lack of courage. Now compare this situation, in which a problem arises in one’s own thinking, with learning that there exists a certain set of viewpoints and positions regarding the objectivity, validity, possibilities and limitations of science and scholarship, through reading, for example, Dilthey and Gadamer, or Popper and Kuhn, or Derrida and Mohanty. In the latter case, one learns, inherits or adopts a theoretical position, its arguments and implications, within a certain academic tradition. One chooses a stance, and continues within and contributes to a debate whose boundaries have already been set, for reasons often long forgotten. This is an entirely different state of affairs from one in which I am genuinely concerned by something that is important to me, not only in my profession or discipline, but in my (our) very way of life. Serious criticism of our culture, of the present, begins in this kind of questioning: in scepticism, distrust and doubt, not of something “out there”, but towards something which is our own. Critique that doesn’t begin in this kind of self-doubt isn’t critique in the deepest (philosophical) sense, but simply ideology reproduction.
In policy documents concerning higher education, that is, in the bureaucratic, administrative worldview, critical thinking is conceived as something that can be trained and developed through effective and efficient planning, as a kind of technique, or, to use one of their preferred terms, “competence”. In the Bologna Process, for instance, critical thinking is described and treated as an ability that can and ought to be inculcated through mechanical training devices and technical quality assurance systems of examination and evaluation. Of course, this is entirely wrong-headed. Genuine critical thinking arises when “I can’t find my way”, to paraphrase Wittgenstein.
It seems to me that this insight is a crucial starting point for ideology critique: we have to start by admitting not that there is a problem, but that we have one: we can’t find our way. Allow me to clarify by way of another example. We humanists in the academy have a tendency to crouch in embarrassment when confronted by comparisons with our more “useful” colleagues in other faculties, or, what is worse, run around wagging our tails for politicians and policymakers, wanting so desperately to please, to show them how useful we really are. I can only see this as a sign of a low level of self-awareness and even lower self-confidence. Humanists have apparently bought into the currently popular picture of what is important and what is less so, what is valuable and what is expendable, what requires justification and what doesn’t. The value of the humanities, I would argue, lies precisely in the insights one can derive from years of study of material (literature, philosophy, art) for seeing things from another point of view; we can ask different questions than those prescribed by orthodoxy, convention, tradition and the political and financial establishment. History has shown that many of these questions have evolved into practically useful, fruitful or even foundational concepts, categories and ideas for other sciences and institutions. As Steve Fuller has pointed out, the greater part of the science, social institutions and cultural heritage of the West has arisen in the wake of something some dead humanist once said. Now, of course, most of us are no Marx, Nietzsche or Freud. But we can do our best to keep alive a tradition the essence of which can largely be described as raising new questions, arising in and out of the situation we find ourselves in.
By studying Kant or Nietzsche, not as monuments but as living testimony of a way of thinking, a way of working with problems, we can, by taking our cue from their example, learn more about how to move on when old, inherited ideas lose their vitality and relevance. But we won’t be able to do this merely by uncritically parroting their concepts or memorizing and debating the technicalities of their systems. Rather, what we can learn from them, by way of example, is how to re-appropriate, make our own, the tradition we have inherited, in order to ask and perhaps answer new questions, questions that are genuine questions, questions that “arise”, as Collingwood would have said, in thinking about and living out our lives. This is, I think, what we really mean by calling a Kant or a Nietzsche, a “great thinker”. On this view, their greatness lay not simply in the intellectual power of their thought, but at least as much in the valour to doubt, to seek real answers to real questions; to run the risk that one has perhaps posed the wrong question, or had the right question in mind, but has not succeeded in formulating it in a way that allows for an answer; to accept that one might very well find oneself forced to go back to the beginning, over and over again (this last description is pretty much identical to Husserl’s definition of philosophy in the “radical”, as opposed to merely academic, sense).
The most radical thing one can do today is to reconsider rather than repeat the standard repertoire of isms and theories, including once radical ones, concerning, for instance, the hypocrisy of western liberalism, the subjectification processes of colonialism, the fluidity of gender, the performative nature of heteronormativity, etc. I don’t mean that we ought to reject or modify these ideas. My point is almost the opposite: that the original formulations from which such notions are derived were often groundbreaking, potent, incisive, provocative and even decisive for current and future political, cultural and intellectual life. They made us think in new ways, and impelled us to call into question things that had hitherto been “self-evident”: they were liberating precisely insofar as they opened a clearing from which we could see “the given” as “given” under and by conditions which are not nearly so ineluctable as we assumed. And when we see that the conditions of our world are not as inescapable as we have been led to believe, we are empowered to think the radical thought: what shall I (we) do? But notions having to do with set concepts about, say, colonialism or gender construction have now become part of the established theoretical and intellectual discourse, rather than a way of standing it on its head. They’ve become hard cash in the marketplace of ideas.
I think that we need new resources today, if we are to have any possibility of collectively choosing another life form, one which doesn’t leave us angry, adrift, alone, abased, at cross purposes with ourselves and with each other. Speaking allegorically, if I may make a literary allusion (to Melville’s Moby Dick), Starbuck’s fair-minded common sense, honest intentions and good will simply aren’t enough to bring the Pequod and its crew back to civilization. Ahab’s vision is what gives the crew a sense of meaning in their suffering: something to strive for and the confidence that they have the capacity to act, to choose their fate, instead of being cast about by the vicissitudes of fortune. The promise of meaning through self-determination is shown to be a far stronger motivation for the men than the appeal to the Protestant values of duty and wealth offered by Starbuck. In this respect, I see much of contemporary cultural criticism and social analysis more as an expression of our way of life than a radical questioning of it. To be sure, Ahab is blinded to the point of insanity by his hatred and malevolence. But that he has such sway over the men has to do with his offering them something entirely different from what they learned from the catechism.
I would like to pose the question: must one be insane or obsessed to think for oneself? The point of critically examining our own society must ultimately be that one honestly feels ill at ease, at odds, with something about the form of life that it institutionalises and embodies. If one is seriously in doubt about something, one cannot at the same time be an ardent and outspoken proponent for the very thing that one is calling into question. Rather, one becomes doubtful as to the very grounds upon which one is standing. All talk about alternative visions for a future society must begin here, with our own failings and misgivings about who we are. I realize very well that many people, perhaps most, will find this sort of reasoning romantic, dreamy, escapist – in Marxian terms, bourgeois ideology. After all, on what basis can we enact change with self-reflection as a starting point? My answer, however unsatisfactory, is that this is precisely the question that we should be asking as radically beginning (in the philosophical sense sketched above) cultural critics. For this is the heart, the very stuff, of critique. Kant’s famous maxim in the essay cited above is a good start: Sapere aude (“dare to know”). What we require is not some sublime new theory, but the courage to make use of our reason without being led by someone else. To think for oneself in this regard means not relying on received notions and norms: these are constituent elements of the present. Polarisations, contentions and debate presume the terms and their internal relations from the outset. What we should be trying to do is to reconceive the present as best we can, beyond “the given”, i.e. the enabling assumptions upon which the polarisation derives its force and sense. In becoming aware that we have such assumptions, in bringing these to light, we can make room for something else by dissipating the appearance of necessity.
Philosophy, as the radical questioning of origins and assumptions (in contrast to a quasi-technical academic discipline) and social criticism are one and the same.
LB: The political parties have moved closer to each other and most of us cannot imagine a society which is radically different from the current growth maximising liberal democracy. Is the space for critique of ideology smaller than before?
AJ: Well, I would think it is a bit of both. It is certainly the case that the political alternatives are confusingly similar, so if one wants to speak from within the existing hegemony the critical alternatives may be fewer. But the hegemony should be criticised. One might not be taken seriously on the comment pages of the daily newspapers, but in a way that is their problem, not a problem for critique. As such, the space in media for critique of ideology is smaller, but I do at the same time think that the major restriction is self-imposed rather than external. It is of course notoriously difficult to separate internal and external restrictions. For example, a professional critic or debater must not only have the current task in mind, but also potential future commissions. When the economic conditions for critics worsen and become more insecure, there is a greater risk that their work is reduced to self-promotion. The newly founded critics’ prize “Lagercrantzen” awarded by the Swedish broadsheet Dagens Nyheter is an indication of this: the space for critique is shrinking, the economic conditions are getting worse, therefore we salute the excellence of the critic. As if the problem could be found on an individual level.
This may be a simplistic and rather gloomy picture – you can sometimes find some surprisingly well-written pieces in the comments section of papers – but it is difficult to be positive today. One might have to put one’s hopes in specialist cultural magazines (as long as they are allowed to exist), academic publications and possibly the Internet.
MR: The short answer is yes, if we by ideology mean the ideologies that the major parties originate in. The question is, however, if it is a matter of a reduction in the space for critique of ideology or the space for ideology. If the starting point for ideological criticism is the classic ideologies such as socialism, conservatism and liberalism, there are numerous indications that, in their party political versions, they have been merged into some kind of neo-liberal “meta-ideology” whose first and foremost privilege is to not have to define itself as an ideology. In other words, to cite Chantal Mouffe, the political has abandoned politics. The question is whether this meta-ideology puts different demands on a critical perspective than when ideologies are more clearly defined.
The way I see it, the political ideologies we are familiar with have had to give way to a neoliberalism that goes beyond traditional political ideologies but still has an influence on party politics. Wendy Larner discusses three meanings of neoliberalism: as policy, as ideology and as governmentality. As a consequence, critical examination of ideologies becomes rather uninteresting – because the power to define “the good society” has become part of a neoliberal governmentality, where economic models are applied to all spheres of society, as if they were superior measures of success. This governmentality is expressed in different ways, but a focal example is governing by audit. As Nikolas Rose argues, “audit is the control of control”. Constant demands for measurable targets and indicators, evaluations and self-evaluations change both the content and aim of politics – not least by depoliticising it.
It could thus be said that a neoliberal governmentality has replaced the role of the classical ideologies, with the audit assuming the role of the classical critique of ideology. This situation leads to a reduced space for critique – not only for academics and writers, but also more generally in the political discourse. The space for questioning shrinks when the political is abandoned. To work at all, critique must originate from resistance, contradiction and conflict, in other words the political. The main thing that the neoliberal governmentality is limiting is politics. When the citizen turns into a consumer and the ideal of the good society is reduced to (economic) growth, the political becomes redundant. Market regulation is apolitical by definition and can therefore not be the subject of criticism.
SR: In point of fact, we’ve never been capable of conceiving a society radically different from the one of which we are a part, any more than we can lift ourselves by our hair. What we can do, and have done, is entertain utopian visions. But these visions themselves are expressions of what we, at a given point of time and under certain conditions, can conceptualize as a utopia. The grave danger with utopian visions is that they almost always begin from a deterministic starting point, that is, with the assumption that the human subject, as individual and as a collective, cannot, and ought not attempt to, free herself from her history and/or her material conditions. In utopian dreams and schemes, the human being is conceptualized either as determined by some formal categorization or concatenation of categorizations (“class” in Marxism, “race” in Nazism, “culture” in various forms of fascism, etc.) to which she is entirely and irrevocably bound, or as free in some entirely indeterminate, abstract, and thus vacuous sense (as in radical liberal and libertarian doctrines, as well as in certain theocratic ideals). I’m inclined to agree with those who fear that such visions lead either to Treblinka or the Gulag, in the case of collectivist ideals, or to the loss of human values, in the case of liberalism.
The latter requires perhaps some explanation. Liberalism as an ideology contains within itself two at times contradictory ideas, namely, “the free agent” and “the free market”. The inherent contradiction, however, is not conceptual, but practical. It’s not that the free market in and of itself constitutes some sort of negation of individual liberty or human liberation, but rather that we have seen how well the free market can function, all too well, without human liberty. Indeed, human liberty can be an impediment to economic growth and the prosperity (for some at least) that attends it. The market seems to be at its most efficient when it’s at its most autonomous (that is to say, freed from all the political constraints of democracy). It produces more at lower costs if the individual is shackled. Marx recognized that shackles on the mind are more efficient and productive than shackles on the body, but the genius of neoliberalism is the production of a human being capable of perpetually re-creating and re-inventing his own psychic and social shackles through a new and extremely fertile conception of autonomy: self-realization through conspicuous consumption, not merely of goods and services, but also through the “free” selection of pre-fabricated self-projections.
The conservative American political commentator George Will has described contemporary capitalism as suffering from bipolar syndrome: it demands highly self-controlled workers who are at the same time out-of-control consumers. Mill’s apprehensions that liberalism’s triumph may not at all in the end lead to a better world inhabited by a better kind of man have shown themselves to be perspicacious. Liberalism as a social model always runs the risk of degenerating into its present shape, where human judgment is replaced by formal protocols and regulations, where human language is reduced to mass-produced thoughts and commercial and political slogans, where human care and concern are debased into cheap made-for-TV feelings and sentimentality. On the other hand, liberal democracy opens up, at least potentially, a questioning of its own conditions. This is where its strength and intrinsic value lie, if only we are capable of taking that liberty – and I do mean taking it. It’s not something that is given to us.
LB: Are the material conditions for critique of ideology especially bad in our times? We might discuss funding opportunities for projects which are not primarily impact-oriented, or the space for longer, more explorative texts in newspapers, or the organisational and operational structures of political parties.
AJ: When it comes to academic research it is obvious that the research with clear impact and use is favoured. This may not be new, but when the decisions about funding grants are moved from the faculties at universities to the central organisation for research grants, the Swedish Research Council, and bibliometric assessments (which are based on whether the researcher has been published in Anglo-Saxon journals) are more extensively used, this tendency is amplified. I have recently been startled by the need to defend critical thinking even within the humanities. It is as if they are so keen to follow the political directives that they turn conveniently uncritical before anyone has the chance to force them. When they themselves dismiss critical thinking with arguments such as “it’s irrelevant” or “it’s so negative” there is undoubtedly reason for concern. All of a sudden, the Deleuzian critique – “critical thinking is reactionary by definition” – seems misguided or too sophisticated. Academic research is only reasonable and thoughtful if it is founded in critical thinking. I can’t imagine the humanities having any authority if they do not engage in critical thinking.
MR: I would rather say that the limited space to publish problematizing texts is not only about material conditions, but also what is considered “good” or “viable” knowledge. In an article from 2008, Carol Bacchi argues that the previous relative distance between research and politics is shrinking, as illustrated by the concept “user-driven research”. She contends that researchers are increasingly rewarded for research prioritised by the political establishment, and sees a shift in the role of the researcher – a shift from researching the right questions to getting the right results.
Another element central to this shift is the limited opportunity for researchers to critically examine the formulations of policy problems and, consequently, what solutions are possible. Instead, it is assumed that researchers are to accept the political problems the state formulates and find solutions, thus doing more traditional policy research. Instead of examining how ideologies, values and interests permeate policy processes, researchers are forced to take them for granted. As a consequence, it is politics that determines what “evidence” counts.
Bacchi also argues that the opportunity for researchers to question their new position is limited, especially because of their/our dependency on external funding – and in Sweden this primarily means funding from the state. An audit by the Swedish National Audit Office showed that more and more research grants are left unutilised at the universities. The reasons for this are, for example, that grants reach fewer researchers, investments are made in large collaborative projects and research contexts, and the fact that permanent positions are often linked to external funding. I would argue that this development could also be understood from the perspective of the neoliberal governmentality that dominates even the academic world, where “excellence” and “world-leading research” are assessed by the number of peer-reviewed publications and the sum of previous research grants. “Those who have much will receive more” seems an adequate description. When the administrative responsibilities for researchers increase, the time for critique decreases. Researchers simply do not have time to be critical. As Bronwyn Davies and her colleagues put it: “The talk that informs critique and the development of a counter-discourse takes time – time that no-one any longer has.”
SR: It is entirely possible to write serious texts about serious matters, even in the vein of ideology critique. The web is filled with all sorts of critical analysis, some of which is quite good. In this respect, the conditions for open critical discussion have never been better. For academic work, each and every one of us must ask herself the question: do I have something on my mind that I need to discuss with others who have the same concerns and questions, or do I want academic recognition (research funding, a professor’s title, citations, and so forth)? If it’s the latter, then yes, it is probably more difficult than ever “to hold one’s own” in the competition. But is that the point and purpose of our supposedly “critical” activity – to beat the competition? That would mean simply buying into the current system of norms and values by which the quality and worth of our thinking is assessed. But such “quality assurance” systems are themselves often a kind of reification of the internal sense and purpose of scholarly work. Academics today seem more willing than ever to accommodate themselves to whatever’s in demand: they manufacture impressive quantities of often short, highly specialized articles instead of comprehensive and penetrating book-length studies on problems that take years to survey and master. I find it striking that even “Foucault specialists” see to it that they publish in English in internationally recognized (i.e. Anglo-American) journals with a high Journal Impact Factor (JIF), one of the most important instruments used to evaluate (and steer) publicly funded research. And naturally, from the perspective of success in competition, they can hardly do otherwise. All else would be failure: in the eyes of the funding agencies, in the eyes of their colleagues and, ultimately, in their own eyes.
They have internalized the system and become “academics” in exactly the sense required by the system, which they reproduce in their own research and teaching, even if this amounts to what Habermas would call a “performative contradiction”. For if the point is to achieve a kind of Verfremdungseffekt – that is, to create a distance from the norms, rules and values that form the conditions of one’s own activity by recognizing their contingency and the definite functions that they serve, which was explicitly one of Foucault’s aims – then one might expect the scholar to exhibit a freer attitude, one that manifests that recognition and the freedom it allows.
When we recognize the specificity of these systems, that is, that they do not reflect some neutral objective necessity, well, then we aren’t really sure about what we should expect of ourselves, are we? If we don’t know in advance of our own reflection what we should do, well, then we can do something other than what comes “naturally”, what we know is expected of us (by others). To the extent that the systems become so effective and efficient that there is no longer room for such reflection, i.e. hesitation and doubt, within the confines of academic scholarship and teaching, then in effect there is no longer any room for critical thinking. Perhaps this is already the case; perhaps the university’s role as the guardian and guarantor of unfettered, critical reflection (insofar as such a thing is possible even as an ideal) has become obsolete. But to accept such a conclusion need not mean bowing to this system of thought. It merely means admitting that perhaps we’ll have to look around the corner for other possibilities and venues.
To summarize my verbose response somewhat more neatly, my answer is that the necessary material conditions exist, but they are hedged by an intricate and intellectually crippling system of “quality control”. The economic issues are not, however, the heart of the problem. The most serious challenge to critical thinking today is not lack of money, but lack of time. Time, like liberty, is not something given, but something you take. You need to take the time necessary to think through a problem, to formulate your own question and find an adequate answer. But taking the time to think has its price, both on the personal and the professional level. You might risk losing the comfort of belonging to a community with shared implicit values; perhaps you won’t be counted as being “representative” of your discipline or area of research, but be regarded as something of an outsider; you might even decide that you don’t want to represent it. The difficult question is whether or not you’re prepared to pay that price; but that is not a political, economic or academic issue. It’s a moral one.
LB: The classical critique of ideology aims to expose certain representations, belief systems and convictions that guide our lives despite being false. Ideology is false consciousness, and the critique of ideology, which originates in the possibility of an objective description of the world, confronts ideology by showing how things really are. In the immanent critique of ideology, as in, for example, the Habermasian interpretation, accepted and recognised ideals and promises are contrasted with a reality that fails to live up to them. The gap between normative benchmarks and real conditions is exposed. In a more utopian critique of ideology (queer theory, possibly), current conditions are contrasted with a vision of a different future. Such critiques rest on the possibility of a better future and set it against the limitations, deficiencies and plainness of the contemporary. What are the points of departure for critique in political debate, aesthetic activity and intellectual conversation today? What does critique intend to expose? How do critics envision critique being useful? Or is usefulness not the primary purpose?
AJ: I’m not sure about the portrayal of critique – if we limit ourselves to art and literary critique – as an auxiliary discipline. Even if all critique is secondary to the object it criticises, it is simultaneously a genre of its own – its raison d’être is just as complex as that of the arts. I would argue that literary critique, in the widest possible sense, is today potentially more interesting and more challenging than the fiction it relates to. The latter is often so deeply and unreflectively immersed in its own autonomy, so preoccupied with fulfilling the demands (from colleagues, publishers, press etc.) of its own biosphere, and so saturated with the false liberal individualism that permeates society, that it lacks critical power. The value of critique lies not in portraying some kind of utopia. I believe, rather, that it is within critique itself – the theorising and thinking – that an alternative can be found. Critical thinking that sees its mission as giving answers, visions and utopian promises risks losing its critical edge.
MR: I think it’s difficult to assess the foundations of the critique that we see today, whether in political debate, aesthetic activity or intellectual discourse. What I see, especially in artistic expressions, are different attempts to challenge established orders by presenting alternative interpretations of apparently self-evident problems – a way of exposing norms and preconceptions which act as “engraved institutions”. Here I am thinking of, for example, the exhibition Lost and Found – Queerying the Archive, shown in Copenhagen and Umeå in 2010, or the work of Elisabeth Ohlson Wallin. These are artistic expressions that “force” the observer to reflect on things taken for granted. This kind of critique is of course evident in intellectual discourse as well, but I think that, as researchers, we could learn from the critical perspectives of art and culture. For me, as a researcher, critique is about examination and questioning, not primarily about presenting useful (political) alternatives. But this does not mean that ideology represents the normative whilst critique represents the real. I find that approach problematic because it assumes a kind of ontology that separates the researcher from the research. Nor would I agree that, for example, queer theory contrasts “the way things are” with a different possible future. The way I see it, queer theory is about questioning whether it is even possible to say things in a certain way. Maybe the usefulness of critique lies rather in exposing the “engraved institutions” so that they become visible and, as a consequence, can be changed.
SR: The contentious question of utility is a destructive and divisive one, especially in the humanities and social sciences. The most dangerous aspect of it is that it encourages us to neglect or even reject what I take to be the primary function of the study of the humanities: to ask non-standard questions. To engage in critique (which is not the same thing as having a negative attitude) is to attempt to examine underlying assumptions and validity claims, to the best of one’s ability, honestly, rigorously and with no holds barred. But this critical task must encompass not merely the standard answers we get from politicians, policy-makers, administrators and industry, but also one’s own ideas and assumptions about what constitutes an intelligible, pertinent, important or reasonable question to ask. In my view, too much scholarship in the humanities and social sciences today uncritically accepts current assumptions about what sort of questions to ask, how the questions should be formulated and what sort of answers one should expect. Strikingly, even (or perhaps especially) scholarship of a utopian kind is thesis-driven, accepting at the outset the terms of the discourse under discussion.
My second reason for resisting the notion of “useful” scholarship in the humanities is related to my first. I want to ask: “Useful to whom?” “Usefulness” usually goes hand in hand with adaptation to current ideals, which are in turn tied to certain determinate interests and aims. The most common “use” of humanities scholarship seems to be as a bludgeon in political contexts: “Research has shown that X”. I would, in contrast, like to see more scholarship whose value is not dependent upon some pre-determined use to which it will be put, by those with an interest in using it for just that purpose. I don’t mean by this that social scientists should stop conducting, say, quality of life research about people with disabilities – on the contrary. But if it becomes universally accepted that research and teaching in this field is and should be steered by values, expectations and norms extrinsic to science and scholarship as such, and these extrinsic expectations and norms by necessity lead to certain kinds of results, regardless of the intrinsic norms of science and scholarship, then we really have emptied the idea of science and scholarship of all meaning, except for the institutional. Here I am inclined to invoke Weber’s distinction between value-neutral and value-free science. For the scientist/scholar as a human being, the latter is an unachievable, and perhaps even undesirable, goal. The former, however, is, if only as an ideal or regulative principle, an absolutely necessary condition for science and scholarship.
To return to the first question, one could say that the best way to question a dominant ideology is not to advocate its opposite, but to describe – thoroughly, perspicaciously and in detail – the object of critique. In other words, instead of proposing a counter-ideology, one shows how the representations articulated in it arise through a study of: i) the assumed vocabulary; ii) how this vocabulary de facto forms our way of talking, seeing, thinking, living and acting; and iii) how this picture emerges through language use, and in what ways the representational techniques propel thought in a certain direction.
I would argue that this is a fair description of the manner in which Marx and Foucault, but also Freud and Nietzsche, conducted their investigations. In a sense, I would call all of them “philosophers of language”. Notice that in all these cases, one sees oneself in the attitudes, patterns of thought and forms of speech that are subjected to critique. We find them compelling because we suspect that we too have unwittingly assimilated certain ways of thinking – digested them whole without first feeling them, tasting them, smelling them – and we wonder what it is that we’ve swallowed. In revulsion, we ask ourselves: Why did I consume that? What did it look like? And just who is it that served that stuff? Was it served together with something else? Did everyone swallow it? And what is it made of? What are the main ingredients? A sober and careful attempt to answer such questions provides us not so much with an accurate explanation as with a deeper understanding of the enabling conditions for our present form of life, of why we think and act as we do. It seems to me that this kind of self-knowledge or self-reflection is far more “useful”, for ourselves and for others, than a utopian recipe for another repast or refection, which I myself and those who share my tastes and inclinations might find appealing.
LB: Many people who conduct and have conducted critique of ideology have implicitly assumed that when there is a gap between deception and reality, between ideals and reality, or between reality and visions of the future, the exposure itself provides a motive for action. Peter Sloterdijk argues in his Critique of Cynical Reason that the false consciousness in our society is “enlightened”, by which he means that we can see the deception of the ideologies that we are surrounded by. But out of convenience we choose to act as if we took them seriously. According to Sloterdijk, we constantly act in this way, despite knowing better. Or as Slavoj Zizek puts it: “they know very well what they are doing, but still, they are doing it”. If Sloterdijk’s diagnosis is accurate, how can the critique of ideology meet the challenge posed by cynical reason? What methods and tools are used, and can be used, for critique to lead to a willingness to change? Would affirmative rather than evaluative forms of critique mobilize more?
AJ: I am not sure that critical thinking is necessarily about influencing people’s ideas in order, ultimately, to change their behaviour. Such change is not an undesirable consequence, but the causation may not be so clear. Zizek’s (and Sloterdijk’s) observation is probably perfectly accurate in that respect: critique of ideology does not have that power, because rationality isn’t one-dimensional.
I think there is something authoritarian and uncritical in that kind of traditional critique of ideology. We do not need more intellectuals who claim to have discovered what is false and who inform the unenlightened about how things really are. If thinking is to be critical, it has to be more critical of itself than that. Once more, I think that the liberating potential lies in the thinking itself, not in some kind of supposedly liberating message. The task – if “mobilisation” is even the correct term to use – is to force or tempt the reader to think, rather than thinking for him or her. At the same time, it must be said that we need different types of critics and intellectuals, more forms and temperaments: both pedagogical and uncompromisingly philosophically advanced, both provocative and attentive, both short and long, both historicising and politicising. Straightforward answers and assertive messages can, in some contexts – such as the media – be justified.
MR: I think it is important to conceive of critique at two different levels. My task as a researcher is to make ideologies accessible to people, particularly by pointing out their different consequences for different kinds of people. The exposure becomes essential. Critique need not be either affirmative or judging – if by the latter we mean that critique is to determine whether something is “good” or “bad” – but it can be scrutinising. Researchers are often criticised, particularly by journalists, for not taking a clear stance and for “hiding” behind alternative interpretations. I argue that it is a task in itself to clarify the meaning, and the political consequences, of different alternatives in order to support people in their political decisions. I also think it’s important to relate critique to people’s everyday lives. Critique has to affect people; it has to speak to you individually. Critique has to be felt, be acute and personal.
LB: Is there a kind of critique – within a certain area, or of certain things – that you think is missing in political and intellectual life?
AJ: It is tempting to call for more self-criticism in many areas, particularly among those groups who enjoy a privileged position of interpretation: economists, terrorism experts, media advisers, politicians, celebrities, etc. This is, of course, a utopian hope. But it might suggest that other actors should criticise these agents in a way that they themselves fail to do. Take the Swedish foreign minister Carl Bildt: somehow he has acquired such an aura of wisdom that no-one seriously questions his conduct. He is often asked to comment on the current state of the world, and journalists and commentators take his opinions as facts.
There is, at the same time, a great need for critics to be more self-critical and reflective. A critic who does not question their own position, their language, their theoretical premises, their ideals, their emotions, their network, their upbringing, and so on, is not as critical as they should be.
MR: I am particularly concerned by the lack of critique of what I have referred to above as the neoliberal meta-ideology and its consequences across society. Here I mean neoliberalism as governmentality, rather than as a policy or a more “pure” ideology. I am concerned by how society is governed by “audit” and how we constantly subject ourselves to it. Furthermore, I take issue with the fact that politics has become all about packaging. Party leaders have become brands rather than political visionaries, and it is difficult to find the political amid the show and the image. And what happened to critique? To the critical, intellectual discourse that challenges the establishment and at the same time forces the establishment to question its own perspectives?
I do, however, find it difficult to answer this question, and this frightens me a bit. I ask myself: Have I become used to the lack of critique? But surely there must be a critical discourse? I have the latest issue of Bang in front of me. It is the anniversary edition: 20 years of a critical magazine. So of course the discourse exists. And maybe it has to be situated at the margins. But I would like to see more of it in the mainstream media. And in mainstream research. I think that journalists have, to a greater extent than researchers, surrendered to the neoliberal “higher ideology” and abandoned their scrutinising mission. Tomas Lappalainen wrote in Dagens Nyheter, in the context of Italy’s 150th anniversary as a nation state, that in Swedish media Italy is portrayed as subject to a media dictatorship ruled by Berlusconi: “This is an inaccurate description. Anyone who lived in Italy for a while would actually rather feel oppressed by Swedish public conformity. In Italy, the debate presents such opposed views that, according to Swedish sensibility, it would seem seriously dangerous […]. Large newspapers, such as La Repubblica, publish extremely intelligent attacks on the government basically every day.”
Where is the fourth estate when we need it most? Where is the media’s examination of what established politics means for different groups in society? What we see are reports of political statements and inadequate analyses of the consequences of political decisions. Where is, for example, the thorough critique of how the Sweden Democrats’ entry into the Swedish parliament affects the political establishment and the possibilities of politicising conflicts in society? What we are offered are descriptions of the behaviour of individual members of parliament, not a critical examination of what the homogenisation of the political establishment really means. Critique is a necessary condition for a working democracy. It is everyone’s responsibility, but particularly the responsibility of the media.
SR: Just about everything that I’ve said thus far can be read as a plea for more self-criticism and self-examination. The Delphic command to “know thyself” has haunted philosophy since time immemorial, perhaps because of just how difficult it is to follow. The great philosophical projects of the past all began there. Socrates doubts and tests his own capacity for wisdom; Kant examines the conditions for valid knowledge claims, including those of philosophy; Nietzsche asks himself if the scientist’s and scholar’s will to truth and the accepted academic standards by which it is to be attained are really what they purport to be. The main problem with the notion of “critical thinking” on the whole is that, like ideology critique, it tends to be directed there, at that, instead of here, at us. There doesn’t seem to be any room for genuine confrontation with the deepest problems, namely, our own starting points. By way of illustration, let’s take a rather banal example. It has become common in the humanities in recent years to talk about our “research” with a certain insouciance, as if this were the core of humanist thinking. In certain disciplines, such as archaeology, there’s something to that. But to talk about philosophical “research” in the same way as research in, say, limnology or cardiology, is to gloss over fundamental differences. By this, I don’t mean first and foremost differences in “method” (“qualitative” vs. “quantitative”, for instance), or differences in the nature of the “object” (say, “brute facts” vs. “social facts”), but something much more profound, namely, the meaning of the activity in question. I would argue that the best humanist thinking is just that: “thinking” as opposed to “research” in the contemporary, bureaucratic sense. The best work of a Nietzsche or a Marx (or a Freud or a Weber, for that matter) grew out of a research context, to be sure.
But their greatness was not “excellence” as “researchers”, to use one of the emptiest of today’s popular conceptual placeholders. My point here is that this slide into talking about “humanist research” is an expression of an almost sacrosanct dogma of our day, namely, that there is no value in the kind of serious, thoughtful and studious doubt, hesitation, scepticism and concern that I have suggested is so badly needed. What I am calling for is, one might say, a kind of historicized transcendental philosophy, in which we as thinkers examine the conditions for our own activity of thought.