Thinking critically about ‘critical thinking’
Reader advisory: this is a long post . . .
‘We must never’, Bismarck is said to have warned, ‘look into the origins of laws or sausages’. Sage advice, I’ve always thought (and no pun intended with that ‘sage’)—but how much at odds it is with the dominant current of modern thought, which is to say Enlightenment thought. Immanuel Kant, a great hero of the Enlightenment, summed up the alternative to Bismarck’s counsel when, in an essay called ‘What is Enlightenment?’, he offered ‘Sapere Aude’, ‘Dare to know!’, as a motto for the movement. Enlightened man, Kant thought, was the first real adult: the first to realize his potential as an autonomous being—a being, as the etymology of the word implies, who ‘gives the law to himself’. As Kant stressed, this was a moral as well as an intellectual achievement, since it involved courage as much as insight: courage to put aside convention, tradition, and superstition (how the three tended to coalesce for Enlightened thinkers!) in order to rely for guidance on the dictates of reason alone.
Bismarck’s observation cautions reticence about certain matters; it implies that about some things it is better not to inquire too closely. What Walter Bagehot said about the British monarchy—‘We must not let in daylight upon magic’—has, from this point of view, a more general application. The legend ‘Here be monsters’ that one sees on certain antique maps applies also to certain precincts of the map of our moral universe. Enlightened man, by contrast, is above all a creature who looks into things: he wants to ‘get to the bottom’ of controversies, to dispel mysteries, to see what makes things ‘tick’, to understand the mechanics of everything from law to sausages, from love to society. Who has the better advice, Bismarck or Kant?
Of course, it is not a simple choice. For one thing, it might be argued that Kant’s own attitude toward the imperative ‘Dare to Know!’ was complex. In a famous passage toward the beginning of The Critique of Pure Reason, for example, Kant tells us that he was setting limits to reason in order to make room for faith. Exactly what Kant meant by this . . . what to call it? this admission? this boast? this concession? Well, whatever Kant meant by his invocation of faith, it has been an abiding matter of debate. Nevertheless, it is fair to say that Kant’s ‘critical philosophy’ is itself a monument of Enlightenment thought, as much in its implied commendation of the ‘critical attitude’ as in the ‘Copernican revolution’ he sought to bring about in philosophy.
Today, we can hardly go to the toilet without being urged to cultivate ‘critical thinking’. Which does not mean, I hasten to add, that we are a society of Kantians. Nevertheless, what we are dealing with here is an educational watchword, not to say a cliché, that has roots in some of the Enlightenment values that Kant espoused. It’s a voracious, quick-growing hybrid. A search for the phrase ‘critical thinking’ using the Google search engine brings up 86,100,000 references in 0.31 seconds. The first match, God help us, is to something called ‘The Critical Thinking Community’, whose goal is ‘to promote essential change in education and society through the cultivation of fair-minded critical thinking’. (Why is it, I wonder, that the conjunction of the phrase ‘critical thinking’ with the word ‘community’ is so reliably productive of nausea?)
Everywhere you look, in fact, you will find the virtues of ‘critical thinking’ extolled: Colleges and universities claim to be stuffed with the thing, and even high schools—even, mirabile dictu, primary schools—brag about instilling the principles of ‘critical thinking’ in their charges. There’s ‘critical thinking’ for bankers, for accountants, for cooks, gardeners, haberdashers, and even advanced toddlers. A few summers ago, my wife and I took our then-5-year-old son to an orientation meeting for parents considering sending their children to a local kindergarten. School officials enthusiastically told us about how they would bring the principles of critical thinking to Sally’s coloring book and little Johnnie’s sport. Absolutely everyone is enjoined to scrutinize his presuppositions, reject conventional thinking, and above all, to be original and/or ‘creative’. (Ponder, if your stomach is strong enough, a ‘Creative Critical Thinking Community’.)
To some extent, we owe the infestation of ‘critical thinking’ to that great twentieth-century movement to empty minds while at the same time inflating the sense of self-importance, or, to give it its usual name, Progressive Education. It was John Dewey, thank you very much, who told us that ‘education as such has no aims’, warned about ‘the vice of externally imposed ends’, urged upon his readers the notion that ‘an individual can only live in the present’. (The present, Dewey said, ‘is what life is in leaving the past behind it’, i.e., a nunc stans of perfect ignorance.) So much for the Arnoldian ideal of a liberal arts education as involving the disinterested pursuit of the ‘best that has been thought and said’.
The first thing to notice about the vogue for ‘critical thinking’ is that it tends to foster not criticism but what one wit called ‘criticismism’: the ‘ism’ or ideology of being critical, which, like most isms, turns out to be a parody or betrayal of the very thing it claims to champion. In this sense, ‘critical thinking’ is an attitude guaranteed to instill querulous dissatisfaction, which is to say ingratitude, on the one hand, and frivolousness, on the other. Its principal effect, as the philosopher David Stove observed, has been ‘to fortify millions of ignorant graduates and undergraduates in the belief, to which they are already only too firmly wedded by other causes, that the adversary posture is all, and that intellectual life consists in “directionless quibble”’.
The phrase ‘directionless quibble’ is from Jacques Barzun’s The House of Intellect, and a fine book it is, too, not least in its appreciation of the ways in which unanchored intellect can be ‘a life-darkening institution’. I suggest, however, that the phrase ‘directionless quibble’ is not entirely accurate, since the habit of quibble cultivated by ‘critical thinking’ does have a direction, namely against the status quo. The belief, as Stove puts it, ‘that the adversary posture is all’ is at the center of ‘critical thinking’. Lionel Trilling spoke in this context of ‘the adversary culture’ of the intellectuals. It was not so long ago that I received word of a long article in Teachers College Record, the journal of Teachers College, Columbia University, which describes itself as ‘the voice of scholarship in education’. The featured article is a 30,000-word behemoth by a professor of ‘inquiry and philosophy’ called ‘Ocularcentrism, Phonocentrism and the Counter Enlightenment Problematic: Clarifying Contested Terrain in our Schools of Education’. I am too charitable to subject you to a sample of its almost comically reader-proof prose, but it is worth pausing to note that such work is absolutely typical in the academic establishment today. It really is ‘the voice of scholarship’, or what’s become of scholarship.
Prejudice against prejudice
How we got here makes for a long story. I’d like to dip into a few chapters of that story and then speculate briefly about what an alternative might look like.
It seems obvious that ‘critical thinking’ (I employ the quotation marks because the activity in question is neither critical nor, in any robust sense of the word, thinking) is a descendant or re-enactment of the Enlightenment imperative ‘Dare to Know!’ In this sense, it is a precursor or adjunct of that ‘hermeneutics of suspicion’ that the French philosopher Paul Ricoeur invoked when discussing the intellectual and moral demolition carried out by thinkers like Darwin, Marx, Freud, and Nietzsche. It would be hard to exaggerate the corrosive nature of these assaults. Often, indeed, what we encounter is less a hermeneutics of suspicion than a hermeneutics of contempt. The contempt expresses itself partly in a repudiation of the customary, the conventional, the habitual, partly in the cult of innovation and originality. Think, for example, of John Stuart Mill’s famous plea on behalf of moral, social, and intellectual ‘experiments in living’. Part of what makes that phrase so obnoxious is Mill’s effort to dignify his project of moral revolution with the prestige of science—as if, for example, his creepy relationship with the married Harriet Taylor was somehow equivalent to Michael Faraday’s experiments with electromagnetism. You see the same thing at work today when young hedonists in search of oblivion explain that they are ‘experimenting’ with drugs.
It is worth pausing over Mill’s brief on behalf of innovation. You’ve heard it a thousand times. But familiarity should not blind us to its fatuous malevolence. Throughout history, Mill argues, the authors of such innovations have been objects of ridicule, persecution, and oppression; they have been ignored, silenced, exiled, imprisoned, even killed. But (Mill continues) we owe every step of progress, intellectual as well as moral, to the daring of innovators. ‘Without them’, he writes, ‘human life would become a stagnant pool. Not only is it they who introduce good things which did not before exist; it is they who keep the life in those which already exist’. Ergo, innovators—‘developed human beings’ is one phrase Mill uses for such paragons—should not merely be tolerated but positively encouraged as beacons of future improvement.
David Stove called this the ‘They All Laughed at Christopher Columbus’ argument. In a penetrating essay in his book On Enlightenment, Stove noted that ‘the Columbus argument’ (as he called it for short) ‘has swept the world’.
With every day that has passed since Mill published it, it has been more influential than it was the day before. In the intellectual and moral dissolution of the West in the twentieth century, every step has depended on conservatives being disarmed, at some critical point, by the Columbus argument; by revolutionaries claiming that any resistance made to them is only another instance of that undeserved hostility which beneficial innovators have so regularly met with in the past.
The amazing thing about the success of the Columbus argument is that it depends on premises that are so obviously faulty. Indeed, a moment’s reflection reveals that the Columbus argument is undermined by a downright glaring weakness. Granted that every change for the better has depended on someone embarking on a new departure: well, so too has every change for the worse. And surely, Stove writes, there have been at least as many proposed innovations which ‘were or would have been for the worse as ones which were or would have been for the better’. Which means that we have at least as much reason to discourage innovators as to encourage them, especially when their innovations bear on things as immensely complex as the organization of society. As Lord Falkland admonished, ‘when it is not necessary to change, it is necessary not to change’.
Article printed from Roger’s Rules: http://pjmedia.com/rogerkimball
URL to article: http://pjmedia.com/rogerkimball/2013/1/11/thinking-critically-about-critical-thinking