This morning’s developments – Trump sacking the acting Attorney General Sally Yates for ‘betrayal’ – have brought to mind one of the more frustrating episodes of my teaching career.* Some years ago I was advising a mature student, a retired commercial lawyer, on his Masters thesis; lovely bloke, good knowledge of the material, interesting ideas, but we hit a complete impasse when it came to his style of argument. He would cite a passage from a source as if its meaning were obvious, or at best assert his understanding of it and move on; or he would make a statement, with a reference to a single modern source, and then treat the matter as settled. Our meetings increasingly became variants on the same basic conversation: “Don’t you think it might be more complex than that?” “No.” “What about these other interpretations and arguments?” “I don’t agree with them.” “Don’t you think you should set out your reasons for rejecting them?” “No, I don’t see that.”
Today, with hindsight and a lot more experience, I would move rapidly to the “well, the norms of historical discourse say that you bloody well should, so you need to change your approach or you’re going to fail” gambit.** As it was, I took a more indirect tack, trying to engage with his past experience. After all, my mother had trained as a barrister, and I grew up with competing styles of argument and a certain amount of meta-commentary on the subject (that is to say, my father, who as a mathematician had different standards, characterising the legal style as represented by my mother as “The window is not broken. Even if it is broken, it wasn’t me. Even if it was me, it was an accident.”). Surely, I argued, the law is all about interpretation and argument, considering the different ways in which a given situation might be understood and the different laws that might apply, depending on how they’re interpreted? Not in business, he replied. “If the Managing Director asks me a question, he wants a straight answer about what the law says, not all this humming and hawing.”
Obviously it would be illegitimate to conclude from a single anecdote that the entire business world is like this, but it wouldn’t surprise me if it’s a fairly common attitude. The charitable interpretation is that it’s a means of limiting the number of variables for the purposes of planning and strategy: the lawyer’s job here is not to multiply uncertainties but to help establish parameters. As with any sort of model, the existence of complexity is not denied, but it’s discounted for the moment on the basis of a “good enough” assumption. The less charitable reading is that MDs, chief executives and the like are deranged tyrants who will happily decree complexity to be non-existent if it interferes with their plans, on the assumption that if things go wrong they can always blame the lawyer or other underlings. Someone who insists on reminding them that the world (or the law) may be resistant to their desires is simply asking to be fired.
Increasingly, incidentally, I suspect that the vogue for motivational quotations from figures like Socrates and Thucydides, endlessly retweeted by business-related bots, is closely related to this phenomenon: the belief that the world is governed by simple principles, and all you need to do is find the correct one that will lead you to business success. Yes, I really should find the time to write The Strong Do What They Can: The Real Art of Negotiation According to Thucydides… It was inevitable that Trump would become frustrated with institutions and individuals who dared to disagree with him or suggest that his plans might not be realisable or legal; it’s only a surprise that it’s happened so quickly and comprehensively.
Extreme cases make bad law; speculating about the undesirable consequences of certain patterns of thought on the basis of a single narcissistic sociopath would be rather rash. Still, I do wonder about the prevalence of such thinking: assuming away complexity and ambiguity can be a powerful and effective form of analysis, of course, but it’s risky if there is no recognition that complexity and ambiguity are still there even if you temporarily pretend that they aren’t – e.g. my student’s apparent belief that, because the anti-ambiguity approach works in the Real World of business, therefore historians’ obsession with issues of uncertainty and interpretation is a ridiculous bit of self-indulgence that gets in the way of actual knowledge.
The core of our task as teachers of humanities subjects at university level could be seen as helping students to learn how to think critically, to move from the rote learning they experienced at school to a proper engagement with evidence and competing interpretations. But how often do they get stuck halfway (temporarily, one hopes), recognising the need to provide evidence to support statements but thinking that their job is done when they’ve put in a reference? The idea that they need to consider different interpretations and arguments, and to test their own ideas as rigorously as possible, may be as big a step for them as the initial shift, at the beginning of their university studies, towards learning not just to regurgitate authority.
I’ve always had a certain nervousness about the idea of ‘Evidence-Based Policy’ on the same basis: what we academics mean by it is of course ‘policy based on a proper review of all the evidence’, but in practice it often becomes ‘policy for which we can cite something that fits what we were planning to do anyway’. Likewise – at the risk of this descending into absolute banality – in wider political discourse; leaving aside the tendency of authoritarian governments simply to invent their own ‘facts’, too often commentators (and everyone else) simply pick things that fit their world-view and ignore things that don’t – and are adept at dismissing contrary views as partisan.
That is probably the point. The core task of the historian (critic, philologist, whatever) is not only to read the evidence critically but to be self-critical. ‘Because: reasons’ is never enough: are they good reasons, and are my judgements reasonable? (Which cuts both ways; not just, am I too trusting of things that suit my assumptions, but also, am I too cynical about things that don’t?). This skill is essential at postgraduate level, and a student who manifestly fails to test their own arguments properly or to respond adequately to the counter-arguments and questions of the examiners is going to fail. But surely it ought to be essential for the many more students who head out into the world after their first degree, or we will be failing in our duty as teachers.
Well, at the moment I fear we are too often failing. We don’t generally have the sustained engagement with undergraduate students and their work that we do with postgrads; there’s insufficient time and resources for the detailed critique and discussion of their arguments, over time, that would properly support them in learning how to analyse and argue properly. Assessment systems test their knowledge and understanding at a specific moment – and I suspect that in many cases, even when I do comment on their approach to constructing an argument, this is not fully carried over to the next assignment, on a different topic and assessed by someone else. Of course we reward those who do engage properly with a range of evidence and interpretations, which is better than nothing; but those who take the ‘statement: reference of some sort’ approach don’t get failed for it, as it’s clearly a lot better than a load of unsupported assertions – which is to say, self-critical argument is not regarded as essential but only as desirable. Is that sufficient? I don’t think so.
What to do about this? I think it certainly requires the explicit teaching of historical theory and methodology, rather than treating this as something that will be picked up by osmosis or just by practice. It’s a focus of the feedback I offer on essays already – and I wish that more students would come to talk to me, both before and after – and next year I intend to try the approach pioneered in Exeter by my colleague Rebecca Langlands of a two-part written assignment, in which I can comment explicitly on a draft and then the final essay is assessed in part on the basis of how well the student has responded to criticism and suggestions. I’m open to further ideas…
*Yes, this is trivial and irrelevant in the face of world events, but it’s a coping strategy, and my next course in post-apocalypse survival skills (basic butchery) isn’t for another couple of months.
**Which won’t necessarily work – in this case, when I eventually became that blunt, the response was something to the effect of “well, that’s historical discourse’s problem; I think that’s silly” – but at least I’ll have covered myself professionally; the student then proceeds at his/her own risk.
Nice post. I’ll restrict my comments to your question of whether the business world generally reflects the anti-ambiguity attitude you describe. Since I’m from that world – I’m a Director in an IT services company, not an academic (trolling economic history/economics/history blogs is what I do for fun and mind-maintenance) – I may be somewhat qualified to comment.
I certainly see what you describe in business and in business leadership. Some of it is personalities (as in the case you describe), but most of it stems from the pragmatic, results = truth nature of business. Strategic decisions (in good companies) are well-reviewed, but the daily pressure is to get decisions made, act on them and be accountable for the outcomes. When puzzling problems are encountered, truth-seeking SWAT teams (see the ambiguity there?) may be formed, but “good enough” solutions may be accepted, because resources are scarce and the next issue looms. “Root cause analysis” – something often demanded by customers – is often not quite the search for truth that the moniker implies. It’s the search for a constructive answer that stakeholders on both sides can accept, a political process every bit as much as a scientific one. This business world is certainly more nuanced than the one described by your student. But it’s a world that pre-exiled Thucydides might have found familiar.
Thanks; that makes sense. My primary concern wasn’t that business may work something like this – I have no problem with the idea of different sorts of simplifying models as a basis for dealing with a complex world – so much as what happens when that sort of thinking is transferred into a different realm. It certainly doesn’t work in historical research, where complexity, ambiguity and argument are the whole point, but it also seems dangerous as an approach to politics. It’s the old “spherical cows in a vacuum” joke; abstract models are great, but to apply them to reality you need a clear sense of what the model has assumed away and how much this matters, rather than taking the view that reality should be changed to conform to the model.
Agreed. My only point was that while business leadership thinking is (usually) more complex than “because: reasons”, it does operate on a simpler level (one which, yes, often fails to circle back on simplifying assumptions) than the critical thinking that prevails in history and the social sciences.
A point I didn’t make – but you have made it by implication – is how incredibly vulnerable business thinking is to buzzwords and idea fads. “What would Thucydides say?”, I’m sure, could be parlayed into a pretty good round of strategy consulting.
(I poke fun at this tendency in my “Buzz is noise” post from last summer. Your comments on the current rage for dead Athenians could be dropped without change into that argument.)
Working in a company wholly owned by Americans, I would often remark that the business was run on a ‘spreadsheet economy’ model: it looks at numbers, both past and future.
If I made any reference to discussing potential options other than the numbers, I would be accused of over-analysing the situation.