Here we go again… The return of lockdown brings some very familiar feelings: relief that what seemed like pessimism in early December (stocking up on cat food and soya milk in anticipation of possible Brexit disruption, deciding to stick with entirely electronic reading lists although students were asking about hard copies of stuff in the library) has left me in a better position than I might otherwise have been; frustration and uncertainty about how to modify teaching plans again. This term should have been easier (and maybe still will be), as we’ve all got better at the different elements of online learning, not least by working out which ones aren’t worth bothering with. However, someone somewhere was obviously feeling optimistic at a critical moment, and so we’re currently scheduled to have less recorded and asynchronous stuff and more face-to-face time in any given module – although the latter will now be online for most if not all of the term. A more precautionary approach would have been to assume that we’d be lucky if we could just carry on in the way we have been, but no…
You could label this an unforced error, insofar as it reflects (and reinforces) the general perception that asynchronous learning is by definition inferior, rather than being the result of direct government incompetence and wishful thinking. The good news is that it’s relatively easy to fix, probably just by me recording more material than planned on top of the extra ‘class’ time. But it does feel like a repeated pattern: not just the tendency towards optimism about things returning to ‘normal’ as soon as possible, but also the particular (and not always intuitive) definitions of what ‘normal’ is. The paradigm remains the A-level fiasco back in the summer, where until far too late it was simply taken for granted that, whatever else happened, results must be made to conform to the patterns of previous years; ‘normal’ is students from the best schools getting their expected grades while other students know their place and accept it. Similarly, many of the issues facing universities stem from the almost unquestioned assumption that the normal pattern of mass student migrations in September, December, January etc. will naturally take place, regardless of how different things may be in between. Yes, of course there are also all the financial imperatives to maintain this; that just emphasises how far priorities may be skewed…
It occurred to me this morning that one of the issues here is a repeated confusion about the identification of what is ‘essential’ and ‘necessary’ (and therefore should be a priority) and what is ancillary or superfluous – driven, most of the time, by a lack of any sort of analysis, instead just assuming that it’s obvious. Take the idea that all but essential shops should close. Shops are precisely the sorts of places where infection is likely to spread (strangers mingling indoors); but people still need to be able to buy essential supplies; QED. But one potential consequence is that more people then visit the shops that remain open, increasing infection risk; what you really want is to discourage non-essential shopping while keeping as many shops as possible open, so as to space out the people who do need to buy things.
Further: how do you define an ‘essential’ shop, or commodity? Intuitively, it’s obvious – hence those stories in the first lockdown about over-zealous supermarkets cordoning off some aisles. But, like the distinction between ‘luxuries’ and ‘staples’ – a major part of my life’s work in ancient economic history has been trying to banish such terminology completely, as utterly unhelpful – it tends to fall apart the moment you look at it in any detail. Flour: clearly an essential – except, how many people today actually depend on baking their own bread (and that’s before we get to the question of whether organic spelt flour is more or less essential than plain white Home Pride…)? Chocolate: clearly not essential – except for psychological well-being and a little comfort. Bookshops: clearly not essential – unless you have children at home and would prefer them not to be staring at screens all the time. And so forth. You can make a case that most things may seem essential, or at least more than a superfluous luxury, for some people at some point – and the correct response is not to condemn them for their decadence and lack of moral fibre (it is striking how far the advent of any sort of crisis brings out a latent Puritanism, or an atavistic reversion to an imagined Spirit of Rationing and the Blitz) but to think a bit harder about what you’re actually hoping to achieve by closing ‘non-essential’ shops. I assume that it wasn’t, consciously, just to boost the profits of Amazon and others who can still supply people with the things they need.
There was a brief period, back in the first lockdown, when there seemed to be some recognition of the problems with taken-for-granted ideas of the ‘essential’: when it was noted that actually delivery drivers, shop and warehouse workers and the like were just as vital for keeping society on the rails as the much-clapped NHS. It didn’t last; the idea that we might re-think some of our assumptions and priorities as a result of the pandemic faded rapidly, not least because the government was so determined to get things ‘back to normal’. Same procedure as last year? Same procedure as every year. So there must be A-level results so children can be sorted, and they must be similar to results in previous years regardless of all other circumstances; and students must migrate around the country for the standard university experience, even though they won’t actually be getting anything like that, and they must all return home for Christmas, because Christmas, and so forth.
While the luxury/staple distinction is not helpful, except insofar as it helps us interrogate some assumptions, there is a useful idea within that discourse: substitutability. How easily can one thing be replaced with another, without a significant loss of utility and/or additional expense? How adequate is the substitute in different respects? We can agree for the sake of argument (and before I get yelled at from Oxford as usual) that face-to-face-in-person (f2fip) teaching is easily the best, especially for more advanced seminars – but is it completely irreplaceable (even if increasingly adulterated with masks, distancing, 75%+ of the students being online etc.)? Or is there a point where you can reasonably say: this isn’t butter, but in the absence of any butter it will do, and in some respects it’s actually better for you? The problem… one of the problems… among the many problems with teaching this year has been the dogged insistence that only a single specific form of interaction with students is properly acceptable, regardless of circumstances – so when circumstances have forced a change regardless, everyone is primed to assume that the substitute is by definition inferior and inadequate. I maintain that it isn’t half bad, and it could have been better if we’d been able to devote more time and energy to developing it on its own merits.
But there are things that cannot be so easily substituted. Is sitting in a room all day on your own with occasional (mostly passive) online interactions, or at best seeing the same small group of people twenty-four seven, an adequate substitute for the usual ‘university experience’? Manifestly not. As Jim Dickinson at WonkHE has chronicled throughout the year, we’re faced with a weird double-think: students must be brought back to campus because that’s what university is all about, but then denied any sort of social life – massive restrictions on what student unions and societies were allowed to organise – because that’s not what university is all about. If f2fip teaching were genuinely non-substitutable then this might be an acceptable trade-off, but since we can do good-enough teaching online at a distance, that rationale falls away. I don’t think I’m just being naive in thinking that this is not only about the money (though clearly any more radical approach would require rent refunds, government support to universities etc.); it’s about the pretence of (or desperate longing for) a return to something resembling ‘normality’ in some respects, crossed with assumptions about the ‘essential’ and ‘non-essential’ aspects of a university education, all of which tends to fall apart if you examine it carefully, but that doesn’t stop it being powerful.
And, realistically, things are unlikely to improve. We’ll muddle through this term, with or without a return to limited f2fip classes in March (my money is on ‘nope’, but even if we do I doubt that many students will actually return to campus then), all predicated on the assumption that this is a temporary thing and we’ll be back to normal in September. And by the time it becomes clear that this won’t happen – that the roll-out of vaccination is slower than currently promised, or that it has less of an impact than hoped – it’ll be too late to make significant changes, so we’ll have to muddle through next year as well. From a teaching point of view, that’s manageable, as we’ve already done it. What’s worrying is that we are nowhere near developing decent alternatives to the default idea of student experience and social life, as this year it’s simply been dumped in the deep freeze. And of course there is no realistic alternative to the current model of the UK university, because Too Big And Complicated, plus sunk costs.
Happy New Year! Meet the new crisis, same as the old crisis…
I think your dislike of trying to distinguish between what is essential and what is non-essential is well founded. One way of looking at that is the consumer sovereignty value judgement. Under that judgement, each person is assumed to be the best judge of what they prefer: is one apple better than two oranges, and so on? For example, under consumer sovereignty, if my view is that, for me, candy floss is an everyday necessity, that is taken as being what matters. Of course, putting it that way, using candy floss as an example, strains the consumer sovereignty value judgement somewhat; but if one rejects consumer sovereignty, what can one put in its place? If my views on candy floss need correcting, that creates a need to find someone better equipped than me to make such value judgements. Then, if I disagree, the judgement needs to be enforced to be effective. That creates a need to appoint a know-better elite to tell us what we should be wanting. But if I am too ignorant to truly know whether I should like candy floss, how can I be allowed to vote? So rejecting consumer sovereignty would seem to lead to rejecting democracy. Hence, consumer sovereignty can be defended by noting the difficulty of replacing it with something better (like Churchill’s defence of democracy).
Under consumer sovereignty, I can of course decide for myself that my judgements on candy floss may be problematic and purchase the advice of a nutrition adviser (I have bought several books on nutrition). It is the coercive enforcement of that advice that is contrary to consumer sovereignty.
Thanks for this. There’s been extensive discussion of the consequences and implications of treating students like consumers – often marked by government annoyance at their refusal to make the decisions that they ‘ought’ to be making… In the pandemic situation, both in universities and more generally, I guess the issue is that it’s generally accepted that either consumer sovereignty has to be limited or the range of choices available has to be reduced, for the collective good, but there is significant disagreement about the basis on which to make these decisions.