On education

When Scythrop grew up, he was sent, as usual, to a public school, where a little learning was painfully beaten into him, and from thence to the university, where it was carefully taken out of him.

Thomas Love Peacock, Nightmare Abbey

All human societies produce children in much the same way; education, in the broadest sense of the term, is how those children are turned into functioning, well-integrated adults, and this process is far more variable. I am going to focus here on the way we do things in our society, particularly on education in the UK, as that is what I know best, but I want to remind you that formal schooling is by no means the only way to educate children, and that even among us a lot of education happens outside the schoolroom.

It appears completely normal to us, for example, that children should be taken from their families to a special place – a school – where they can be taught by trained professionals. Some schools even provide accommodation for the children, so that they can have a completely immersive experience. Since this is routinely offered by the most expensive private schools, presumably this is considered a benefit. We shall return to the question of who the beneficiary might be.

Schools play many roles in modern industrial society. One function that became conspicuous by its absence when they were obliged to close during the pandemic is that of day-care for children. The days are long gone when a single wage-earner was enough to keep the average family afloat; both partners must work, and therefore something needs to be done with the children. Given the high cost of childcare, schools fill a clear economic need for many families.

By extension, schools have tended to become providers of social services to children. In the UK, children from poor families are given free meals at school, and a reluctant government has been pressured into continuing this during school closures after a campaign led by the footballer Marcus Rashford. Increasingly, UK schools are referring children to social services; this is logical, since it is in schools that children are mainly available for official supervision and inspection.

This is all well and good, but it is tangential to schools’ declared purpose of educating the children in their care. What, then, are the aims of this education, and (a crucial question in a manager-led culture) how can its success or failure be measured? These questions have been with us as long as formal education itself, and they have become increasingly urgent since compulsory schooling was introduced in the nineteenth century.

Broadly speaking there are two approaches to schooling, that of Plato and that of Mr Gradgrind. The Platonists hark back to the word’s Latin root educere, “to draw out”. As in the famous passage in Plato’s dialogue Meno (82b–85b), they see the teacher’s role as that of a sympathetic guide, drawing out the child’s innate abilities, and school as a facility that enables this by providing a structured environment along with books, experimental apparatus and so forth. In this picture, education is a pleasant experience for all involved, pursued for its own sake. John Henry Newman’s The Idea of a University is a classic exposition of this ideal; Montessori schools embody it to this day.

The Gradgrind tendency, by contrast, has a more starkly industrial approach. They are manufacturing a product for the marketplace; nothing more, nothing less. The aim of education is to drill the child into a knowledge of useful facts (and, less obviously, a set of useful habits). A successfully educated child can regurgitate these facts on command. This can readily be measured and quantified. Whether or not this is pleasant for either pupil or teacher is only material to the extent that sugaring the pill can make the medicine go down more easily. The point is to get it down the child one way or another.

I think it is probably fair to say that most teachers would prefer to be Plato, and most Secretaries of State for Education would prefer them to be Mr Gradgrind. This tension runs through our entire education system.

Why is education compulsory in the UK? Originally it was justified as a way of preventing parents from putting their children to work, and it is fair to say that a child would have been better off even in Mr Gradgrind’s classroom than up a chimney. That is less of an issue nowadays, however. Wouldn’t most parents want their children to go to school, if it were the Plato model? Indeed, wouldn’t most children want to go there?

Something that emerges clearly from the contrast between the two approaches is that in the first it is the child who is the primary beneficiary, and in the second it is – someone else.

All societies have rules, and well-socialised adults follow them, at least most of the time. An important function of education is to impart knowledge of those rules to the next generation. This is a benefit both to the child, who will know how to fit in, and to the society as a whole. In industrial society, those rules are designed to prepare the child for its future as an employee, as I’ve argued in a previous post.

What the child is not prepared for, however, is its future as a citizen. By this I mean someone who understands their part in (what is supposed to be) a democratic society, and is able to fulfil it. This requires at least three things:

  • The ability to think critically about the utterances of politicians, and in general about the endless blizzard of messages intended to persuade us to buy X or believe Y.
  • Knowing how to do research into the facts of the case, so that one can accept or reject such statements on a sounder basis than mere prejudice. One could characterise this more generally as knowing how to learn.
  • Being able to participate in a discussion in which the aim is to establish the truth rather than to score points; perhaps even being open to having one’s mind changed.

I am not arguing that all children should be made to read classical philosophers in the original, although it would be no bad thing if more of them did. I merely wish them to be given the tools with which to engage in the political process. Perhaps more people would do so if they had them; it might even improve voter turnout.

Now one does not have to be a cynic to imagine that there are some influential people who would find this scenario uncomfortable. Tragic though this may be, I still suggest it would be of benefit to society as a whole. Incidentally, those skills will be useful to anyone who needs to adapt to a new situation and find new solutions. Today’s children are certainly going to find themselves in that category.

Through no fault of their own, however, the bulk of the adult population, certainly in the UK and I suspect in most industrial countries, has not been provided with these tools. Even the graduates of our finest universities are deficient in them, if the example of our current Prime Minister is anything to go by. Most people, I think, have enough sense to be aware that they are being sold a pup; trust in the mainstream media seems to be declining, and I suspect that trust in online newsfeeds will follow the same trend, if it hasn’t already. The question is, how are people to fill the resulting vacuum?

The collapse of political discourse is evident – compare the Lincoln-Douglas debates with anything said during the last few US presidential elections – and unsurprising. In the words of former US President and all-round sage George W Bush: “You can fool some of the people all the time, and those are the ones you want to concentrate on.” Education should be aiming to minimise the number of such people. It is failing to do so.

Ironically, the UK education system is also failing in Mr Gradgrind’s terms, according to the very employers who are its real customers. Institutional reform appears unlikely, although I expect plenty of tinkering. Homeschooling was increasing in popularity even before the pandemic, but it isn’t for everyone; as I pointed out above, few families can afford the time.

But education is more than schooling. Personally, I learned at least as much out of school as in it. I was lucky enough to be brought up by parents who believed in the value of education; they came from that tradition in the working class that gave rise to things like the Workers’ Educational Association. When I came to read Plato, I was reminded of the discussions that used to take place at home. Even though we weren’t well-off, there were always books.

It was the formal educational system that gave me the bits of paper which have made me (more or less) acceptable to employers over the years. That’s not, however, the same as an education. I know which has the more lasting value. Nor did my education cease when I graduated. Yours needn’t either – but then you’re reading this blog, so you probably already knew that.

Comments are welcome, but I pre-moderate them to make sure they comply with the house rules.

On time, its uses and abuses

Time is an illusion. Lunchtime doubly so.

Douglas Adams, The Hitchhiker’s Guide to the Galaxy

Only the Maya can claim to have been half as obsessed with time as our industrial civilisation is, and even they only went as far as devising a calendar so accurate it is said to be used by NASA. (This may, however, be one of those “well-known facts”: I was unable to find any confirmation of it on NASA’s website.) We have clocks and timers everywhere. If you don’t believe me, try counting the various timepieces to be found around your home, including those on or about your person.

Timekeeping devices of one sort or another have a long history, but the mechanical clock dates back to the fourteenth century and was largely the result of too many monks having too much time on their hands. Monks were interested in observing the canonical hours, the timetable of prayer which was (and is) at the centre of monastic life. These are really no more than conventions, standardised in such codes as the Rule of St Benedict, with no particular reason to think that God was too concerned about the exact timing. Because clocks were expensive to make in pre-industrial times, they became status symbols as much as anything.

One development is, however, significant. Prior to the advent of the mechanical clock, not all hours were of the same length. In non-tropical latitudes, this makes complete sense. The Romans, for example, divided the day into twelve equal parts and the night likewise, but how long each twelfth lasted depended on the season. A daylight hour would be appreciably longer in summer than in winter.

If you are making a machine to measure time, however, your life will be much simpler if all the hours are the same length. Simpler for the clock-maker, perhaps, but not so much for its users, for whom the motion of the sun in the sky tends to diverge from the time on the dial. The resulting inconvenience eventually led to the unlovely kludge that is daylight saving time; some idea of the technical difficulties in which this has involved us can be gleaned from the late Erik Naggum’s fine essay “The Long, Painful History of Time”.
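To make the kludge concrete, here is a minimal sketch in Python – using only the standard library’s zoneinfo module; the date and the zone are my own illustrative choices, not anything from Naggum’s essay – showing how daylight saving time makes even “one day later” slippery:

    from datetime import datetime, timedelta, timezone
    from zoneinfo import ZoneInfo  # standard library since Python 3.9

    tz = ZoneInfo("America/New_York")

    # Noon on the day before the 2021 spring-forward transition.
    before = datetime(2021, 3, 13, 12, 0, tzinfo=tz)

    # "One day later" by wall-clock arithmetic: still noon on the dial.
    after = before + timedelta(days=1)
    print(after)  # 2021-03-14 12:00:00-04:00

    # But only 23 real hours have elapsed, as converting to UTC reveals.
    print(after.astimezone(timezone.utc) - before.astimezone(timezone.utc))
    # prints 23:00:00

The dial says a full day has passed; the universe disagrees by an hour, and every programmer who handles timestamps has to care about the difference.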

This trend towards the standardisation of time was, like so many trends, accelerated by the Industrial Revolution. If one takes noon to be the point during the day when the sun is at its height, this time will be found to be slightly different in London than it is in, say, Manchester. This fact was never an issue until relatively high-speed transport became available, in the form of the railway. You could set your watch by the station clock when you set out, and it would disagree with the clock at the other end. The need to standardise all these different local times led to the introduction of “railway time” as far back as 1840.

Another fruit of the Industrial Revolution was the introduction of the time-clock. This invention is attributed to an American with the splendid name of Willard Legrand Bundy, and it was the logical outcome of employers’ need to monitor and control their workers. Modern companies such as Amazon have taken this to quite extreme levels, because they can: technically because the capability now exists, but mostly because we collectively let them. After all, they create jobs, and jobs are good, right?

I’ll be discussing education in a future post, but suffice it to say that one of the main things our education system is designed to do is to foster an awareness of, if not a fetishistic worship of, clock time. This is because the industrial system demands that its workers be good time-keepers, to the point where bad time-keeping is considered a legitimate reason for dismissal. There are of course good practical reasons for this, as I have pointed out elsewhere, but I think there’s more to it.

People outside industrial culture are not in general that bothered about time. They don’t need to be. This is not because indigenous people are intrinsically lackadaisical, despite the frequent complaints of their colonisers. Even the Romans were comparatively relaxed about it, and they were nothing if not businesslike. It may not be a coincidence that we, who are inside industrial culture, saturate ourselves so thoroughly in time; we wallow in clocks, we adore them, to the point that even tenths or hundredths of a second can loom like epochs in our lives.

Why should this be? It seems to me that our worship of clock-time has much in common with our worship of money. Both are abstractions, fictions even, to which we attribute quasi-mystical properties – that they are (in principle, at any rate) infinite, eternal, and the same for all observers – even though they don’t in fact possess them. We sometimes go so far as to equate them, even though this is manifestly absurd.

Benjamin Franklin’s oft-repeated adage that time is money has become a truism without actually being true. (Try buying a hamburger with twenty-five minutes.) What it does manage to do, with admirable succinctness, is to sum up a certain cast of mind which is favoured by our present economic arrangements. For the one situation in which we have to pretend that time and money are interchangeable is paid employment. Payment – what the Americans with refreshing frankness call compensation – is always expressed in terms of money for time. So much an hour, or a week, or a year. We rent ourselves out, and this is considered completely normal.

But empty featureless Newtonian time, rolling smoothly onwards and the same for everyone everywhere, is no longer considered the best model even by physicists. Those of us who are mortal – and that includes the two of us, dear reader – must reckon with a finite extent of the stuff. We may not know how much we have, but it certainly isn’t eternity. There are no overdrafts available.

One fundamental way in which money differs from time is that one can both give and receive money – indeed, that’s pretty much the whole point of it – but one can only give time. Once you have lived a minute of your life, you will never have it again. Time cannot be refunded. Exchanging it for money is therefore not a decision to be taken lightly.

That’s pretty much the deal that industrial society offers us, though. Now it will be objected that everyone has to invest some of their time in obtaining the necessaries of life (well, apart from trustafarians). But there is more flexibility than most people suppose in both the quality and the quantity of that investment. As the old joke runs, very few people on their death-beds wish they had spent more time at the office.

In his provocative essay “On the Phenomenon of Bullshit Jobs,” which subsequently became the basis for a book, the late David Graeber argues that for many people the time they exchange for the necessaries of life is time completely wasted. “Huge swathes of people, in Europe and North America in particular, spend their entire working lives performing tasks they secretly believe do not really need to be performed. The moral and spiritual damage that comes from this situation is profound. It is a scar across our collective soul. Yet virtually no one talks about it.”

The bald equation of time with money conceals all this, of course. Nor is this an accident. To my way of thinking, it makes no real sense for the employee to think like this; but it makes much more sense from the point of view of the employer. All they are investing, after all, is money, and money which they expect to recoup. From their perspective, it’s the same as buying any other commodity. You won’t find many employers who put it as starkly as that, because even those who are sociopathic enough to find it acceptable are also usually smart enough to realise that it isn’t a great look for recruitment.

A clock will tell you a number, really, and nothing more. Time is a far richer notion than that. There are the seasons of the year, and the seasons of our own lives too. There are geological cycles that are unimaginably long to us, and the generations of microbes that are startlingly short. A year means one thing to an oak tree and something else to a mayfly. Ours is not the only perspective.

Time is not a number, still less a unit of currency. No single instant of it is quite like another. It may not seem as if every moment is precious when you’re having root canal work done, or sitting through another pointless meeting, but once it’s gone it’s gone. You probably spent around seven and a half minutes reading this far; I hope you found it worthwhile.

Comments are welcome, but I pre-moderate them to make sure they comply with the house rules.

On water

Water belongs to us all. Nature did not make the sun one person’s property, nor air, nor water, cool and clear.

Ovid, Metamorphoses, Book IV (tr. Michael Simpson)

[For World Water Day, 22nd March 2021]

Without water, there would be no life. It’s true that certain creatures – the rotifers spring to mind – can survive in a desiccated state, but they need to rehydrate in order to do anything that looks like living. Life appears to have started in the oceans, at least on this planet. This is why exobiologists get so excited when they find evidence for water elsewhere: it’s not a sufficient condition for life, but as far as we know it is a necessary one.

Water covers two-thirds of our world, and with climate-driven sea-level rise it will soon be covering even more. Although that water is salt and not of much direct use to us, the hydrological cycle is Nature’s own desalination plant, delivering fresh water in the form of rain. It’s hard to imagine that something which literally falls out of the sky could be a scarce resource, at least in those parts of the world which receive regular rainfall. Human ingenuity, however, has found a way.

Consider Egypt. It used to be called the gift of the Nile, because it has almost no rainfall and used to depend entirely on the annual Nile flood for irrigation. Because the flood also spread fertilising alluvium on the land, there was no build-up of soil salinity, which was the bane of Mesopotamian agriculture. A lot of Egypt was and is desert, but the fertile strip along the river – the black land, as it was called – was productive enough to feed the country and also grow a surplus for export. Ancient Rome was largely fed by grain from Egypt.

Then Progress came to the land of the Pharaohs, in the form of the Aswan High Dam. Part of the idea of this immense engineering project was to regulate the Nile floods, which could vary from year to year. This sounds like a good idea, but in practice it has tended to increase soil salinity and coastal erosion. The alluvium which used to wash down over the fields is now building up in the reservoir behind the dam, reducing its capacity; water is also lost from the reservoir by evaporation.

Egyptian farmers are now resorting to the Nubian Sandstone Aquifer to irrigate their thirsty crops of cotton and potatoes. Unlike the Nile, this is not a renewable resource but fossil water that was deposited underground in ages past. Like other fossil resources, when it’s gone it’s gone.

It isn’t just Egypt that relies on fossil water. About 30% of the water used for irrigation in the United States is drawn from the Ogallala Aquifer. There are already concerns that it is being depleted rapidly, certainly at a rate far in excess of the amount that rainfall can make good. Farmers across the Great Plains are obliged to sink their wells ever deeper. At the same time, water quality is also under threat from pollution due to fracking as well as from agricultural sources such as fertiliser runoff. If you ever saw the 2010 documentary Gasland, you will doubtless remember seeing people who live near fracking wells setting fire to the fluid that is supposed to be their drinking water.

In California’s Central Valley, farmers have been so successful at extracting groundwater that they have caused subsidence on a massive scale. Apart from the damage done to infrastructure such as bridges, this has permanently reduced the capacity of the aquifers to hold water in the future. Again, the response to this has been to drill more and deeper. What could possibly go wrong?

I don’t want to make it sound as if farmers are the villains here. It’s true that industrial farming practices are often wasteful of water; see Gabe Brown’s Dirt to Soil (Chelsea Green, 2018) for an account – from a farmer! – of some of the reasons why this is so and what can be done about it. Industrial meat production is a particularly egregious example of turning what ought to be a resource – dung – into a pollutant. But farmers have been pushed in an unsustainable direction by forces outside their control.

(Incidentally, in the interests of balance I should point out that livestock farming is far less prodigal of water than is often supposed, at least when done properly. In his review of Simon Fairlie’s book Meat: A Benign Extravagance (Hyden House, 2013), the environmental journalist George Monbiot admits his mistake on this point, in a heartwarming display of intellectual honesty which is all too rare in his profession.)

It’s true also that excessive irrigation and the resulting build-up of salt in the soil transformed the fertile lands of Mesopotamia into the desert we now call Iraq. “Forests precede us and deserts dog our heels,” in Derrick Jensen’s grim aphorism. But nobody ever set out to do this. It was an unintended consequence of actions which seemed reasonable at the time in the face of an immediate need. This pattern is by no means rare in human history.

Industry itself has long been cavalier with water, especially in its willingness to contaminate local waterways. The Cuyahoga River, which caught fire repeatedly, is the poster child for this – again, as with the fracking pollution I mentioned above, something that purports to be water proves to be flammable – but of course the problem goes back far longer. The town of Walsall, long associated with the leather industry, ended up hiding its river in a culvert because of the noxious level of pollution (tanning produces lots of nasties). The central point of the town is still known as the Bridge, but if you go there today you will see no sign of either bridge or river.

Fresh clean water has thus become a scarce resource, and as such a fertile source of conflict. There are many places in the world dependent on rivers whose headwaters are in foreign territory, and conflicts over water are the inevitable result. By constructing dams it is possible to extract so much water from a river that it no longer reaches the sea – the once-mighty Colorado River is a prime example. In the Middle East, which was already well-supplied with tinder, similar threats to the Tigris, Euphrates and Jordan rivers may well spark violence. Even the Aswan High Dam may find itself trumped by the construction upstream of the Grand Ethiopian Renaissance Dam.

Now let’s throw climate change into the mix. With extreme weather events of all kinds becoming more common and more violent everywhere, flooding is a major issue. (In the UK at least, this is not helped by the moronic practice of building in flood plains against all advice.) Industrially farmed soil tends to be slow to absorb rainfall and prone to erosion – again, I refer the interested reader to the work of Gabe Brown – which has obvious consequences for agriculture.

On the flip side of the coin, we need to be more resilient to drought. In addition to the obvious point that water shortages are bad news for anything that grows, we can also expect more and scarier wildfires and (once again) more topsoil erosion. The Dust Bowl is a dramatic example of what can happen, and will again if we carry on as we are.

So: what is to be done?

Many of our water-related problems are already baked into the cake. Contaminated groundwater is going to stay contaminated. Depleted fossil aquifers will take millennia to replenish themselves. We can try to adopt farming practices aimed at improving the soil, along the lines being developed and practised by Brown and others. This is good common sense anyway: even Michael Gove, a man who often seems to have only a tenuous grasp on reality, said some sensible things on this subject when he was briefly Environment Secretary in the UK.

As individuals, we can do a good deal to reduce our water usage. Reusing grey water will economise further. We can also harvest rainwater (now legal in all 50 US states, albeit subject to regulation in places) which is perfectly fine for many things. If you have a garden, get the soil in the best condition possible – good gardening practice in any case, but soil in good heart can both absorb rainwater when the heavens open and retain it through dry weather. If where you are is liable to flooding, ensure your drainage is good.

We can also make a difference through our purchasing decisions. After all, we’re always being told that the consumer is king, so maybe we should start acting that way. Do your research; be aware of “virtual water” and cut it out of your purchases as far as you can. Here in the UK, for instance, we import a lot of water in the form of lettuce and tomatoes from semi-arid parts of southern Spain and, as mentioned before, cotton and potatoes from Egypt. Personally, I’m not especially comfortable doing that. Your mileage may, of course, vary.

Water does, as Ovid says, belong to us all, and as such it has become subject to the tragedy of the commons. (Nestlé’s behaviour during the recent California drought shows what can happen.) The good news is that this is at least a problem that can be solved (see, for example, the work of Elinor Ostrom in this area) and indeed has been solved in many times and places. In the Alpujarras region of Spain the system of acequias introduced by the Moors in the eighth century, which includes an elaborate apparatus for resolving disputes, is still in operation.

Individually and collectively, we must all learn to share the gifts we have been given. As Gandhi said: “Earth provides enough to satisfy every man’s needs, but not every man’s greed.” Water is a thing we all need and for which there is no substitute. It’s about time we started to treat it, and one another, with some respect.

Comments are welcome, but I pre-moderate them to make sure they comply with the house rules.

On quantification

Counting is the religion of this generation. It is its hope and its salvation.

Gertrude Stein, Everybody’s Autobiography

Industrial civilisation is in love with numbers. We have numbers on pretty much everything that can be quantified, and some things that arguably can’t. When someone wishes to assert incontrovertibly that X is the case, the magic words are: “Studies show…”. And how do the studies purport to show that X is true? Statistically, that’s how.

Like so many things that appear to us to be immutable truths carved into the primordial ground of cosmic being, this tendency has a history and is the result of human choices. We first find the appeal to statistics in the eighteenth century, originally in connection with matters relating to governing a state (hence the name), such as population and economic activity. The following century saw the formation of the Royal Statistical Society in London, and the use of statistics by political and other campaigners to advance their causes; one famous example is Florence Nightingale, the Society’s first female member, who popularised the polar area diagram – a close cousin of the pie-chart – as a means of visualising data and thereby presenting it in an accessible form.

It is not, I suspect, a coincidence that this parallels the development of the Industrial Revolution, whose consequences I discussed in a previous post. It is a natural part of the standardising tendency which gave us the Whitworth screw thread, SATs and the Big Mac. We want the world to be made up of things which are all the same.

This is obviously a prerequisite for counting things. If you have a pencil, three bananas and a bicycle, you don’t really have five of anything. If, on the other hand, you can say that one lump of pig-iron is much the same as another, then knowing you have five of those is a useful thing. Of course, it’s not actually true to say that all lumps of pig-iron are strictly equivalent – things like the carbon content will vary from one to another – but you may not need to worry about that if all you want to know is how many tons your foundry produced this week.

Then again, if you care about the quality of your pig-iron, knowing that you have five lumps of it is not especially useful. So the meaning of a statistic depends on what is being counted and why. Who is doing the counting may also be significant; human motivation is rarely pure.

Consider official unemployment figures. No government wants these to be any higher than they have to be, and it is much easier and quicker to fix the numbers than it is to fix the economy. Thus the definition of unemployment becomes remarkably fluid, in order to leave out as many people as possible. In the UK, for example, you are only counted if you are both out of work and claiming a government benefit. Unsurprisingly this is coupled with a benefits system of Byzantine complexity which has all the hallmarks of having been designed to deter applicants.
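The point is easy to make concrete. In the toy sketch below (the population, the numbers and both definitions are invented for illustration; they are nobody’s official methodology), the same five people yield two very different headline figures depending on who is allowed to count:

    # Invented sample population: (name, out_of_work, claiming_benefit)
    people = [
        ("A", True, True),    # out of work and claiming
        ("B", True, False),   # out of work but not claiming
        ("C", True, False),   # likewise
        ("D", False, False),  # employed
        ("E", False, False),  # employed
    ]

    # A broad definition counts everyone who is out of work...
    broad = sum(1 for name, out, claiming in people if out)
    # ...while a narrow "official" one counts only those also claiming.
    narrow = sum(1 for name, out, claiming in people if out and claiming)

    print(f"out of work: {broad} of {len(people)}")             # 3 of 5
    print(f"officially unemployed: {narrow} of {len(people)}")  # 1 of 5

Same people, same economy; the number has simply been fixed.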

But in any case, is counting people really the same sort of thing as counting pig-iron? You may (or may not) be familiar with King David’s census of the Israelites (2 Samuel 24, if you want to look it up). The king soon came to regret it: “And David’s heart smote him after that he had numbered the people. And David said unto the LORD, I have sinned greatly in that I have done: and now, I beseech thee, O LORD, take away the iniquity of thy servant; for I have done very foolishly.”

The biblical text doesn’t make clear why this was such a bad idea, apart from the punishments that follow, but there’s a case to be made that this is a reaction against the kind of state bureaucracy that flourished in both Mesopotamia and Egypt in this period. Here I am following the account of state-formation in James C. Scott’s Against the Grain (Yale University Press, 2017), and while I’m not going to attempt a summary here, the relevant point is that ancient bean-counters started off counting measures of barley and ended by counting pretty much everything.

Now there will be some variability even between measures of barley, but it seems intuitively clear that human beings are individuals – indeed, we even use the word individual to refer to a person. Moreover, they have relationships with one another. What does it mean to treat another human being as a countable unit, like a measure of barley or a lump of pig-iron? Surely it is not the way most of us would want to be thought of. It is a denial of one’s basic humanity.

But when one is dealing with large numbers of people – more, let’s say, than Dunbar’s number – it is inevitable that one has to resort to this approach. It’s the only practical way of keeping on top of things, and early states were all about keeping on top of things. This appears to have been why writing was invented, and why so much of the vast corpus of cuneiform texts is so dull.

Nowadays, of course, we have Big Data. This is largely a result of technological advances; scribes inscribing clay tablets can only record a limited amount. Thanks to the miracles of Progress, we are now able to collect, store, and analyse stupendous amounts of data, most of it about us. And because we can, we do. (The scribes of Third Dynasty Ur or Qin China would most certainly have done so, given the opportunity.)

In this context, “data” just means “counts of stuff,” where many of the things being counted are events – typically things that a person has done: they spent x seconds viewing such and such a web-page, they bought this thing, they liked that post. This has a market value, because companies can use that information to sell you stuff. Governments can also use that information to identify people they don’t like; the Chinese government already does this, and I’d be very surprised if they were the only one.
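To see what “counts of stuff” looks like in practice, here is a toy sketch – the events, users and field names are all invented for illustration, not taken from any real system – of how a stream of behavioural events boils down to tallies:

    from collections import Counter

    # A few invented behavioural events of the kind described above.
    events = [
        {"user": "alice", "action": "view", "page": "/shoes", "seconds": 41},
        {"user": "alice", "action": "buy", "item": "shoes"},
        {"user": "bob", "action": "view", "page": "/shoes", "seconds": 3},
        {"user": "bob", "action": "like", "post": 10712},
    ]

    # What gets stored, analysed and sold is, at bottom, just this:
    # how many of each kind of act each of us performed.
    print(Counter(e["action"] for e in events))
    # Counter({'view': 2, 'buy': 1, 'like': 1})

Everything distinctive about alice and bob has already vanished into the arithmetic.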

However much we may claim to dislike this state of affairs, we still put up with it. It does at least give us some idea of why the ancient Israelites of 2 Samuel found the whole notion of counting people as if they were things so viscerally repugnant. And it is also dangerous, because data can be inaccurate and/or misinterpreted.

This can be the result of error, or it can be deliberate. Science itself is not immune to this: famous (or notorious) examples include Sir Cyril Burt, who fabricated data wholesale to support his ideas about the heritability of IQ, or more recently Dr Werner Bezwoda and his fraudulent cancer research. There may well be many more whose nefarious practices have not been discovered; there is a lot of research which has proved difficult or impossible to replicate. Scientists are themselves human beings, a point which we seem to find difficult to admit.

You can also build a mathematical model to interpret that data, which looks impressive and works beautifully until it doesn’t. This is what happened to Long-Term Capital Management, which went south in 1998 despite the presence on its board of Nobel Prize-winning economists, requiring a bail-out to the tune of $3.625 billion. They thought they understood their model. With modern statistically-based AI, of course, nobody understands how the algorithms work. Because they seem to function most of the time – as the LTCM model did until 1997 – it’s just assumed that we can rely on the results. You may not find that thought comforting. I certainly don’t.

When Hillary Clinton ran for the US Presidency in 2016, her campaign made heavy use of data analysis, all of which suggested that she would win. We know how that ended up. I was reminded at the time of the statistical approach taken by Robert McNamara in managing the Vietnam War, relying on the body count as his measure of success. That didn’t go too well either.

But this is more than a practical question. It has profound implications for how we deal with one another and with the world in general. Is the world, to borrow a phrase from Coleridge, “an immense heap of little things” to be reckoned and classified and managed, to be bought and sold and monetised? Are there not important truths which cannot fit into a spreadsheet? I suspect most people would agree that there are, but that’s not the way we do things. There is the quantified and the unquantified, and most of the time the quantified takes precedence.

Iain McGilchrist’s magisterial study The Master and His Emissary (Yale University Press, 2010) examines this division in the light of research into the workings of the human brain. It’s a fascinating, well-researched and thoughtful book, but ultimately rather depressing. There seems to be every likelihood that we will continue to be blind-sided by our obsession with numbers, just as LTCM was, failing on an epic scale to see the wood for the trees. Nobody felling a tree in the Amazon to clear land for soya-bean cultivation is doing so because they want to damage the planet. They end up doing so just the same.

The philosopher Martin Buber arrived at much the same insight from a different direction. He distinguished I-Thou relations, in which the other is treated as a person, from I-It relations, in which the other is a mere thing. When we count, we necessarily treat what is counted as things rather than persons. This may work well enough for pig-iron, but it doesn’t work for people – or, I would argue, for other living creatures. Twenty thousand chickens in a broiler house can only be thought of statistically; a flock of half a dozen are individuals. If this reminds you of the famous remark attributed to Stalin – “A single death is a tragedy; a million deaths is a statistic” – that is not a coincidence.

Genocide can only happen when the victims are treated as things, not people. Nothing about the Holocaust is more striking than the way in which it was managed as an industrial process. Likewise, ecocide can only happen when living beings are treated as things, either as raw materials to be processed or as obstacles to be removed. Those long-dead Sumerian scribes weighing and recording barley never intended any of this, I’m sure, but that’s where we are.

Comments are welcome, but I pre-moderate them to make sure they comply with the house rules.

On jobs

In an advanced industrial society it becomes almost impossible to seek, even to imagine, unemployment as a condition for autonomous, useful work. The infrastructure of society is arranged so that only the job gives access to the tools of production.

Ivan Illich

Once upon a time, nobody in the world had a job. Of course, that’s not to deny that people expended energy on the tasks required for survival; manifestly they did, or you and I would not be here. My point is that there was originally no distinction between work and leisure. People did what was necessary, which would vary across the seasons. Work, if we want to call it that, was done where people already were; people would move around their territory depending on the availability of food and water, but there was nothing we would recognise as a commute.

I start my account of jobs here because we tend to focus instead on what replaced this state of affairs, namely the division of labour. Adam Smith himself set the pattern: An Inquiry into the Nature and Causes of the Wealth of Nations, Book I, Chapter 1, is entitled “Of the Division of Labour,” and we go almost immediately into his famous account of the pin-factory. It is easy to forget how radically different working in a pin-factory is from the way people have got their living for the bulk of the time humans have been on earth.

It has to be admitted that Smith does not exactly sell the pin-factory as a great place to work. “One man draws out the wire, another straights it, a third cuts it, a fourth points it, a fifth grinds it at the top for receiving the head; to make the head requires two or three distinct operations; to put it on is a peculiar business, to whiten the pins is another; it is even a trade by itself to put them into the paper; and the important business of making a pin is, in this manner, divided into about eighteen distinct operations, which, in some manufactories, are performed by distinct hands, though in others the same man will sometimes perform two or three of them.” The work is repetitive and monotonous, and no single person involved in it will even get the satisfaction of having made an entire pin.

Smith is more interested in the large quantity of pins that can be made in this way, but here I want to consider the quality of life that it implies. He goes on to contrast the pin technician with his less specialised brethren: “A country weaver, who cultivates a small farm, must lose a good deal of time passing from his loom to the field, and from the field to his loom. … The habit of sauntering and of indolent careless application, which is naturally, or even necessarily acquired by every country workman who is obliged to change his work and his tools every half hour, and to apply his hand in twenty different ways almost every day of his life, renders him almost always slothful and lazy, and incapable of any vigorous application even on the most pressing occasions.”

We are reminded here that Smith was not an economist – there was no such thing in 1776 – but a Professor of Moral Philosophy. His problem with the country workman is not really that he is inefficient. Clearly the things that needed to be done were still getting done. No, he is more worried about a lack of moral fibre. The Devil finds work for idle hands, and all this sauntering and indolence can lead to no good. There is no opportunity to saunter when all you do, all day and every day, is whiten pins or put them into paper.

His choice of a country weaver is suggestive. These were exactly the people who were shafted by the Industrial Revolution, as E. P. Thompson showed in The Making of the English Working Class (Pelican, 2013). They went from making a comfortable living and working the hours they chose to sixteen-hour days in the pitiless roar of a cotton-mill for starvation wages. In his classic essay “The Original Affluent Society,” the anthropologist Marshall Sahlins estimated that hunter-gatherers typically spend 3–5 hours per day obtaining food. (The essay is collected in his book Stone Age Economics, first published in 1972.) The rest of their time, presumably, is spent sauntering.

These represent extremes, of course. As Sahlins points out, the reason hunter-gatherers can work so little is that their material needs are kept to a minimum. (If you have to carry all of your belongings around with you, then you are naturally incentivised to own as little as possible.) Few of us would be prepared to accept a material standard of living at this level. But you will have noticed the word material in the previous sentence.

To obtain material goods, most of us need a job. A job is the nipple connecting us to the milk of industrially-produced goodies on which we rely for survival. Without it, we are ill-equipped to fend for ourselves. In exchange, we accept a hefty set of constraints (and our education system is designed to prepare the way for this acceptance):

  • Timekeeping. You need to show up at the agreed time, and keep working until the agreed time. We are so used to this that we forget how unnatural it is. Smith’s account of the pin-factory makes it clear that no pins can be made unless everyone involved is present and correct; the system doesn’t work if people only turn up as and when they feel like it.
  • Obedience. The guy whose job is to draw out the wire has to draw out the wire, whether he fancies doing so or not. I’ve never had to do this for a living myself, but I should imagine it gets old pretty quickly.
  • Measurement. Smith goes into raptures about the number of pins produced per day (he estimates 4,800 per person) and of course this is an invitation to quantitative assessment of your performance, just like all those tests and exams you did at school. There isn’t so much scope for Taylorism amongst the hunter-gatherers, on the other hand.

Time and again, colonial administrators have bemoaned how terrible indigenous people are when put into factories and expected to comply with this stuff. The same thing happened in England in the early days of the Industrial Revolution. Unless people are trained up to it from an early age, they are unlikely to find these compromises appealing. It is no coincidence that compulsory education came in during the nineteenth century.

Because most of us are so dependent on having a job, unemployment becomes a problem. Governments like to talk about job creation as if this were self-evidently a good thing, regardless of the nature of the jobs themselves. (Technically, the construction of Auschwitz-Birkenau created jobs, after all.) Here in the UK we have the economic miracle of the zero-hours contract, by which one can have a job without any of the benefits. I expect this exciting innovation to sweep through the industrialised world in short order.

The solution to this trap is clear, but not necessarily easy. It is a question of re-education, and not merely in the myriad skills we need to support ourselves outside the world of conventional employment – although there are many useful courses and other resources out there, acquiring those skills is in itself the task of a lifetime. We also need to reappraise some of our most deeply held beliefs about, for example, time. I’ll be dedicating a future post to the subject of time, but let’s just say there are worse sins than turning up half an hour late.

After all, how many pins does the world really need?

Comments are welcome, but I pre-moderate them to make sure they comply with the house rules.

On human exceptionalism

Modern man does not experience himself as a part of nature but as an outside force destined to dominate and conquer it. He even talks of a battle with nature, forgetting that, if he won the battle, he would find himself on the losing side.

E F Schumacher, Small Is Beautiful: Economics as if People Mattered

It will not have escaped regular readers of this blog that I often refer to myself and my fellow-humans as social primates. This is not, of course, a complete description of what a human being is, although in some contexts I think it can be a helpful one. What it does serve to highlight is that human beings are not separate from the rest of nature.

We often talk as if there were this separate thing called “the environment” which is out there somewhere, and which is something vaguely to do with us, which we can have a minister for, and which can be a line item on a budget (and frequently cut). The reality is that we eat it, we drink it, we breathe it; we could not be more intimately connected with it. We are ourselves ecosystems, hosting and depending on countless other creatures, to the extent that we can be described as holobionts – that is to say, assemblages of life-forms, not simply a species on its own.

From this point of view, it makes little sense to speak of “the environment” as something out there. It is right in here with us; it is us. Yet not only do we in fact constantly speak like this, but we consider it an insult to be put on the same level as our fellow-creatures. Words like brutal and bestial literally mean “like an animal,” while the kind of cruelty that is only too typical of our species is branded inhumane.

We seem to need a lot of convincing that other animals are anything more than automata. Apparently, it is newsworthy that fish might feel pain, or that many animals self-medicate. (That one is even posted under the heading “Surprising Science,” despite the fact that any observant goat-keeper will have witnessed it routinely.) The default assumption is always that non-humans are without consciousness or agency or any capacity with which we might be tempted to identify. We claim an exclusive right to personhood, on no particular ground other than that it is convenient for us.

What is this obsession with standing apart from the natural world? It seems to me to be about control. If we are a small part of a larger whole, we are clearly not in control. But if the world consists of objects outside ourselves which we can manipulate, suddenly we can be managers. It is not a coincidence that the word management turns up a lot in descriptions of what we do with the environment, despite the fact that a forest, for example, is dizzyingly complex and manifestly beyond the capacity of anyone to manage in any meaningful sense.

We can, however, at least imagine that we can manage a bunch of inanimate things, robots whose behaviour we can predict and control. After all, we do this all the time with our factories and warehouses. If the natural world is just a machine, we can make it do what we want by pressing the right buttons. That’s the thing about machines: they exist to serve our purposes. But this way of thinking goes back well before the industrial age.

And God said, Let us make man in our image, after our likeness: and let them have dominion over the fish of the sea, and over the fowl of the air, and over the cattle, and over all the earth, and over every creeping thing that creepeth upon the earth.

Genesis 1:26 (King James version)

To claim dominion is to claim power. The root of the word is the Latin dominus, a master – in particular, a slave-owner. In Roman law, a dominus had absolute power over his slave, up to and including the power of life and death. That is how we routinely think about our relationship to the natural world.

A good deal of the shouting about how we are destroying all life on the planet is an attempt to convince ourselves that we could if we wanted to. But Stewart Brand notwithstanding, we are not as gods. Yes, we can do and are doing a lot of harm. So long as we pursue the industrial path, we will carry on doing harm, because an essential part of that is being able to pretend that we aren’t doing harm, or that if we are it’s someone else’s problem.

Another defining characteristic of industrialism is that whatever one does is done at the largest possible scale, so we will do harm at the limit of our capacity. That capacity, however, is finite, and Mother Nature is a tough old bird. There have been multiple extinction events in the past, the granddaddy of them all being the Great Dying at the end of the Permian period, about 251 million years ago, which is estimated to have wiped out some 96% of marine species. Life found a way then, and it will find a way this time.

Once we see ourselves as an integral part of the web of life, as members of the community of living beings, we can no longer pretend that our actions are without consequences. We can no longer treat our fellow-creatures as slaves, or resources. I don’t think it is a coincidence that industrial society increasingly treats even human beings in this way; one would have thought the phrase “human resources” came out of the Todt Organisation, but apparently these days it is perfectly respectable.

If we want to go on living on this planet – and Mars would not appear to be an inviting alternative – we are going to have to change our attitude. We need to stop behaving like thugs and vandals and start living as decent citizens. This means that we need to give up what control we have and also give up the illusion that we have more control than we do. First and foremost, we need to outgrow our collective sense of entitlement. We are not the only show in town.

When I say “we,” by the way, I am talking about the inhabitants of the industrialised or industrialising world – you know, the sort of people who have Internet access. That is not by any means the entirety of the human race, especially when our ancestors are taken into account, but it is a lot of people. I am not expecting this to happen overnight. At the very least, it will take a generation; major shifts in attitude always do.

But like all such changes, it will happen one person at a time. You could be next. It’s just a thought.

Comments are welcome, but I pre-moderate them to make sure they comply with the house rules.

On virtue

Few are those who wish to be endowed with virtue rather than to seem so.

Marcus Tullius Cicero, De Amicitia

To those of us of a certain age, it is surprising how popular the notion of virtue has become in recent years. Virtue used to be more or less synonymous with sexual continence, especially in women, and connected in some obscure way with the cleanliness of net curtains. It was old-fashioned, dowdy, and not conducive to having fun. People like Mary Whitehouse who took it seriously were considered faintly ridiculous.

Contrast that with the situation today, in which so many people are so loudly obsessed with being virtuous, enforcing virtue in the public realm, and deploring a lack of virtue in others. Savonarola would have been right at home on social media. Except, that is, for the ideas of virtue that are being promoted. What Savonarola preached was broadly compatible with what contemporary Florentines already believed, or at least felt they ought to believe. What we are dealing with now is equivalent to a new revelation.

One prominent element of it would nevertheless have been familiar to the old firebrand: the call for repentance. It rarely seems to be an effectual channel of grace in practice, but self-criticism is the only acceptable defence against accusations of non-virtue – I hesitate to call it sin, as there is no explicit theological component, although as we shall see there are religious parallels.

The basis of this is a doctrine of collective guilt extended indefinitely across time and space. As with the Manichaeans, humanity is divided into two disjoint groups: the oppressors and the oppressed. All virtue resides with the oppressed. In a bold reversal of the proverb that two wrongs don’t make a right, here the only way to be right is to accumulate as many wrongs as possible. If one is an oppressor, one is guilty of any wicked act with which one’s group can be identified. (Here we see the usefulness of the tendency to put people into boxes, which I discussed in a previous post.) Thus, to take a concrete example, as an Englishman I should be held morally responsible for everything bad ever done by English people or by males, despite the fact that I have pretty solid alibis for the Amritsar massacre, the siege of Drogheda, and the triangular trade, to name just a few.

There is a splendid simplicity and purity about this view. Thinkers over the millennia have explored many forms of enquiry into moral questions. The great achievement of the new morality is to replace all that with a simple two-step method that requires almost no thought whatsoever. To be sure it requires an act of faith, but once that has been managed anyone can have access to the absolute truth of any moral question. It goes as follows:

  1. Identify the most oppressed person in the room. This process is familiar to anyone who played Top Trumps as a child. Gender, ethnicity and disability are all point-scorers here, although curiously not class, even though this line of thinking is associated with the political left, which traditionally was heavily into class analysis. Go figure.
  2. Accept uncritically whatever that person says. This step has the useful side-benefit of testing the virtue of those present, so that deviationism can be detected and rooted out.

Sadly, there are some practical issues with this approach. For one thing, if the moral status of the individual is to be identified with that of the collective, it is quite hard to find anyone who is not in some way an oppressor. That is to say, we are all sinners. (Again, this view would have been fine with Savonarola.) Given this, any individual’s claim to moral authority can only be relative, which makes the second step above unreliable by definition.

Moreover, the notion of collective responsibility is distressingly broad in its application. It was used historically, for example, to argue that Jews were responsible for the death of Christ and therefore could and should be massacred. Collective punishment has a long and unpleasant history, and those who carry it out are surely to be numbered amongst the oppressors; yet it is accounted virtuous when it is the virtuous who mete out the punishment. I hardly need to point out how dangerous this can be.

From the standpoint of the accused (and presumed guilty), there is also no inducement to behave well. If, as a heterosexual male, I am defined to be a rapist, why should I refrain from going out and actually committing rape? (I’ve always thought this to be a weakness in Calvin’s notion of predestination; you are, in the famous phrase, damned if you do and damned if you don’t.) This seems to me a basic problem with this approach to morality.

Inevitably, people have tried to game this morality by self-identifying as members of a more oppressed group. No less a figure than Senator Elizabeth Warren attempted this gambit, with unfortunate results. Gender fluidity seems to be a card that anyone could potentially play. What exactly are the rules of the version of Top Trumps we are to play? Where do they come from? What makes them the specific rules that need to be followed in order to sort the sheep from the goats?

That is of course a Biblical reference, and it seems to me that there is a clear if unacknowledged debt to Christian thinking in all this. The notion of damnation is largely confined to post-classical Western thought; ancient philosophers tended to see virtue as a habit of mind to be encouraged and as a mean between opposing vices rather than as an absolute in its own right. Plato compared wrong-doing with making a mistake (Republic, Book I; the word he uses refers to missing a target).

Another serious flaw in this notion of virtue, at least in practice, is what it does not condemn. There is some token hand-wringing about the collapse of probity in public life, for example, but nobody appears to be seriously exercised about it. Whether or not some film passes or fails the Bechdel test seems more important than the question of whether or not Cabinet ministers are corruptly giving lucrative government contracts to their friends or allies and concealing the evidence. Regardless of the legal position, that seems to me to be immoral conduct which ought to be called out.

This narrowness of vision is also apparent in some areas which it does scrutinise. For example, in regard to environmental issues it tends to focus on climate change and specifically on carbon footprint. While this is certainly part of the issue, it is far from being a comprehensive view. And while it may be bad to drive a Chelsea tractor, it is not really much better to drive an electric car when one looks at the wider picture. The question of right living is much broader and deeper than any checklist of acceptable and unacceptable behaviours can capture.

All this is not to say that I decry this new-found interest in virtue. To give serious consideration to how one ought to live is an important and valuable endeavour. What I urge, however, is that those embarking on this quest be aware of two seductive temptations. The first is the siren call of simplicity: as H. L. Mencken pointed out, “there is always a well-known solution to every human problem—neat, plausible, and wrong.”

The second, which is harder both to notice and to resist, is the temptation to go along with what others think, or at least say they think. It is in our nature as social creatures to do this, and on the whole this tendency to agree with those around us is a good thing; a society of rugged free-thinkers would be wearing, to say the least. Nevertheless, on the really important questions it is essential to avoid groupthink, if only because we need to be able to own and stand by our beliefs when the chips are down.

No more than two cheers, then, for the modern pursuit of virtue. Insofar as it represents a sincere engagement with moral questions, I am all for it. The problem is when it descends into mindless dog-piling. Savonarola, after all, didn’t succeed in fixing much.


On hatred

Now hatred is by far the longest pleasure;
Men love in haste, but they detest at leisure.

George Gordon, Lord Byron, Don Juan, canto xiii

Hatred gets a very bad press, and on the whole deservedly so. It is often opposed to love, and since it would be depressing to suppose love anything other than a good thing, we see hatred as unequivocally bad. This has not, however, made it go away. Murder has been illegal for as long as there have been laws, and that hasn’t gone away either.

In his often-referenced novel Nineteen Eighty-Four, George Orwell – Eric Arthur Blair, but as far as I know unrelated to Tony – describes an institution called the Two Minutes Hate. As so often, Orwell’s prediction seems laughably tame nowadays. With the advent of social media, we now have the Twenty-four Hours’ Hate. Ah, progress.

We might reasonably ask why this should be so. Is there more hatred going around now than there used to be? Or is the (perceived) safety and anonymity of online discourse simply removing our inhibitions, uncorking a reservoir of hatred that was already seething within us? Is there perhaps something in John Michael Greer’s view that hate is the new sex – that is to say, that hatred has become for us what sexual desire was to the Victorians: a powerful emotion that is widely felt but cannot be expressed in a socially acceptable way?

In last week’s post, I talked about the importance of having a historical context in trying to understand the modern world, and this would seem to be an ideal candidate for that approach. For hatred is caused: not necessarily in a straightforward way, but something happened in the past, a transgression, or something that was perceived as a transgression. And probably not once, but many times.

I won’t get into specific examples here; goodness knows there are plenty to choose from – Palestine, Ireland, Cyprus, Rwanda, Zimbabwe, Iraq, and the list goes on – because my point here is not to identify the rights and wrongs of any particular case so much as the fact that the people involved believe, with at least some cause, that they have suffered undeservedly. This will lead to anger, and frustrated anger will eventually express itself as hatred.

The target of that hatred may not necessarily be what you would expect. In the aftermath of the First World War, Germans had a very hard time of it, due in large part to the terms of the Treaty of Versailles. (The famous hyperinflation in the 1920s was at least partly caused by the need to pay exorbitant reparations to the Allies.) It would have been natural for this to have been expressed as hatred for the French, who were largely behind this, but France was too powerful. Instead, the hatred was transferred onto the Jews, who were close at hand and vulnerable.

This choice also had historical roots, as anti-Semitism was already a well-established tradition from medieval times (and not just in Germany, which we prefer to forget). Part of that was due to the association between Jews and usury. Lending money at interest was forbidden by the Church, so Jews were a convenient workaround. As they were often excluded from other ways of making money, they became money-lenders under the precarious protection of the local aristocracy, who could cash in at any time by simply withdrawing that protection, confiscating the money, and abandoning the Jews to the mob.

Inter-war Germany looked to the average German very much like another version of the same debt-trap with which their peasant forebears had been all too familiar. It was therefore easy for their anger and resentment to turn into a familiar channel, with the results we all know. Even though the Versailles settlement was not in fact a Jewish plot, it is quite comprehensible that many Germans might have wished it to be; and it is a short step from wishing something to be the case to believing that it is.

There is something delusional about all hatred. Those that we hate are never as purely evil and loathsome in reality as our hatred needs them to be. This is true of individuals, and even more so of groups. As social primates, we have a strong need to differentiate the in-group from the out-group, and we readily confuse this with moral judgement. But once we have a delusion, the only way to preserve it is to keep contradictory evidence well away.

Hence the need to see those we hate as something other than people. The whole apparatus of Nazi “racial science” was created for this purpose. But consider equally the dehumanising of Native Americans by the conquistadors and later by European settlers in North America. If you are going to work someone to death in the silver mines of Potosí, you are not treating them humanely, that is to say, as a fellow-human. Slavery in general is a dehumanising process. A thing for sale cannot really be a person.

This gives us a clue to the abundance of hatred online, with which I began. There is a fundamental qualitative difference between relating to people online and in the real world. If you are talking to someone face to face, then (unless one or both of you is on the autistic spectrum) there is a great deal of non-verbal information passing between you, and doing so immediately. It requires an act of will to overlook the personhood of the other party. (This can be done, and people do it all the time, but it is not the default level of interpersonal communication.)

In an online forum there is none of this. All you know about the other party is their bare words. Even without malice, it is easy to misconstrue what someone types. If you are looking for a fight – and a great many people are, as I will discuss in a moment – a fight is always available. And unlike the real world, there are no real consequences. If I smack someone over the head, I can expect retaliation and probably the intervention of the police. The worst thing that can happen to me online is a ban, and throwaway accounts are easy to make.

Moreover, online communities are classic in-groups. The likes of Facebook and Google and Twitter are interested in ad traffic and data harvesting, and to keep you hooked they will ensure you get a constant diet of what you seem to like, or perhaps something a little stronger. A great many people imagine this is a balanced picture of the world, because Facebook, Google and Twitter certainly aren’t telling them any different. If you want a different point of view you will need to go and look for it proactively, and most people won’t do that: it’s time-consuming and uncomfortable, and it may even oblige you to develop skills in critical thinking that nobody in power wishes to encourage.

People like to think of themselves as right-thinking and good. This goes equally for Joe Biden and Mao Zedong and Heinrich Himmler and their many admirers. We all prefer to sleep well at night. There is a warm glow of satisfaction to be derived from feeling that we certainly showed that Trump-loving bigot/capitalist running dog/Jew-lover what’s what. We may even feel as if we are pursuing a moral crusade.

But crusades are a two-edged sword. It is not for nothing that the original Crusades still rankle in Muslim eyes. Many of the most appalling things that people have done, and do today, are done in the sure and certain conviction of righteousness. It was after all a Cistercian abbot who uttered the cheerful advice “Kill them all, the Lord will know His own,” resulting in the deaths of thousands of people, at least some of whom were certainly not the heretics he was trying to get rid of. No doubt he genuinely believed he was saving souls.

Remember him the next time you’re about to click Send.


On history

I know it is the fashion to say that most of recorded history is lies anyway. I am willing to believe that history is for the most part inaccurate and biased, but what is peculiar to our own age is the abandonment of the idea that history could be truthfully written.

George Orwell, “Looking Back on the Spanish War”

Our culture has a curious relationship with history. On the one hand, we are dismissive of it; to call something or someone “history” is to consign them to irrelevance, and “ancient history” is even worse. This is bound up with our deeply felt, if irrational, belief that the passage of time necessarily makes things better, so the past must have been worse than the present, and the further back in time you go the worse things must have been. We also like to believe that what happened in the past does not constrain our situation today, although of course it inevitably does.

But we are also deeply conscious of history, in a way that other cultures are not and indeed our own used not to be. If you look at mediaeval depictions of Alexander the Great, for instance, he is shown wearing contemporary armour. The past was imagined to have been much the same as the present. For us, however, as L. P. Hartley put it, “The past is a foreign country; they do things differently there.” History is exotic, other, even alien. We love books and movies set in the past, even if they are usually produced by people whose attitude to historical accuracy is comparable to King Herod’s attitude to childcare.

This does not, however, free history from our criticism. Passing judgement on the past is by no means a new pastime – Sellar and Yeatman’s definitive spoof 1066 and All That is full of Good Things and Bad Kings – but it has recently become an obsession in some circles. Curiously, we seem able to do this without imagining that anyone in the future might disapprove of us: we are obviously right, while everyone else, past or future, merely thought or will think they were right. For instance, the person who wishes to cancel George Washington for having been a slave-owner may well own a dishwasher (acquired for much the same reasons); it isn’t hard to imagine future generations taking a dim view of that.

The recent trend of pulling down statues of people of whom right-thinking people disapprove is just a conspicuous example of this. For example, some people seem to imagine that we can fix the problems of southern Africa by pretending that Cecil Rhodes never existed. This notion is not without its appeal, but the prospects of success are slim.

Now it is true that George Orwell wrote in Nineteen Eighty-Four: “Who controls the past controls the future. Who controls the present controls the past.” But this was in the context of a totalitarian society, in which such control might be feasible. The real world is rarely so well-organised.

Henry Ford said in a 1925 lecture: “I don’t read history. That’s in the past. I’m thinking of the future.” This is more proof, if more were needed, that Henry Ford was not a deep thinker. (He also wrote: “Mass production is craftsmanship with the drudgery taken out of it,” which is the reverse of the truth.) It is obvious that the future originates in the present, and the present originates in the past. It may not be a coincidence that Ford’s legacy has not turned out to be a thing of loveliness.


But how can we know the truth about history? This depends very much on the kind of questions we choose to ask. It is true that written sources tend to be biased and incomplete. History is often written with a conscious agenda, but even when it isn’t it is inevitably selective. Most written European history between the end of the Western Roman Empire and the Renaissance comes to us via monks or clerics, for the simple reason that almost everyone literate fell into one of those categories.

Clearly you can’t believe everything you read in history-books, any more than you can believe everything you read in the newspapers (or on blogs like this one). This doesn’t mean that they contain no useful information at all, though. It simply means that critical thinking is required. Sadly, this appears to have been surgically removed from the educational curriculum, certainly in the UK and by all accounts in the USA and elsewhere.

A reliable strategy, where it can be followed, is to read multiple sources with known and conflicting biases, and try to determine where the truth seems to lie. For example, the standard account of the Industrial Revolution in England can be balanced with E. P. Thompson’s The Making of the English Working Class (Pelican, 2013). Incidentally, this strategy goes back at least as far as Thucydides, and is (at least in theory) the way modern historians approach their source material.

It is often objected that we learn nothing from history. Collectively, this may well be true. (Collectively, human beings seem to behave like idiots most of the time.) It cannot, however, be an absolute truth, since learning anything is the result of what happened to us in the past; otherwise we would still be merrily inserting our hands into the fire. Such lessons as we can learn from history are of course less clear-cut. History never repeats exactly. But this is not to say that useful parallels cannot be drawn.

Consider the cycle in which civilisations rise and fall. Joseph Tainter, in his classic study The Collapse of Complex Societies (Cambridge University Press, 1988), discusses a number of historical examples and concludes that the common theme is a declining marginal return on investment in the complex structures and processes needed to keep things running. That is to say, at some point it makes more sense to give up on the whole project than to keep pouring resources into it.

This is something concrete we can look out for in contemporary events, and indeed when we look for it we shall find it. Industrial society depends on an enormous amount of complex infrastructure, which is hugely expensive to maintain – not just financially, but in terms of energy and physical resources. Just think what goes into keeping a motorway functioning: not just the physical roadway, but the signage, the drainage, traffic police and all the rest of it. At what point does all that become a price no longer worth paying?

Another useful result of the study of history is to provide context for contemporary events. The value of this is shown by what you get in its absence: the English are, on the whole, studiously ignorant of the history of Ireland, and thus are unable to comprehend why large parts of the population of Ulster would like to be shot of us. US reaction to 9/11 is another example. “Why do they hate us?” Well, a good start on answering that question might be to read French journalist Matthieu Auzanneau’s book Oil, Power and War: A Dark History (Chelsea Green Publishing, 2018; originally published in French under the title Or Noir: La grande histoire du pétrole, La Découverte, 2015). It will explain a great deal about US-Arab relations and quite possibly make your hair curl.

Without this context, much of the world around us is inexplicable, and so we tend to attribute much of it to mere lunacy, especially where religion is involved. I grew up in England during the heyday of the Northern Irish Troubles, and it was always framed as a conflict between Catholics and Protestants, even though theology had very little to do with it. It was at least as much a class war as a religious one. (Then again, even the French Wars of Religion were about much more than religion.) In the same way, we steadfastly pretend there is no historical or political background to militant Islamism. It’s just the fault of random nutters. Right.

The poster child for this syndrome is Donald Trump. I should make it clear that I hold no brief for Mr Trump, and I dare say he is as despicable an individual as he is made out to be, but the fact remains that he spoke for a real and large constituency in the USA, namely those parts of the American working class who felt – with good reason – that they had been thrown under the proverbial bus. Since neither of the major US political parties was prepared to acknowledge this, let alone speak for these people, they were obliged to resort to Mr Trump. It is by no means clear that the Biden administration is likely to address this issue.

So the critical study of history is an important tool in understanding where we find ourselves and why, and also in trying to determine what sensible options we might have in plotting a course towards a tolerable future. As the saying goes, history does not repeat itself, but it rhymes.


On quantification

Counting is the religion of this generation. It is its hope and its salvation.

Gertrude Stein, Everybody’s Autobiography

Industrial civilisation is in love with numbers. We have numbers on pretty much everything that can be quantified, and some things that arguably can’t. When someone wishes to assert incontrovertibly that X is the case, the magic words are: “Studies show…”. And how do the studies purport to show that X is true? Statistically, that’s how.

Like so many things that appear to us to be immutable truths carved into the primordial ground of cosmic being, this tendency has a history and is the result of human choices. We first find the appeal to statistics in the eighteenth century, originally in connection with matters relating to the governing of a state (hence the name), such as population and economic activity. The following century saw the formation of the Royal Statistical Society in London, and the use of statistics by political and other campaigners to advance their causes; one famous example is Florence Nightingale, the Society’s first female member, who popularised the polar area diagram – a close relative of the pie-chart – as a means of visualising data and thereby presenting it in an accessible form.

It is not, I suspect, a coincidence that this parallels the development of the Industrial Revolution, whose consequences I discussed in a previous post. It is a natural part of the standardising tendency which gave us the Whitworth screw thread, SATs and the Big Mac. We want the world to be made up of things which are all the same.

This is obviously a prerequisite for counting things. If you have a pencil, three bananas and a bicycle, you don’t really have five of anything. If, on the other hand, you can say that one lump of pig-iron is much the same as another, then knowing you have five of those is a useful thing. Of course, it’s not actually true to say that all lumps of pig-iron are strictly equivalent – things like the carbon content will vary from one to another – but you may not need to worry about that if all you want to know is how many tons your foundry produced this week.

Then again, if you care about the quality of your pig-iron, knowing that you have five lumps of it is not especially useful. So the meaning of a statistic depends on what is being counted and why. Who is doing the counting may also be significant; human motivation is rarely pure.

Consider official unemployment figures. No government would like these to be higher than they need to be, and it is much easier and quicker to fix the numbers than it is to fix the economy. Thus the definition of unemployment becomes remarkably fluid, in order to leave out as many people as possible. In the UK, for example, you need to be unemployed and claiming a government benefit. Unsurprisingly this is coupled with a benefits system of Byzantine complexity which has all the hallmarks of having been designed to deter applicants.

But in any case, is counting people really the same sort of thing as counting pig-iron? You may (or may not) be familiar with King David’s census of the Israelites (2 Samuel 24, if you want to look it up). The king soon came to regret it: “And David’s heart smote him after that he had numbered the people. And David said unto the LORD, I have sinned greatly in that I have done: and now, I beseech thee, O LORD, take away the iniquity of thy servant; for I have done very foolishly.”

The biblical text doesn’t make clear why this was such a bad idea, apart from the punishments that follow, but there’s a case to be made that this is a reaction against the kind of state bureaucracy that flourished in both Mesopotamia and Egypt at this period. Here I am following the account of state-formation in James C. Scott’s Against the Grain (Yale University Press, 2017), and while I’m not going to attempt a summary here, the relevant point is that ancient bean-counters started off counting measures of barley and ended by counting pretty much everything.

Now there will be some variability even between measures of barley, but it seems intuitively clear that human beings are individuals – indeed, we even use the word individual to refer to a person. Moreover, they have relationships with one another. What does it mean to treat another human being as a countable unit, like a measure of barley or a lump of pig-iron? Surely it is not the way most of us would want to be thought of. It is a denial of one’s basic humanity.

But when one is dealing with large numbers of people – more, let’s say, than Dunbar’s number – it is inevitable that one has to resort to this approach. It’s the only practical way of keeping on top of things, and early states were all about keeping on top of things. This appears to have been why writing was invented, and why so much of the vast corpus of cuneiform texts is so dull.

Nowadays, of course, we have Big Data. This is largely a result of technological advances; scribes inscribing clay tablets can only record a limited amount. Thanks to the miracles of Progress, we are now able to collect, store, and analyse stupendous amounts of data, most of it about us. And because we can, we do. (The scribes of the Third Dynasty of Ur or of Qin China would most certainly have done so, given the opportunity.)

In this context, “data” just means “counts of stuff,” where many of the things being counted are events – typically things that a person has done: they spent x seconds viewing such and such a web-page, they bought this thing, they liked that post. This has a market value, because companies can use that information to sell you stuff. Governments can also use that information to identify people they don’t like; the Chinese government already does this, and I’d be very surprised if they were the only one.

However much we may claim to dislike this state of affairs, we still put up with it. It does however give us some idea of why the ancient Israelites of 2 Samuel found the whole notion of counting people as if they were things so viscerally repugnant. And it is also dangerous, because data can be inaccurate and/or misinterpreted.

This can be the result of error, or it can be deliberate. Science itself is not immune to this: famous (or notorious) examples include Sir Cyril Burt, who fabricated data wholesale to support his ideas about the heritability of IQ, or more recently Dr Werner Bezwoda and his fraudulent cancer research. There may well be many more whose nefarious practices have not been discovered; there is a lot of research which has proved difficult or impossible to replicate. Scientists are themselves human beings, a point which we seem to find difficult to admit.

You can also build a mathematical model to interpret that data, which looks impressive and works beautifully until it doesn’t. This is what happened to Long-Term Capital Management, which went south in 1998 despite the presence on its board of Nobel Prize-winning economists, requiring a bail-out to the tune of $3.625 billion. They thought they understood their model. With modern statistically-based AI, of course, nobody understands how the algorithms work. Because they seem to function most of the time – as the LTCM model did until 1998 – it’s just assumed that we can rely on the results. You may not find that thought comforting. I certainly don’t.

When Hillary Clinton ran for the US Presidency in 2016, her campaign made heavy use of data analysis, all of which suggested that she would win. We know how that ended up. I was reminded at the time of the statistical approach taken by Robert McNamara in managing the Vietnam War, relying on the body count as his measure of success. That didn’t go too well either.

But this is more than a practical question. It has profound implications for how we deal with one another and with the world in general. Is the world, to borrow a phrase from Coleridge, “an immense heap of little things” to be reckoned and classified and managed, to be bought and sold and monetised? Are there not important truths which cannot fit into a spreadsheet? I suspect most people would agree that there are, but that’s not the way we do things. There is the quantified and the unquantified, and most of the time the quantified takes precedence.

Iain McGilchrist’s magisterial study The Master and His Emissary (Yale University Press, 2010) examines this division in the light of research into the workings of the human brain. It’s a fascinating, well-researched and thoughtful book, but ultimately rather depressing. There seems to be every likelihood that we will continue to be blind-sided by our obsession with numbers, just as LTCM was, failing on an epic scale to see the wood for the trees. Nobody felling a tree in the Amazon to clear land for soya-bean cultivation is doing so because they want to damage the planet. They end up doing so just the same.

The philosopher Martin Buber arrived at much the same insight from a different direction. He distinguished I-Thou relations, in which the other is treated as a person, from I-It relations, in which the other is a mere thing. When we count, we necessarily treat what is counted as things rather than persons. This may work well enough for pig-iron, but it doesn’t work for people – nor, I would argue, for other living creatures. Twenty thousand chickens in a broiler house can only be thought of statistically; a flock of half a dozen are individuals. If this reminds you of the famous remark attributed to Stalin – “A single death is a tragedy; a million deaths is a statistic” – that is not a coincidence.

Genocide can only happen when the victims are treated as things, not people. Nothing about the Holocaust is more striking than the way in which it was managed as an industrial process. Likewise, ecocide can only happen when living beings are treated as things, either as raw materials to be processed or as obstacles to be removed. Those long-dead Sumerian scribes weighing and recording barley never intended any of this, I’m sure, but that’s where we are.
