On desperation

The mass of men lead lives of quiet desperation.

Henry David Thoreau, Walden

Desperation is not, of course, to be confused with despair. Both are marked by the absence of hope, but whereas despair is passive, desperation is active. People sunk into despair tend to do nothing; desperate people will do pretty much anything. There is something calming about really deep despair. There is nothing calming about desperation.

It is desperation that has families with small children trying to cross the English Channel in rubber dinghies. Desperation took Russian and Polish Jews to the East End of London in the 1880s, and the Irish to America a generation previously. Nobody wants to leave home when times are good. I remember seeing an interview with John Perkins, the author of Confessions of an Economic Hit Man (Ebury Press, 2006), in which he observed that he had met a lot of terrorists in the course of his career, and none of them wanted to be terrorists.

Quite substantial chunks of history have been driven by desperation. Popular uprisings, for sure, of which history offers plenty of examples. According to the Chinese historian Gang Deng (cited by David Graeber in Debt: The First 5,000 Years), there have been periods in that country’s history where on average a peasant revolt was breaking out somewhere over forty times per day. That’s a lot of desperation.

While some rioters are undoubtedly opportunists (or paid for), the ones who create the opportunities for looting are often genuinely desperate. People, on the whole, want to lead quiet lives; they generally accept the status quo. Only when the status quo becomes intolerable will they be driven to act to try and change it. Typically their resentment builds up slowly over time, much like the gradual increase in subterranean stresses that results in a sudden earthquake. Then something – it may be trivial – tips them over the edge. If this happens to a lot of people at about the same time, the results are apt to be spectacular.

A prudent regime finds ways to divert all that energy into harmless channels. The Roman elite famously kept the lower orders in check with “bread and circuses” – that is to say, free food and free entertainment. The modern equivalent of the former is universal basic income, although this is mostly still hot air. We do, however, offer a fine array of free (or almost free) entertainment to distract us from all those unpleasant thoughts that lurk just in the background for many, if not most, of us.

Desperate people are, understandably, prone to substance use. A desperate person, more than anything else, just wants it to stop; drugs and alcohol can make that happen, at least temporarily. Of course addiction can then give them an even worse case of desperation. It’s a devil’s bargain, but by no means the only one we end up making in the industrial world. And people who are off their heads on the stimulant du jour will probably be disinhibited, making extreme behaviour more likely.

At the individual level, we see school shootings in the US, and at the other end of the spectrum an upsurge in petty crime. Suicide rates continue to rise in both the UK and the US. Much of this is the expression of desperation on an individual level (especially the suicides). Everyone deplores this, but we see no coherent political solutions to the underlying problems put forward, merely fixes for some of the symptoms. So we crack down on drugs, or propose legislation to limit the availability of firearms. (The Hungerford Massacre in the UK back in 1987 is a textbook example of shutting the stable door when the horse is already on the menu of a French restaurant.)

The obvious explanation for this reaction is that it is always easier to treat the symptoms than the underlying disease. In truth the issues are systemic: there simply is no way to improve the lot of the bulk of the population without radical change. At this point, there may not be a way even with radical change, but in any case the turkeys who govern us are never going to vote for Christmas.

I am not going to offer advice here to the desperate person, who has no doubt already examined their (limited) alternatives. I am certainly not going to recommend that they do anything illegal, in case I get a visit from the soi-disant competent authorities. But what can such a person do that is legal?

They can die. Technically this is illegal in some jurisdictions (it was in the UK until 1961), although once you’re dead you probably don’t care. This is a popular option, as per the suicide statistics I alluded to above. It is not, however, anyone’s choice of first resort. In a sense, suicide is self-harm taken to its logical conclusion. Abuse of alcohol and other drugs, eating disorders, and extreme behaviour in general can be seen as lying on the same spectrum.

They can go somewhere else, where circumstances may be better. I’ve already referred to migration. It’s chancy, but if home is uninhabitable – whether because of war, famine or plague – it may be necessary. It definitely works out for some people, but not all or even most who try it. It is also a lot easier to do if you are comparatively well-off.

They can attempt to change the system from within. I am not aware of any examples of this strategy ever succeeding – suggestions in the comments are welcome. In societies where democracy has been reduced to a spectator sport, the opportunities are in any case few and far between. Where they do arise, voting for “anything but this” is likely to get you President Trump or your local equivalent. On a bad day, it could be worse. In 1970, the people of Chile elected the wrong president – or wrong in the view of the United States, which arranged to have him replaced by General Pinochet, and the people of Chile didn’t get to elect another president until 1989. As Emma Goldman is supposed to have said, “If voting changed anything, they’d make it illegal.”

There is really no advice to be given to the truly desperate. Someone with reasonable options open to them will not, in any event, be desperate. The fact I want to draw attention to in this essay is simply this: desperation is widespread, it is growing, and it will erupt – how and where is anyone’s guess. This is not just a matter of statistics, either. This is the lived experience of people you know, the people you pass in the street, perhaps of you yourself. You see it on people’s faces every day.

I don’t believe all this bodes well for a smooth transition to a better state of society. Desperate people cause revolutions to happen, and while they may be interesting to read about they are generally not much fun to live through. Usually they result in extremely authoritarian and repressive regimes. There were a lot of desperate people around in the latter stages of the First World War, and again in the Great Depression, and the names of the winners of those struggles are a roll-call of tyrants. Those regimes themselves tend to be unstable.

But we should not forget that Hitler, for instance, garnered a lot of popular support by promising and delivering a better life for the average German worker (with some glaring exceptions). His contemporary Huey Long was on a similar path in the United States; had he not been assassinated before he could run in the 1936 presidential election, who knows how things might have turned out? Such politicians are nowadays dismissed as “populists,” a term which seems to mean “someone I dislike who has won (or is likely to win) an election.” That description was certainly applied by the Americans to the unfortunate Salvador Allende, and we’ve already seen what the sequel was.

When the cake is shrinking – as I believe it is today and for the foreseeable future – those who currently get the lion’s share have a choice to make. They can either take less for themselves or risk losing everything at the hands of desperate people. Roosevelt saw this, and took America down a different path by bringing in the New Deal over the loud protests of the monied. It was a closer call than we like to remember.

The question is: where are the Roosevelts of today?

Comments are welcome, but I pre-moderate them to make sure they comply with the house rules.

On resilience

When life hands you a lemon, make lemonade.


After last week’s discussion of efficiency, it’s time to talk about its complement: resilience, literally the quality of springing back, like a rubber ball. It’s a broader and vaguer concept, with none of the satisfyingly crisp arithmetic we can bring to bear on efficiency, but I would argue it is more important.

As I mentioned, efficiency and resilience are in some ways antagonistic. If you have a warehouse full of goods waiting to be sold, you’re paying the costs of that now before you get any income from sales. On the other hand, you are resilient to disruptions to supply. Sometimes choosing the full warehouse is obviously the way to go, as it was for the many UK businesses that stockpiled in the run-up to Brexit. Usually, however, businesses are under pressure to remain “lean.” Like many things in modern life, this strategy works well until it doesn’t.

On a more individual level, resilience might mean keeping money in a savings account, earning effectively no interest, rather than relying on borrowing to cover major expenditures. More generally, it would imply being ready to cope in the event that services on which you rely – healthcare, fuel, food in the shops – might become expensive or unobtainable. Many people have only one income stream, for example: their job, or their state pension or other benefits. They rely on that income for pretty much all of their needs, and yet it is in no way guaranteed. Job security is a distant memory these days (except for Cabinet ministers, apparently), and if you think you can rely on a state pension you clearly aren’t Greek.

This is the standard way of life in industrial civilisation, even for farmers. It’s typical for a farmer to focus on just a few crops, perhaps even one, and to feed themselves and their families with the money they make from selling them. The vast majority of farmers in history (or indeed today outside the industrialised world) would find the idea of farmers buying their food at the supermarket bizarre, but that is what they end up doing.

Our model of agriculture, in fact, is designed to be fragile. We depend, for one thing, on a small number of crops and animals, often growing the same varieties with little genetic variation. This increases our vulnerability to plant or animal diseases. It was a novel pathogen that led to the great Irish potato famine; since most of the population were heavily dependent on potatoes, the result was a catastrophe (admittedly not helped by the British government’s deliberate decision to continue exporting Irish wheat rather than feeding the Irish with it).

Resilient potato cultivation is quite a different beast. In his classic book The Unsettling of America (Counterpoint Press, 1977), Wendell Berry describes traditional potato farming in Peru. Here farmers have to cope with difficult conditions on the steep slopes of the Andes, with different parts of the same farm often being in quite different climate zones. They deal with this by using a very large number of different potato varieties, typically seventy or more, often planting many different varieties in the same small field, trying to match the characteristics of the plant to the specific microclimate in which they are grown. Their aim is not to maximise yield as such, but to ensure a consistent yield. They always want to have enough to eat: a surplus might be nice, but it isn’t the goal. They have been successful in this approach for many centuries, because it is resilient: enough of their crop will always do well in any given season.

Attempts have been made to try “conventional” potato farming methods in this environment. They have never succeeded for long. In a context where a failed harvest means going hungry, and two in a row would mean dying, only a fool would gamble everything on a single variety. And while potatoes are the staple crop, they grow others as well, such as oca, which is a tuber similar to the potato but unrelated, and thus not susceptible to potato blight.

This is inefficient in a number of ways: it’s hard work, you need to know a lot about different potato varieties, it isn’t easy to mechanise, and you will never get the biggest possible harvest. On the other hand, you won’t starve to death either. It’s hard to argue that the Peruvian farmers have made the wrong trade-off here.

By contrast, our entire way of life is almost as if designed to be as un-resilient as possible. Consider a typical day for an average person. They wake up in what they think of as their house or flat, although if it’s rented or mortgaged they depend on being able to pay someone else for the privilege of living there. They have breakfast from food they bought from a third party, using energy which also comes from sources they don’t control and which again they must pay for. They travel to work, which is likely to involve energy from somewhere else, perhaps in a vehicle on which they owe even more money and which they cannot repair or maintain themselves. The job they do is unlikely to give them skills they could use to make money on their own account, especially if they work in an office, and probably offers few intrinsic rewards other than money. Then they go home and spend the evening consuming entertainment provided by other people. And so to bed, to do it all again tomorrow.

It should come as no surprise that many people feel helpless, because in actual fact they are helpless. But the good news is that this can be remedied. After all, human beings may be born helpless but they don’t have to remain so.

There are measures we can all take to make our lives more resilient. Modest stockpiling of some essentials is not expensive and will buy you some time if the shops empty. I lived through the UK miners’ strike of 1972, when we had to cope with power-cuts (most of the UK’s electricity was generated using coal at that time). Luckily, my family had camping equipment, so we could provide cooking, lighting and some heating for ourselves during those times. Many people were not so fortunate. If anything, our society is even more dependent on reliable electricity today than it was then.

More broadly, learn some skills. You won’t be able to do everything for yourself, but if you have some useful skills you can offer to others then you won’t have to. I recommend skills over material things (such as gold coins) because things can be stolen. If you already have a solid trade such as plumbing or carpentry then you have a head start here, but you might want to reassure yourself you can still work without things like power tools or plastic widgets that might present problems down the road.

Broaden your social networks too. I’m not talking about Facebook “friends” here, but in-person relationships with people in your vicinity. They say the definition of a friend is someone who will help you dispose of a body without asking questions. I’m not suggesting that level of trust, but the more people who might have your back in a crisis the safer you are. After all, they also say that any society is only three missed meals away from anarchy, and that isn’t so hard to imagine if you live as I do in a country that imports almost half of its food.

Issues with transportation are something we may all have to face in an uncertain future. By this I mean both transporting ourselves and the transport of goods. We have spent decades de-localising our lives in the name of efficiency; that process needs to reverse, and the sooner we start the better. Find and support local businesses that provide the things you need. If necessary, start one. Someone’s going to have to, sooner or later.

And remember, you are not alone. There are a lot of us in the same boat, and more and more people are starting to notice the resemblance to the Titanic. Check out movements like Transition Towns or Strong Towns or similar organisations; they’re trying to facilitate progress in the right direction. You’re not going to be able to fix everything all at once, but the more you can manage without, the better off you’ll be when less is readily available. It doesn’t have to be the Apocalypse; a lot of people would be seriously stuck if they had no electricity for a week, say.

More than anything else, resilience is an attitude of mind. Call it adaptability, call it bloody-mindedness, call it what you like, it is an attitude that has served our species well for quite a few millennia and with any luck will do so for plenty more. Look at your life, see where the weak spots are, and consider what you can do to fix them. You might surprise yourself.


On efficiency

There is nothing so useless as doing efficiently that which should not be done at all.

Peter Drucker

Surely if there is one thing we can all agree on, it is that efficiency is a good thing. By efficiency I simply mean the ratio between what one puts in and what one gets out. Fuel efficiency in cars is a straightforward example: it is expressed as miles per gallon (or kilometres per litre), stating that for every unit of fuel you put into the car you can travel a certain distance. The further the distance, the more fuel-efficient the car. One comparison site I looked at told me, for example, that I could expect to get 67.5 mpg out of a Ford Fiesta, compared to 13.3 from a Lamborghini Murcielago.

This is not of course the only measure of a car’s performance. I expect if I looked at how long it takes to get from 0-60 mph, the Lamborghini would look a lot better than the Ford. But that is not a measure of efficiency.

Which type of efficiency we choose to measure has a large bearing on whether we consider one thing more efficient than another. For example, industrial agriculture is often considered more efficient than traditional methods, but it depends on what you look at. In terms of calories out per man-hour in, industrial ag looks terrific; but if you look instead at calories out per calorie in – that is, at energy efficiency – the picture is very different.

A subsistence farmer using his own muscle power and that of his draught animals must always obtain at least one calorie of food for each calorie of effort put in – any less than that and he will stop subsisting and start starving. According to one study (Steinhart, J.S. and Steinhart, C.E., 1974, “Energy history of the U.S. food system”, Science, vol. 184), such farmers normally manage to get between 11 and 61 calories out for each calorie they put in. By comparison, another study (Leach, G., 1976, Energy and Food Production) found that UK agriculture in general gets 0.35 calories out per calorie in. I’ve seen estimates for US agriculture of 13 calories in for each calorie out, which works out at around 0.08.

On the face of it, these figures are catastrophically inefficient. The only reason the industrial-farming nations have not all starved to death is that most of the calories going in are coming from fossil fuels. Very little of that energy input is from the person sat in the cab. This is fine until it isn’t: fossil fuels are depleting, and at some point will become prohibitively expensive. That point is not as far in the future as we would like.

Another factor which can distort our perception of efficiency is the tendency to measure it in terms of price. Other things being equal, the cheapest way of doing something should be the most efficient, but other things never are equal. Economists have a delightful euphemism for a cost that a business is not going to pay; they call it an “externality.” The diesel which keeps all those tractors running is produced at considerable environmental cost which is in no way factored into the price. This is still true even if we insist on defining that cost purely in financial terms; it’s going to be an expensive proposition to deal with sea-level rise, for instance.

Because prices are often distorted in this way, the magic of the free market is unable to work as the theory says it should. The price no longer reflects the true cost. It would be interesting to know what the price of a barrel of crude oil really ought to be, once one includes all the externalised costs. As far as I can tell that study hasn’t been done, but I suspect the result would be somewhat higher than the $60-odd the stuff is going for at the moment.

Our love of efficiency has brought us the magic of the Just-in-time supply chain, where the amount of stock sitting in warehouses is minimised as far as possible. Sadly, this can all too easily morph into its evil twin, the Not-quite-in-time supply chain. As I write this, an enormous container ship has got itself wedged across the Suez Canal. Now I am sure that ship looks very efficient in terms of volume of goods moved per dollar, but at the moment it is holding up hundreds of other vessels and goodness knows how many gazillion dollars’ worth of stuff. Nobody yet knows how long it will take to unblock the canal.

And this exposes the dark side of efficiency: its antagonism to resilience. Many of the ships queuing up to the south of the canal are oil tankers. Refineries typically keep about a week’s worth of crude on hand in case of interruptions to supply, but if any of those tankers end up having to go the long way around Africa, there will be more than a week’s gap. By the time you read this, no doubt we will have seen how that plays out, but issues like that are not uncommon. Recently, for instance, an earthquake in Japan shut down a semiconductor plant, interrupting production at a number of car factories.

Back in 2000, fuel protests in the UK blockaded refineries and disrupted petrol supplies – and deliveries to supermarkets, amongst other things. It emerged that only a few days’ worth of food was kept in stores. People know this; hence the incidence of panic buying in the early days of the 2020 pandemic. (This is one of those irregular verbs: I am stocking up on essentials, you are panic buying, he is a prepper.) As the saying has it, any society is three missed meals away from anarchy.

Resilience implies a certain level of inefficiency. That spare tyre is increasing the weight of your car as well as taking up space, but it also makes the car resilient in the event of a puncture. The most efficient way is not necessarily the best. As with most things in life, a balance needs to be struck. Efficiency, after all, is a one-dimensional measure. It can tell you which car has the highest mpg, but that may not be the only or even the most important factor in your purchasing decision.

Consider the work of Frederick Winslow Taylor, the first person to apply the rigorous empirical methods of science to the workplace. In his case, he was trying to maximise output per man-hour. What he really wanted to achieve, but couldn’t due to the limited technology of his day, was complete automation. Instead he tried to treat the human beings he was studying as if they were robots. He advocated rigid adherence to standards which were to be defined and enforced by management, making Adam Smith’s pin factory look like a workers’ paradise by comparison.

Now of course Taylor wasn’t optimising for happiness, and indeed it is hard to imagine how one would go about doing such a thing. (The Utilitarians gave it a try, but their definitions beg most of the interesting questions.) If we define efficiency as a ratio between quantities, then only quantifiable things can be considered. That does not however mean that they are the only important things, or even the most important.

Efficiency is a seductive notion. It looks so clear-cut. After all, you can’t argue with arithmetic. Nor is it a bad tool when employed correctly. Fuel economy is certainly something I would take into consideration when buying a car, and it’s useful to have a standard for comparison. But like everything else, dedication to efficiency can be taken too far.

As we have seen with the example of agriculture, something may be efficient by one measure and appallingly inefficient by another. It is important to be aware of which measure we are choosing, and why. Nor is “efficient” a straightforward synonym for “good.” Would the Nazi death camps have been better if they had been more efficient at killing people? There is always a bigger picture than a simple measurement of efficiency will give us.

The next time you see something touted as efficient, look for that bigger picture and ask the awkward questions. You might be surprised by the answers.


On discrimination

For nothing is more democratic than logic; it is no respecter of persons and makes no distinction between crooked and straight noses.

Friedrich Nietzsche, The Gay Science

Languages are never static, and words change their meaning; sometimes they come to mean the opposite of what they originally did. People now use literally to mean metaphorically, as in: “My head has literally exploded,” which presumably it hasn’t if I can say or write those words. Sometimes these shifts of meaning imply a change in moral import: what was to be approved of is now to be scorned. Rather depressingly, many of our words for “stupid” originally meant something rather nice. Discrimination is another word that has undergone this reversal.

Nowadays it is a well-known fact that discrimination is a Bad Thing. In this country, we even have laws against it, as do many other so-called developed nations. But this is an example of the sloppy use of language, because what these laws are trying to address is not discrimination but prejudice.

We all have prejudices. I myself have a quite irrational prejudice in favour of the Irish, despite the fact that back in the seventies there were Irish people trying to blow me up. It is impossible to legislate the existence of prejudice away. Prejudice is a shortcut; people are always going to take shortcuts. The way to deal with your prejudices is to be aware of them and to make allowances. This used to be part of a process called “growing up.”

It is good that there should be legal redress for people on the receiving end of negative prejudices. Equal pay for equal work, to take one example, strikes me as simple justice. What is not good is conflating prejudice with discrimination.

For what is discrimination? It is the power to distinguish one thing from another, especially when the two things are superficially very similar. Back when being considered a person of discrimination was a compliment, it meant something like having good taste. Someone placing a bet on a horse-race will attempt to discriminate between the runners. A punter of discrimination will, in theory, pick more winners.

This is a good thing, unless you happen to be a bookmaker. It is likewise a good thing that I shall not be representing my country in the 100m at the next Olympics. This is because I am old and fat and slow, and the selectors rightly discriminate in favour of those who are not. If they failed to do so, they would not be doing their job.

You don’t actually need to be able to do much discriminating in order to express prejudice. A misogynist only needs to be able to tell if the person they are dealing with is female. Yes, there are tricky edge cases, but most of the time it’s pretty straightforward. In any case, the mere perception that So-and-so is female is quite enough to trigger the prejudice in someone who has it.

You do find an approximation of discrimination when prejudice is embodied in legislation. Apartheid South Africa was a political regime that was driven largely by racial prejudice, but the rules to determine to which racial category an individual should be assigned could be said to define a mechanical sort of discrimination. The Mischling Test is another example, and so for that matter are a lot of “anti-discrimination” laws. But in reality genuine prejudice is quasi-instinctive: when you see the object of your prejudice, you normally recognise it without effort.

Does it really matter that we call this by the wrong name? I think it does, because one thing we could certainly do with more of these days is discrimination, and in order to call for it we need to be able to use the word correctly. We need, in fact, to discriminate between prejudice and discrimination.

The reason I think this is so important is that a person who lacks discrimination is easily misled. I do not think there has ever been a time in human history when so many people have been lied to so often and so thoroughly. Lying used to be a small-scale, one might almost say artisanal, business; today it is a heavy industry. We can only hope that Abraham Lincoln was right to say that you can’t fool all of the people all of the time, because there are interest groups prepared to give that the old college try.

Politicians have always lied, of course, because governments need the consent of (most of) the people and that consent sometimes needs to be manufactured. This is as old as history. The battle of Kadesh back in 1274 BC did not go anything like as well for Ramesses II as his inscriptions would have you believe. Advertisers are also cavalier with the truth – after all, if their product really was so wonderful they wouldn’t need to try so hard to persuade people to buy it. Now that political and corporate interests have largely captured mainstream journalism, there are few correctives available. One has to search them out.

This goes a long way to explain the increasing popularity of conspiracy theories. If you can’t believe the official narrative, then you will look elsewhere for an account of the world that you can believe. Conspiracy theories are a low-effort alternative, because they explain everything with one straightforward story, for example that it’s all the fault of the space lizards. There’s something comforting in the notion that everything is under control, even if the beings in control are evil monsters.

Unfortunately – or fortunately – everything is not under control, or at least not under one unified control. One has to discriminate between competing narratives. What is being brought to your attention? Who is doing that, and why? What other things are going on that they are keeping from your attention?

These questions don’t have simple answers. We need to be able to bring discrimination to bear on our news sources, whatever those may be (and if you only have one, I strongly recommend that you fix that). A useful technique is to compare and contrast news from a range of sources with known prejudices. Facts that they all agree on are likely to be true; the broader your range, the fewer facts are likely to fall into this category, but the more likely they are to be true.

Interpretation is important. The Reichstag fire is a case in point. Certainly the building caught fire. Almost certainly the man arrested for the crime was the person who set it. The Nazi government claimed he was preparing the way for a Communist coup, and used this as a pretext to tighten their grip on power. The Communists denied that it had anything to do with them and insisted that he had acted alone. It has been claimed that he was a government stooge. The current consensus is that he acted alone, but the construction that the government was able to place on his actions at the time had far-reaching results.

Notice also what is considered important. It is the tendency of all news media to obsess over fluff, because (they hope) fluff is entertaining and will bring in viewers. A thing is not news just because it appears in a newspaper. Notice also the lifecycle of news stories. Often there will be a topic which is picked up and discussed intensively for a while – the Syrian civil war, say – and then without explanation is simply dropped. There’s probably a reason for that. You may not be able to figure it out, but it’s worthwhile to consider what it might be.

The same goes for advertising. I’ve touched in an earlier post on some of the rhetorical techniques advertisers and others use to try and persuade you to buy their product. Much political discourse these days is basically advertising. A party or an individual politician is treated as a brand (some of them even have their own app – download at your own risk). Look for what you aren’t being told. Sometimes you’re being told essentially nothing; just the other day I saw billboards featuring a photo of some bloke in a suit, his name, and the words: “Your next mayor of London.” There isn’t even a statement of which party he represents, apart from the fact that he’s wearing a red tie (the colour of the UK Labour Party).

These things are designed to seep into your mind without your noticing – indiscriminately. Don’t let that happen, unless you actively want to be controlled. These people are not your friends. In a world of snake-oil salesmen, discrimination is your best ally. Cultivate it and use it.

Comments are welcome, but I pre-moderate them to make sure they comply with the house rules.

On education

When Scythrop grew up, he was sent, as usual, to a public school, where a little learning was painfully beaten into him, and from thence to the university, where it was carefully taken out of him.

Thomas Love Peacock, Nightmare Abbey

All human societies produce children in much the same way; education, in the broadest sense of the term, is how those children are turned into functioning, well-integrated adults, and this process is far more variable. I am going to focus here on the way we do things in our society, particularly education in the UK, as that is what I know best, but I want to remind you that formal schooling is by no means the only way to educate children, and that even among us a lot of education happens outside the schoolroom.

It appears completely normal to us, for example, that children should be taken from their families to a special place – a school – where they can be taught by trained professionals. Some schools even provide accommodation for the children, so that they can have a completely immersive experience. Since this is routinely offered by the most expensive private schools, presumably this is considered a benefit. We shall return to the question of who the beneficiary might be.

Schools play many roles in modern industrial society. One function that became conspicuous by its absence when they were obliged to close during the pandemic is that of day-care for children. The days are long gone when a single wage-earner was enough to keep the average family afloat; both partners must work, and therefore something needs to be done with the children. Given the high cost of childcare, schools fill a clear economic need for many families.

By extension, schools have tended to become providers of social services to children. In the UK, children from poor families are given free meals at school, and a reluctant government has been pressured into continuing this during school closures after a campaign led by the footballer Marcus Rashford. Increasingly, UK schools are referring children to social services; this is logical, since it is in schools that children are mainly available for official supervision and inspection.

This is all well and good, but tangential to schools’ declared purpose of educating the children in their care. But what are the aims of this education, and (crucial in a manager-led culture) how can its success or failure be measured? These questions have been with us as long as formal education, and have become increasingly urgent since compulsory schooling was introduced in the nineteenth century.

Broadly speaking there are two approaches to schooling, that of Plato and that of Mr Gradgrind. The Platonists hark back to the origin of the word, the Latin educere, which literally means “to draw out”. As in the famous passage in Plato’s dialogue Meno (82b–85b), they see the teacher’s role as that of a sympathetic guide, drawing out the child’s innate abilities, and school as a facility to enable this by providing books, experimental apparatus and so forth, and a structured environment. In this picture, education is a pleasant experience for all involved, pursued for its own sake. John Henry Newman’s The Idea of a University is a classic exposition of this ideal; Montessori schools embody it to this day.

The Gradgrind tendency, by contrast, has a more starkly industrial approach. They are manufacturing a product for the marketplace; nothing more, nothing less. The aim of education is to drill the child into a knowledge of useful facts (and, less obviously, a set of useful habits). A successfully educated child can regurgitate these facts on command. This can readily be measured and quantified. Whether or not this is pleasant for either pupil or teacher is only material to the extent that sugaring the pill can make the medicine go down more easily. The point is to get it down the child one way or another.

I think it is probably fair to say that most teachers would prefer to be Plato, and most Secretaries of State for Education would prefer them to be Mr Gradgrind. This tension runs through our entire education system.

Why is education compulsory in the UK? Originally it was justified on the grounds that it prevented parents from putting their children to work, and indeed it is fair to say that a child would have been better off even in Mr Gradgrind’s classroom than up a chimney. That is less of an issue nowadays, however. Wouldn’t most parents want their children to go to school, if it were the Plato model? Indeed, wouldn’t most children want to go there?

Something that emerges clearly from the contrast between the two approaches is that in the first it is the child who is the primary beneficiary, and in the second it is – someone else.

All societies have rules, and well-socialised adults follow them, at least most of the time. An important function of education is to impart knowledge of those rules to the next generation. This is a benefit both to the child, who will know how to fit in, and to the society as a whole. In industrial society, those rules are designed to prepare the child for its future as an employee, as I’ve argued in a previous post.

What the child is not prepared for, however, is its future as a citizen. By this I mean someone who understands their part in (what is supposed to be) a democratic society, and is able to fulfil it. This requires at least three things:

  • The ability to think critically about the utterances of politicians, and in general about the endless blizzard of messages intended to persuade us to buy X or believe Y.
  • Knowing how to do research into the facts of the case, so that one can accept or reject such statements on a sounder basis than mere prejudice. One could characterise this more generally as knowing how to learn.
  • Being able to participate in a discussion in which the aim is to establish the truth rather than to score points; perhaps even being open to having one’s mind changed.

I am not arguing that all children should be made to read classical philosophers in the original, although it would be no bad thing if more of them did. I merely wish them to be given the tools with which to engage in the political process. Perhaps more people would do so if they had them; it might even improve voter turnout.

Now one does not have to be a cynic to imagine that there are some influential people who would find this scenario uncomfortable. Tragic though this may be, I still suggest it would be of benefit to society as a whole. Incidentally, those skills will be useful to anyone who needs to adapt to a new situation and find new solutions. Today’s children are certainly going to find themselves in that category.

Through no fault of their own, however, the bulk of the adult population, certainly in the UK and I suspect in most industrial countries, has not been provided with these tools. Even the graduates of our finest universities are deficient in them, if the example of our current Prime Minister is anything to go by. Most people, I think, have enough sense to be aware that they are being sold a pup; trust in the mainstream media seems to be declining, and I suspect that trust in online newsfeeds will follow the same trend, if it hasn’t already. The question is, how are people to fill the resulting vacuum?

The collapse of political discourse is evident – compare the Lincoln-Douglas debates with anything said during the last few US presidential elections – and unsurprising. In the words of former US President and all-round sage George W Bush: “You can fool some of the people all the time, and those are the ones you want to concentrate on.” Education should be aiming to minimise the number of such people. It is failing to do so.

Ironically, the UK education system is also failing in Mr Gradgrind’s terms, according to the very employers who are its real customers. Institutional reform appears unlikely, although I expect plenty of tinkering. Homeschooling was increasing in popularity even before the pandemic, but it isn’t for everyone; as I pointed out above, few families can afford the time.

But education is more than schooling. Personally, I learned at least as much out of school as in it. I was lucky enough to be brought up by parents who believed in the value of education; they came from that tradition in the working class that gave rise to things like the Workers’ Educational Association. When I came to read Plato, I was reminded of the discussions that used to take place at home. Even though we weren’t well-off, there were always books.

It was the formal educational system that gave me the bits of paper which have made me (more or less) acceptable to employers over the years. That’s not, however, the same as an education. I know which has the more lasting value. Nor did my education cease when I graduated. Yours needn’t either – but then you’re reading this blog, so you probably already knew that.


On time, its uses and abuses

Time is an illusion. Lunchtime doubly so.

Douglas Adams, The Hitchhiker’s Guide to the Galaxy

Only the Maya can claim to have been half as obsessed with time as our industrial civilisation is, and even they only went as far as devising a calendar so accurate it is said to be used by NASA. (This may, however, be one of those “well-known facts” that isn’t; I was unable to find any confirmation of it on NASA’s website.) We have clocks and timers everywhere. If you don’t believe me, try counting the various timepieces to be found around your home, including those on or about your person.

Timekeeping devices of one sort or another have a long history, but the mechanical clock dates back to the fourteenth century and was largely the result of too many monks having too much time on their hands. Monks were interested in observing the canonical hours, which were (and are) the timetable of prayer which is at the centre of monastic life. These are really no more than conventions, standardised in such codes as the Rule of St Benedict, with no particular reason to think that God was too concerned about the exact timing. Because in pre-industrial times clocks were expensive to make, they became status symbols as much as anything.

One development is however significant. Prior to the advent of the mechanical clock, not all hours were of the same length. In non-tropical latitudes, this makes complete sense. The Romans, for example, divided the day into twelve equal parts and the night likewise, but how long each twelfth lasted depended on the season. A daylight hour would be appreciably longer in summer than in winter.
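The arithmetic of these seasonal hours is simple enough to sketch: divide the daylight between sunrise and sunset into twelve equal parts, and the length of a single "hour" follows. The sunrise and sunset times below are invented round numbers, not real observations.

```python
# Seasonal ("unequal") hours, Roman style: one daylight hour is one twelfth
# of the time between sunrise and sunset, so its length varies by season.
# The times used here are made-up examples.

def roman_hour_minutes(sunrise_h, sunset_h):
    """Length in minutes of one daylight 'hour' (a twelfth of daylight)."""
    daylight_minutes = (sunset_h - sunrise_h) * 60
    return daylight_minutes / 12

summer = roman_hour_minutes(5, 21)  # 16 hours of daylight
winter = roman_hour_minutes(8, 16)  # 8 hours of daylight
print(summer, winter)  # 80.0 and 40.0: a summer hour is twice a winter one
```

A mechanism with a constant rate cannot reproduce this directly, which is exactly why the clock-makers found equal hours so much more convenient.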

If you are making a machine to measure time, however, your life will be much simpler if all the hours are the same length. Simpler for the clock-maker, perhaps, but not so much for its users, for whom the motion of the sun in the sky tends to diverge from the time on the dial. The resulting inconvenience eventually led to the unlovely kludge that is daylight saving time; some idea of the technical difficulties in which this has involved us can be gleaned from the late Erik Naggum’s fine essay “The Long, Painful History of Time”.

This trend towards the standardisation of time was, like so many trends, accelerated by the Industrial Revolution. If one takes noon to be the point during the day when the sun is at its height, this time will be found to be slightly different in London than it is in, say, Manchester. This fact was never an issue until relatively high-speed transport became available, in the form of the railway. You could set your watch by the station clock when you set out, and it would disagree with the clock at the other end. The need to standardise all these different local times led to the introduction of “railway time” as far back as 1840.

Another fruit of the Industrial Revolution was the introduction of the time-clock. This invention is attributed to an American with the splendid name of Willard Legrand Bundy and was the logical outcome of employers’ need to monitor and control their workers. Modern companies such as Amazon have taken this to quite extreme levels, because they can. Technically they can do it, because the capability now exists, but mostly they can do it because we collectively let them. After all they create jobs, and jobs are good, right?

I’ll be discussing education in a future post, but suffice it to say one of the main things our education is designed to do is to foster an awareness of, if not a fetishistic worship of, clock time. This is because the industrial system demands that its workers be good time-keepers, to the point where bad time-keeping is considered a legitimate reason for dismissal. There are of course good practical reasons for this, as I have pointed out elsewhere, but I think there’s more to it.

People outside of industrial culture are not in general that bothered about time. They don’t need to be. This is not because indigenous people are intrinsically lackadaisical, despite the frequent complaints from their colonisers. Even the Romans were comparatively relaxed about it, and they were nothing if not businesslike. It may not be a coincidence that we, who are inside industrial culture, saturate ourselves so thoroughly in time; we wallow in clocks, we adore them, to the point that even tenths or hundredths of a second can loom like epochs in our lives.

Why should this be? It seems to me that our worship of clock-time has much in common with our worship of money. Both are abstractions, fictions even, to which we attribute quasi-mystical properties – that they are (in principle, at any rate) infinite, eternal, and the same for all observers – even though they don’t in fact possess them. We sometimes go so far as to equate them, even though this is manifestly absurd.

Benjamin Franklin’s oft-repeated adage that time is money has become a truism without actually being true. (Try buying a hamburger with twenty-five minutes.) What it does manage to do, with admirable succinctness, is to sum up a certain cast of mind which is favoured by our present economic arrangements. For the one situation in which we have to pretend that time and money are interchangeable is paid employment. Payment – what the Americans with refreshing frankness call compensation – is always expressed in terms of money for time. So much an hour, or a week, or a year. We rent ourselves out, and this is considered completely normal.

But empty featureless Newtonian time, rolling smoothly onwards and the same for everyone everywhere, is no longer considered the best model even by physicists. Those of us who are mortal – and that includes the two of us, dear reader – must reckon with a finite extent of the stuff. We may not know how much we have, but it certainly isn’t eternity. There are no overdrafts available.

One fundamental way in which money differs from time is that one can both give and receive money – indeed, that’s pretty much the whole point of it – but one can only give time. Once you have lived a minute of your life, you will never have it again. Time cannot be refunded. Exchanging it for money is therefore not a decision to be taken lightly.

That’s pretty much the deal that industrial society offers us, though. Now it will be objected that everyone has to invest some of their time in obtaining the necessaries of life (well, apart from trustafarians). But there is more flexibility than most people suppose in both the quality and the quantity of that investment. As the old joke runs, very few people on their death-beds wish they had spent more time at the office.

In his provocative essay “On the Phenomenon of Bullshit Jobs,” which subsequently became the basis for a book, the late David Graeber argues that for many people the time they exchange for the necessaries of life is time completely wasted. “Huge swathes of people, in Europe and North America in particular, spend their entire working lives performing tasks they secretly believe do not really need to be performed. The moral and spiritual damage that comes from this situation is profound. It is a scar across our collective soul. Yet virtually no one talks about it.”

The bald equation of time with money conceals all this, of course. Nor is this an accident. To my way of thinking, it makes no real sense for the employee to think like this; but it makes much more sense from the point of view of the employer. All they are investing, after all, is money, and money which they expect to recoup. From their perspective, it’s the same as buying any other commodity. You won’t find many employers who put it as starkly as that, because even those who are sociopathic enough to find it acceptable are also usually smart enough to realise that it isn’t a great look for recruitment.

A clock will tell you a number, really, and nothing more. Time is a far richer notion than that. There are the seasons of the year, and the seasons of our own lives too. There are geological cycles that are unimaginably long to us, and the generations of microbes that are startlingly short. A year means one thing to an oak tree and something else to a mayfly. Ours is not the only perspective.

Time is not a number, still less a unit of currency. No single instant of it is quite like another. It may not seem as if every moment is precious when you’re having root canal work done, or sitting through another pointless meeting, but once it’s gone it’s gone. You probably spent around seven and a half minutes reading this far; I hope you found it worthwhile.


On water

Water belongs to us all. Nature did not make the sun one person’s property, nor air, nor water, cool and clear.

Ovid, Metamorphoses, Book IV (tr. Michael Simpson)

[For World Water Day, 22nd March 2021]

Without water, there would be no life. It’s true that certain creatures – the rotifers spring to mind – can survive in a desiccated state, but they need to rehydrate in order to do anything that looks like living. Life appears to have started in the oceans, at least on this planet. This is why exobiologists get so excited when they find evidence for water elsewhere: it’s not a sufficient condition for life, but as far as we know it is a necessary one.

Water covers two-thirds of our world, and with climate-driven sea-level rise it will soon be covering even more. Although that water is salt and not of much direct use to us, the hydrological cycle is Nature’s own desalination plant, delivering fresh water in the form of rain. It’s hard to imagine that something which literally falls out of the sky could be a scarce resource, at least in those parts of the world which receive regular rainfall. Human ingenuity, however, has found a way.

Consider Egypt. It used to be called the gift of the Nile, because it has no rainfall and used to depend entirely on the annual Nile flood for irrigation. Because the flood also spread fertilising alluvium on the land there was no build-up of soil salinity, which was the bane of Mesopotamian agriculture. A lot of Egypt was and is desert, but the fertile strip along the river – the black land, as it was called – was productive enough to feed the country and also grow a surplus for export. Ancient Rome was largely fed by grain from Egypt.

Then Progress came to the land of the Pharaohs, in the form of the Aswan High Dam. Part of the idea of this immense engineering project was to regulate the Nile floods, which could vary from year to year. This sounds like a good idea, but in practice it has tended to increase soil salinity and coastal erosion. The alluvium which used to wash down over the fields is now building up in the reservoir behind the dam, reducing its capacity; water is also lost from the reservoir by evaporation.

Egyptian farmers are now resorting to the Nubian Sandstone Aquifer to irrigate their thirsty crops of cotton and potatoes. Unlike the Nile, this is not a renewable resource but fossil water that was deposited underground in ages past. Like other fossil resources, when it’s gone it’s gone.

It isn’t just Egypt that relies on fossil water. About 30% of the water used for irrigation in the United States is drawn from the Ogallala Aquifer. There are already concerns that it is depleting rapidly, certainly at a rate far in excess of the amount that rainfall can make good. Farmers across the Midwest are obliged to sink their wells ever deeper. At the same time, water quality is also under threat from pollution due to fracking as well as from agricultural sources such as fertiliser runoff. If you ever saw the 2010 documentary Gasland, you will doubtless remember seeing people who live near fracking wells setting fire to the fluid that is supposed to be their drinking water.

In California’s Central Valley, farmers have been so successful at extracting groundwater that they have caused subsidence on a massive scale. Apart from the damage done to infrastructure such as bridges, this has permanently reduced the capacity of the aquifers to hold water in the future. Again, the response to this has been to drill more and deeper. What could possibly go wrong?

I don’t want to make it sound as if farmers are the villains here. It’s true that industrial farming practices are often wasteful of water; see Gabe Brown’s Dirt to Soil (Chelsea Green, 2018) for an account – from a farmer! – of some of the reasons why this is and what can be done about it. Industrial meat production is a particularly egregious example of turning what ought to be a resource – dung – into a pollutant. But farmers have been pushed in an unsustainable direction by forces outside their control.

(Incidentally, in the interests of balance I should point out that livestock farming is far less prodigal of water than is often supposed, at least when done properly. In his review of Simon Fairlie’s book Meat: A Benign Extravagance (Hyden House, 2013), the environmental journalist George Monbiot admits his mistake on this point, in a heartwarming display of intellectual honesty which is all too rare in his profession.)

It’s true also that excessive irrigation and the resulting build-up of salt in the soil transformed the fertile lands of Mesopotamia into the desert we now call Iraq. “Forests precede us and deserts dog our heels,” in Derrick Jensen’s grim aphorism. But nobody ever set out to do this. It was an unintended consequence of actions which seemed reasonable at the time in the face of an immediate need. This pattern is by no means rare in human history.

Industry itself has long been cavalier with water, especially in its willingness to contaminate local waterways. The Love Canal is the poster child for this – again, as with the fracking pollution I mentioned above, we find a substance that purports to be water proving to be flammable – but of course it goes back far longer. The town of Walsall, long associated with the leather industry, ended up hiding its river in a culvert because of the noxious level of pollution (tanning produces lots of nasties). The central point of the town is still known as the Bridge, but if you go there today you will see no sign of either bridge or river.

Fresh clean water has thus become a scarce resource, and as such a fertile source of conflict. There are many places in the world dependent on rivers whose headwaters are in foreign territory, and conflicts over water are the inevitable result. By constructing dams it is possible to extract so much water from a river that it no longer reaches the sea – the once-mighty Colorado River is a prime example. In the Middle East, which was already well-supplied with tinder, similar threats to the Tigris, Euphrates and Jordan rivers may well spark violence. Even the Aswan High Dam may find itself trumped by the construction upstream of the Grand Ethiopian Renaissance Dam.

Now let’s throw climate change into the mix. With extreme weather events of all kinds becoming more common and more violent everywhere, flooding is a major issue. (In the UK at least, this is not helped by the moronic practice of building in flood plains against all advice.) Industrially farmed soil tends to be slow to absorb rainfall and prone to erosion – again, I refer the interested reader to the work of Gabe Brown – which has obvious consequences for agriculture.

On the flip side of the coin, we need to be more resilient to drought. In addition to the obvious point that water shortages are bad news for anything that grows, we can also expect more and scarier wildfires and (once again) more topsoil erosion. The Dust Bowl is a dramatic example of what can happen, and will again if we carry on as we are.

So: what is to be done?

Many of our water-related problems are already baked into the cake. Contaminated groundwater is going to stay contaminated. Depleted fossil aquifers will take millennia to replenish themselves. We can try to adopt farming practices aimed at improving the soil, along the lines being developed and practised by Brown and others. This is good common sense anyway: even Michael Gove, a man who often seems to have only a tenuous grasp on reality, said some sensible things on this subject when he was briefly Environment Secretary in the UK.

As individuals, we can do a good deal to reduce our water usage. Reusing grey water will economise further. We can also harvest rainwater (now legal in all 50 US states, albeit subject to regulation in places) which is perfectly fine for many things. If you have a garden, get the soil in the best condition possible – good gardening practice in any case, but soil in good heart can both absorb rainwater when the heavens open and retain it through dry weather. If where you are is liable to flooding, ensure your drainage is good.

We can also make a difference through our purchasing decisions. After all, we’re always being told that the consumer is king, so maybe we should start acting that way. Do your research; be aware of “virtual water” and cut it out of your purchases as far as you can. Here in the UK, for instance, we import a lot of water in the form of lettuce and tomatoes from semi-arid parts of southern Spain and, as mentioned before, cotton and potatoes from Egypt. Personally, I’m not especially comfortable doing that. Your mileage may, of course, vary.

Water does, as Ovid says, belong to us all, and as such it has become subject to the tragedy of the commons. (Nestlé’s behaviour during the recent California drought shows what can happen.) The good news is that this, at least, is a problem that can be solved (see, for example, the work of Elinor Ostrom in this area) and indeed has been solved in many times and places. In the Alpujarras region of Spain the system of acequias introduced by the Moors in the eighth century, which includes an elaborate apparatus for resolving disputes, is still in operation.

Individually and collectively, we must all learn to share the gifts we have been given. As Gandhi said: “Earth provides enough to satisfy every man’s needs, but not every man’s greed.” Water is a thing we all need and for which there is no substitute. It’s about time we started to treat it, and one another, with some respect.


On quantification

Counting is the religion of this generation. It is its hope and its salvation.

Gertrude Stein, Everybody’s Autobiography

Industrial civilisation is in love with numbers. We have numbers on pretty much everything that can be quantified, and some things that arguably can’t. When someone wishes to assert incontrovertibly that X is the case, the magic words are: “Studies show…”. And how do the studies purport to show that X is true? Statistically, that’s how.

Like so many things that appear to us to be immutable truths carved into the primordial ground of cosmic being, this tendency has a history and is the result of human choices. We first find the appeal to statistics in the eighteenth century, originally in connection with matters relating to governing a state (hence the name) such as population and economic activity. The following century saw the formation of the Royal Statistical Society in London, and the use of statistics by political and other campaigners to advance their cause; one famous example is Florence Nightingale, the Society’s first female member, who popularised the polar area diagram – a close cousin of the pie-chart – as a means of visualising data and thereby presenting it in an accessible form.

It is not, I suspect, a coincidence that this parallels the development of the Industrial Revolution, whose consequences I discussed in a previous post. It is a natural part of the standardising tendency which gave us the Whitworth screw thread, SATS and the Big Mac. We want the world to be made up of things which are all the same.

This is obviously a prerequisite for counting things. If you have a pencil, three bananas and a bicycle, you don’t really have five of anything. If, on the other hand, you can say that one lump of pig-iron is much the same as another, then knowing you have five of those is a useful thing. Of course, it’s not actually true to say that all lumps of pig-iron are strictly equivalent – the carbon content, for instance, will vary from one to another – but you may not need to worry about that if all you want to know is how many tons your foundry produced this week.

Then again, if you care about the quality of your pig-iron, knowing that you have five lumps of it is not especially useful. So the meaning of a statistic depends on what is being counted and why. Who is doing the counting may also be significant; human motivation is rarely pure.

Consider official unemployment figures. No government would like these to be higher than they need to be, and it is much easier and quicker to fix the numbers than it is to fix the economy. Thus the definition of unemployment becomes remarkably fluid, in order to leave out as many people as possible. In the UK, for example, you need to be unemployed and claiming a government benefit. Unsurprisingly this is coupled with a benefits system of Byzantine complexity which has all the hallmarks of having been designed to deter applicants.

But in any case, is counting people really the same sort of thing as counting pig-iron? You may (or may not) be familiar with King David’s census of the Israelites (2 Samuel 24, if you want to look it up). The king soon came to regret it: “And David’s heart smote him after that he had numbered the people. And David said unto the LORD, I have sinned greatly in that I have done: and now, I beseech thee, O LORD, take away the iniquity of thy servant; for I have done very foolishly.”

The biblical text doesn’t make clear why this was such a bad idea, apart from the punishments that follow, but there’s a case to be made that this is a reaction against the kind of state bureaucracy that flourished in both Mesopotamia and Egypt in this period. Here I am following the account of state-formation in James C. Scott’s Against the Grain (Yale University Press, 2017), and while I’m not going to attempt a summary here, the relevant point is that ancient bean-counters started off counting measures of barley and ended by counting pretty much everything.

Now there will be some variability even between measures of barley, but it seems intuitively clear that human beings are individuals – indeed, we even use the word individual to refer to a person. Moreover, they have relationships with one another. What does it mean to treat another human being as a countable unit, like a measure of barley or a lump of pig-iron? Surely it is not the way most of us would want to be thought of. It is a denial of one’s basic humanity.

But when one is dealing with large numbers of people – more, let’s say, than Dunbar’s number – it is inevitable that one has to resort to this approach. It’s the only practical way of keeping on top of things, and early states were all about keeping on top of things. This appears to have been why writing was invented, and why so much of the vast corpus of cuneiform texts is so dull.

Nowadays, of course, we have Big Data. This is largely a result of technological advances; scribes inscribing clay tablets can only record a limited amount. Thanks to the miracles of Progress, we are now able to collect, store, and analyse stupendous amounts of data, most of it about us. And because we can, we do. (The scribes of the Third Dynasty of Ur or of Qin China would most certainly have done so, given the opportunity.)

In this context, “data” just means “counts of stuff,” where many of the things being counted are events – typically things that a person has done: they spent x seconds viewing such and such a web-page, they bought this thing, they liked that post. This has a market value, because companies can use that information to sell you stuff. Governments can also use that information to identify people they don’t like; the Chinese government already does this, and I’d be very surprised if they were the only one.

However much we may claim to dislike this state of affairs, we still put up with it. It does however give us some idea of why the ancient Israelites of 2 Samuel found the whole notion of counting people as if they were things so viscerally repugnant. And it is also dangerous, because data can be inaccurate and/or misinterpreted.

This can be the result of error, or it can be deliberate. Science itself is not immune to this: famous (or notorious) examples include Sir Cyril Burt, who fabricated data wholesale to support his ideas about the heritability of IQ, or more recently Dr Werner Bezwoda and his fraudulent cancer research. There may well be many more whose nefarious practices have not been discovered; there is a lot of research which has proved difficult or impossible to replicate. Scientists are themselves human beings, a point which we seem to find difficult to admit.

You can also build a mathematical model to interpret that data, which looks impressive and works beautifully until it doesn’t. This is what happened to Long-Term Capital Management, which went south in 1998 despite the presence on its board of Nobel Prize-winning economists, requiring a bail-out to the tune of $3.625 billion. They thought they understood their model. With modern statistically-based AI, of course, nobody understands how the algorithms work. Because they seem to function most of the time – as the LTCM model did until 1998 – it’s just assumed that we can rely on the results. You may not find that thought comforting. I certainly don’t.

When Hillary Clinton ran for the US Presidency in 2016, her campaign made heavy use of data analysis, all of which suggested that she would win. We know how that ended up. I was reminded at the time of the statistical approach taken by Robert McNamara in managing the Vietnam War, relying on the body count as his measure of success. That didn’t go too well either.

But this is more than a practical question. It has profound implications for how we deal with one another and with the world in general. Is the world, to borrow a phrase from Coleridge, “an immense heap of little things” to be reckoned and classified and managed, to be bought and sold and monetised? Are there not important truths which cannot fit into a spreadsheet? I suspect most people would agree that there are, but that’s not the way we do things. There is the quantified and the unquantified, and most of the time the quantified takes precedence.

Iain McGilchrist’s magisterial study The Master and His Emissary (Yale University Press, 2010) examines this division in the light of research into the workings of the human brain. It’s a fascinating, well-researched and thoughtful book, but ultimately rather depressing. There seems to be every likelihood that we will continue to be blind-sided by our obsession with numbers, just as LTCM was, failing on an epic scale to see the wood for the trees. Nobody felling a tree in the Amazon to clear land for soya-bean cultivation is doing so because they want to damage the planet. They end up doing so just the same.

The philosopher Martin Buber arrived at much the same insight from a different direction. He distinguished I-Thou relations, in which the other is treated as a person, from I-It relations, in which the other is a mere thing. When we count, we necessarily treat what is counted as things rather than persons. This may work well enough for pig-iron, but it doesn’t work for people – nor, I would argue, for other living creatures. Twenty thousand chickens in a broiler house can only be thought of statistically; a flock of half a dozen are individuals. If this reminds you of the famous remark attributed to Stalin – “A single death is a tragedy; a million deaths is a statistic” – that is not a coincidence.

Genocide can only happen when the victims are treated as things, not people. Nothing about the Holocaust is more striking than the way in which it was managed as an industrial process. Likewise, ecocide can only happen when living beings are treated as things, either as raw materials to be processed or as obstacles to be removed. Those long-dead Sumerian scribes weighing and recording barley never intended any of this, I’m sure, but that’s where we are.

Comments are welcome, but I pre-moderate them to make sure they comply with the house rules.

On jobs

In an advanced industrial society it becomes almost impossible to seek, even to imagine, unemployment as a condition for autonomous, useful work. The infrastructure of society is arranged so that only the job gives access to the tools of production.

Ivan Illich

Once upon a time, nobody in the world had a job. Of course, that’s not to deny that people expended energy on the tasks required for survival; manifestly they did, or you and I would not be here. My point is that there was originally no distinction between work and leisure. People did what was necessary, which would vary across the seasons. Work, if we want to call it that, was done where people already were; people would move around their territory depending on the availability of food and water, but there was nothing we would recognise as a commute.

I start my account of jobs here because we tend to focus on what replaced it, namely the division of labour. Adam Smith himself did it: An Inquiry into the Nature and Causes of the Wealth of Nations, Book I, Chapter 1, is entitled “Of the division of labour,” and we go almost immediately into his famous account of the pin-factory. It is easy to forget how radically different working in a pin-factory is from the way people have got their living for the bulk of the time humans have been on earth.

It has to be admitted that Smith does not exactly sell the pin-factory as a great place to work. “One man draws out the wire, another straights it, a third cuts it, a fourth points it, a fifth grinds it at the top for receiving the head; to make the head requires two or three distinct operations; to put it on is a peculiar business, to whiten the pins is another; it is even a trade by itself to put them into the paper; and the important business of making a pin is, in this manner, divided into about eighteen distinct operations, which, in some manufactories, are performed by distinct hands, though in others the same man will sometimes perform two or three of them.” The work is repetitive and monotonous, and no single person involved in it will even get the satisfaction of having made an entire pin.

Smith is more interested in the large quantity of pins that can be made in this way, but here I want to consider the quality of life that it implies. He goes on to contrast the pin technician with his less specialised brethren: “A country weaver, who cultivates a small farm, must lose a good deal of time passing from his loom to the field, and from the field to his loom. … The habit of sauntering and of indolent careless application, which is naturally, or even necessarily acquired by every country workman who is obliged to change his work and his tools every half hour, and to apply his hand in twenty different ways almost every day of his life, renders him almost always slothful and lazy, and incapable of any vigorous application even on the most pressing occasions.”

We are reminded here that Smith was not an economist – there was no such thing in 1776 – but a Professor of Moral Philosophy. His problem with the country workman is not really that he is inefficient. Clearly the things that needed to be done were still getting done. No, he is more worried about a lack of moral fibre. The Devil finds work for idle hands, and all this sauntering and indolence can lead to no good. There is no opportunity to saunter when all you do, all day and every day, is whiten pins or put them into paper.

His choice of a country weaver is suggestive. These were exactly the people who were shafted by the Industrial Revolution, as E. P. Thompson showed in The Making of the English Working Class (Pelican, 2013). They went from making a comfortable living and working the hours they chose to sixteen-hour days in the pitiless roar of a cotton-mill for starvation wages. In his classic essay “The Original Affluent Society,” the anthropologist Marshall Sahlins estimated that hunter-gatherers typically spend 3–5 hours per day obtaining food. (The essay is collected in his book Stone Age Economics, first published in 1972.) The rest of their time, presumably, is spent sauntering.

These represent extremes, of course. As Sahlins points out, the reason hunter-gatherers can work so little is that their material needs are kept to a minimum. (If you have to carry all of your belongings around with you, then you are naturally incentivised to do this.) Few of us would be prepared to accept a material standard of living at this level. But you will have noticed the word material in the previous sentence.

To obtain material goods, most of us need a job. The job is the nipple connecting us to the milk of industrially-produced goodies on which we rely for survival. Without it, we are ill-equipped to fend for ourselves. In exchange, we accept a hefty set of constraints (and our education system is designed to prepare the way for this acceptance):

  • Timekeeping. You need to show up at the agreed time, and keep working until the agreed time. We are so used to this that we forget how unnatural it is. Smith’s account of the pin-factory makes it clear that no pins can be made unless everyone involved is present and correct; the system doesn’t work if people only turn up as and when they feel like it.
  • Obedience. The guy whose job is to draw out the wire has to draw out the wire, whether he fancies doing so or not. I’ve never had to do this for a living myself, but I should imagine it gets old pretty quickly.
  • Measurement. Smith goes into raptures about the number of pins produced per day (he estimates 4,800 per person) and of course this is an invitation to quantitative assessment of your performance, just like all those tests and exams you did at school. There isn’t so much scope for Taylorism amongst the hunter-gatherers, on the other hand.

Time and again, colonial administrators have bemoaned how terrible indigenous people are when put into factories and expected to comply with this stuff. The same thing happened in England in the early days of the Industrial Revolution. Unless people are trained up to it from an early age, they are unlikely to find these compromises appealing. It is no coincidence that compulsory education came in during the nineteenth century.

Because most of us are so dependent on having a job, unemployment becomes a problem. Governments like to talk about job creation as if this were self-evidently a good thing, regardless of the nature of the jobs themselves. (Technically, the construction of Auschwitz-Birkenau created jobs, after all.) Here in the UK we have the economic miracle of the zero-hours contract, by which one can have a job without any of the benefits. I expect this exciting innovation to sweep through the industrialised world in short order.

The solution to this trap is clear, but not necessarily easy. It is a question of re-education, and not merely in the myriad skills we need to support ourselves outside the world of conventional employment – although there are many useful courses and other resources out there, acquiring those skills is in itself the task of a lifetime. We also need to reappraise some of our most deeply held beliefs about, for example, time. I’ll be dedicating a future post to the subject of time, but let’s just say there are worse sins than turning up half an hour late.

After all, how many pins does the world really need?

On human exceptionalism

Modern man does not experience himself as a part of nature but as an outside force destined to dominate and conquer it. He even talks of a battle with nature, forgetting that, if he won the battle, he would find himself on the losing side.

E F Schumacher, Small Is Beautiful: Economics as if People Mattered

It will not have escaped regular readers of this blog that I often refer to myself and my fellow-humans as social primates. This is not, of course, a complete description of what a human being is, although in some contexts I think it can be a helpful one. What it does serve to highlight is that human beings are not separate from the rest of nature.

We often talk as if there were this separate thing called “the environment” which is out there somewhere, and which is something vaguely to do with us, which we can have a minister for, and which can be a line item on a budget (and frequently cut). The reality is that we eat it, we drink it, we breathe it; we could not be more intimately connected with it. We are ourselves ecosystems, hosting and depending on countless other creatures, to the extent that we can be described as holobionts – that is to say, assemblages of life-forms, not simply a species on its own.

From this point of view, it makes little sense to speak of “the environment” as something out there. It is right in here with us; it is us. Yet not only do we in fact constantly speak like this, but we consider it an insult to be put on the same level as our fellow-creatures. Words like brutal and bestial literally mean “like an animal,” while the kind of cruelty that is only too typical of our species is branded inhumane.

We seem to need a lot of convincing that other animals are anything more than automata. Apparently, it is newsworthy that fish might feel pain, or that many animals self-medicate. (That one is even posted under the heading “Surprising Science,” despite the fact that any observant goat-keeper will have witnessed it routinely.) The default assumption is always that non-humans are without consciousness or agency or any capacity with which we might be tempted to identify. We claim an exclusive right to personhood, on no particular ground other than it is convenient for us.

What is this obsession with standing apart from the natural world? It seems to me to be about control. If we are a small part of a larger whole, we are clearly not in control. But if the world consists of objects outside ourselves which we can manipulate, suddenly we can be managers. It is not a coincidence that the word management turns up a lot in descriptions of what we do with the environment, despite the fact that a forest, for example, is dizzyingly complex and manifestly beyond the capacity of anyone to manage in any meaningful sense.

We can, however, at least imagine that we can manage a bunch of inanimate things, robots whose behaviour we can predict and control. After all, we do this all the time with our factories and warehouses. If the natural world is just a machine, we can make it do what we want by pressing the right buttons. That’s the thing about machines: they exist to serve our purposes. But this way of thinking goes back well before the industrial age.

And God said, Let us make man in our image, after our likeness: and let them have dominion over the fish of the sea, and over the fowl of the air, and over the cattle, and over all the earth, and over every creeping thing that creepeth upon the earth.

Genesis 1:26 (King James version)

To claim dominion is to claim power. The root of the word is the Latin dominus, the master of a household – which meant, among other things, a slave-owner. In Roman law, a dominus had absolute power over his slave, up to and including the power of life and death. That is how we routinely think about our relationship to the natural world.

A good deal of the shouting about how we are destroying all life on the planet is an attempt to convince ourselves that we could if we wanted to. But Stewart Brand notwithstanding, we are not as gods. Yes, we can do and are doing a lot of harm. So long as we pursue the industrial path, we will carry on doing harm, because an essential part of that is being able to pretend that we aren’t doing harm, or that if we are it’s someone else’s problem.

Another defining characteristic of industrialism is that whatever one does is done at the largest possible scale, so we will do harm at the limit of our capacity. That capacity, however, is finite, and Mother Nature is a tough old bird. There have been multiple extinction events in the past, the granddaddy of them all being the Great Dying at the end of the Permian period, about 251 million years ago, which is estimated to have wiped out about 96% of all species on the planet. Life found a way then, and it will find a way this time.

Once we see ourselves as an integral part of the web of life, as members of the community of living beings, we can no longer pretend that our actions are without consequences. We can no longer treat our fellow-creatures as slaves, or resources. I don’t think it is a coincidence that industrial society increasingly treats even human beings in this way; one would have thought the phrase “human resources” came out of the Todt Organisation, but apparently these days it is perfectly respectable.

If we want to go on living on this planet – and Mars would not appear to be an inviting alternative – we are going to have to change our attitude. We need to stop behaving like thugs and vandals and start living as decent citizens. This means that we need to give up what control we have and also give up the illusion that we have more control than we do. First and foremost, we need to outgrow our collective sense of entitlement. We are not the only show in town.

When I say “we,” by the way, I am talking about the inhabitants of the industrialised or industrialising world – you know, the sort of people who have Internet access. That is not by any means the entirety of the human race, especially when our ancestors are taken into account, but it is a lot of people. I am not expecting this to happen overnight. At the very least, it will take a generation; major shifts in attitude always do.

But like all such changes, it will happen one person at a time. You could be next. It’s just a thought.
