On quantification

Counting is the religion of this generation. It is its hope and its salvation.

Gertrude Stein, Everybody’s Autobiography

Industrial civilisation is in love with numbers. We have numbers on pretty much everything that can be quantified, and some things that arguably can’t. When someone wishes to assert incontrovertibly that X is the case, the magic words are: “Studies show…”. And how do the studies purport to show that X is true? Statistically, that’s how.

Like so many things that appear to us to be immutable truths carved into the primordial ground of cosmic being, this tendency has a history and is the result of human choices. We first find the appeal to statistics in the eighteenth century, originally in connection with matters relating to governing a state (hence the name) such as population and economic activity. The following century saw the formation of the Royal Statistical Society in London, and the use of statistics by political and other campaigners to advance their cause; one famous example is Florence Nightingale, the Society’s first female member, who popularised the pie-chart as a means of visualising data and thereby presenting it in an accessible form.

It is not, I suspect, a coincidence that this parallels the development of the Industrial Revolution, whose consequences I discussed in a previous post. It is a natural part of the standardising tendency which gave us the Whitworth screw thread, SATS and the Big Mac. We want the world to be made up of things which are all the same.

This is obviously a prerequisite for counting things. If you have a pencil, three bananas and a bicycle, you don’t really have five of anything. If, on the other hand, you can say that one lump of pig-iron is much the same as another, then knowing you have five of those is a useful thing. Of course, it’s not actually true to say that all lumps of pig-iron are strictly equivalent; things like the carbon content will vary from one to another; but you may not need to worry about that if all you want to know is how many tons your foundry produced this week.

Then again, if you care about the quality of your pig-iron, knowing that you have five lumps of it is not especially useful. So the meaning of a statistic depends on what is being counted and why. Who is doing the counting may also be significant; human motivation is rarely pure.

Consider official unemployment figures. No government wants these to look any higher than they have to, and it is much easier and quicker to fix the numbers than it is to fix the economy. Thus the definition of unemployment becomes remarkably fluid, in order to leave out as many people as possible. In the UK, for example, you only count as unemployed if you are both out of work and claiming a government benefit. Unsurprisingly this is coupled with a benefits system of Byzantine complexity which has all the hallmarks of having been designed to deter applicants.

But in any case, is counting people really the same sort of thing as counting pig-iron? You may (or may not) be familiar with King David’s census of the Israelites (2 Samuel 24, if you want to look it up). The king soon came to regret it: “And David’s heart smote him after that he had numbered the people. And David said unto the LORD, I have sinned greatly in that I have done: and now, I beseech thee, O LORD, take away the iniquity of thy servant; for I have done very foolishly.”

The biblical text doesn’t make clear why this was such a bad idea, apart from the punishments that follow, but there’s a case to be made that this is a reaction against the kind of state bureaucracy that flourished in both Mesopotamia and Egypt at this period. Here I am following the account of state-formation in James C. Scott’s Against the Grain (Yale University Press, 2017), and while I’m not going to attempt a summary here, the relevant point is that ancient bean-counters started off counting measures of barley and ended by counting pretty much everything.

Now there will be some variability even between measures of barley, but it seems intuitively clear that human beings are individuals – indeed, we even use the word individual to refer to a person. Moreover, they have relationships with one another. What does it mean to treat another human being as a countable unit, like a measure of barley or a lump of pig-iron? Surely it is not the way most of us would want to be thought of. It is a denial of one’s basic humanity.

But when one is dealing with large numbers of people – more, let’s say, than Dunbar’s number – it is inevitable that one has to resort to this approach. It’s the only practical way of keeping on top of things, and early states were all about keeping on top of things. This appears to have been why writing was invented, and why so much of the vast corpus of cuneiform texts is so dull.

Nowadays, of course, we have Big Data. This is largely a result of technological advances; scribes inscribing clay tablets can only record a limited amount. Thanks to the miracles of Progress, we are now able to collect, store, and analyse stupendous amounts of data, most of it about us. And because we can, we do. (The scribes of Third Dynasty Ur or Qin China would most certainly have done so, given the opportunity.)

In this context, “data” just means “counts of stuff,” where many of the things being counted are events – typically things that a person has done: they spent x seconds viewing such and such a web-page, they bought this thing, they liked that post. This has a market value, because companies can use that information to sell you stuff. Governments can also use that information to identify people they don’t like; the Chinese government already does this, and I’d be very surprised if they were the only one.
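For the curious, here is a minimal sketch of what one of those event records might look like – the field names are invented for illustration, and real tracking systems differ in their details, but the shape is much the same:

```python
# A hypothetical tracking event: nothing deeper than who did what, to what,
# when, and for how long. Field names are invented for illustration.
from datetime import datetime, timezone

event = {
    "user_id": "u-102394",                  # who did it
    "action": "viewed_page",                # what they did
    "target": "/products/toaster-deluxe",   # what they did it to
    "duration_seconds": 47,                 # how long they spent doing it
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

print(event)
```

Collect a few billion of these and you have Big Data.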

However much we may claim to dislike this state of affairs, we still put up with it. It does however give us some idea of why the ancient Israelites of 2 Samuel found the whole notion of counting people as if they were things so viscerally repugnant. And it is also dangerous, because data can be inaccurate and/or misinterpreted.

This can be the result of error, or it can be deliberate. Science itself is not immune to this: famous (or notorious) examples include Sir Cyril Burt, who fabricated data wholesale to support his ideas about the heritability of IQ, or more recently Dr Werner Bezwoda and his fraudulent cancer research. There may well be many more whose nefarious practices have not been discovered; there is a lot of research which has proved difficult or impossible to replicate. Scientists are themselves human beings, a point which we seem to find difficult to admit.

You can also build a mathematical model to interpret that data, which looks impressive and works beautifully until it doesn’t. This is what happened to Long-Term Capital Management, which went south in 1998 despite the presence on its board of Nobel Prize-winning economists, requiring a bail-out to the tune of $3.625 billion. They thought they understood their model. With modern statistically-based AI, of course, nobody understands how the algorithms work. Because they seem to function most of the time – as the LTCM model did until 1997 – it’s just assumed that we can rely on the results. You may not find that thought comforting. I certainly don’t.

When Hillary Clinton ran for the US Presidency in 2016, her campaign made heavy use of data analysis, all of which suggested that she would win. We know how that ended up. I was reminded at the time of the statistical approach taken by Robert McNamara in managing the Vietnam War, relying on the body count as his measure of success. That didn’t go too well either.

But this is more than a practical question. It has profound implications for how we deal with one another and with the world in general. Is the world, to borrow a phrase from Coleridge, “an immense heap of little things” to be reckoned and classified and managed, to be bought and sold and monetised? Are there not important truths which cannot fit into a spreadsheet? I suspect most people would agree that there are, but that’s not the way we do things. There is the quantified and the unquantified, and most of the time the quantified takes precedence.

Iain McGilchrist’s magisterial study The Master and His Emissary (Yale University Press, 2010) examines this division in the light of research into the workings of the human brain. It’s a fascinating, well-researched and thoughtful book, but ultimately rather depressing. There seems to be every likelihood that we will continue to be blind-sided by our obsession with numbers, just as LTCM was, failing on an epic scale to see the wood for the trees. Nobody felling a tree in the Amazon to clear land for soya-bean cultivation is doing so because they want to damage the planet. They end up doing so just the same.

The philosopher Martin Buber arrived at much the same insight from a different direction. He distinguished between I-Thou relations, in which the other is treated as a person, and I-It relations, where the other is a mere thing. When we count, we necessarily treat what is counted as things rather than persons. This may work well enough for pig-iron, but it doesn’t work for people, and I would argue for other living creatures too. Twenty thousand chickens in a broiler house can only be thought of statistically; a flock of half a dozen are individuals. If this reminds you of the famous remark attributed to Stalin – “A single death is a tragedy; a million deaths is a statistic” – that is not a coincidence.

Genocide can only happen when the victims are treated as things, not people. Nothing about the Holocaust is more striking than the way in which it was managed as an industrial process. Likewise, ecocide can only happen when living beings are treated as things, either as raw materials to be processed or as obstacles to be removed. Those long-dead Sumerian scribes weighing and recording barley never intended any of this, I’m sure, but that’s where we are.

Comments are welcome, but I do pre-moderate them to make sure they comply with the house rules.

On value, continued

I have the simplest tastes. I am always satisfied with the best.

Oscar Wilde (attributed)

In last week’s post we distinguished value from price, but having determined what it isn’t we haven’t said what it actually is. In this essay I want to have a crack at doing that.

The word value is prominent in economic discourse but also outside it. We speak of values in the plural when we wish to discuss an ethical position. There is also the expression value judgement to denote a subjective opinion. (I am old enough to remember the Senate confirmation hearings for General Alexander Haig when he was to be appointed US Secretary of State, and he used that phrase liberally in order to weasel out of giving a straight answer.) So what exactly is this thing?

There is undoubtedly a subjective component to it. Economists refer to money we spend on what we want rather than what we need as discretionary spending; one can think of this as an indicator of what people value, at least in the category of things that money can buy. What interests me here, however, are the things it can’t.

On Planet Economics, human beings are rational actors who exclusively pursue their own interests. I don’t know about you, but I’ve never met anyone remotely like that, nor would I wish to. Actual human beings are almost never rational; even when we think we are, most of the time we’re actually justifying our choices after the fact. Rationalisers we certainly are, but rational? Would Las Vegas even exist on Planet Economics?

Advertisers know this, of course. There are very rarely compelling reasons to buy Brand X rather than Brand Y of almost any consumer good. If Brand X is consistently better than Brand Y then usually Brand Y will simply go away, because while people aren’t rational they aren’t idiots either. So the advertisers hired to make you buy Brand X will try to make you like it on non-rational grounds. Buying Brand X will make you cool and sexy and irresistible to the opposite sex. (Personally I have never been attracted to anyone on the basis of what phone they have, and if someone were attracted to me on that basis I would run a mile, but that could just be me.)

There are a few exceptions where some sort of vague gesture in the direction of rational argument is attempted, usually around products that have some medical aspect to them. Often these take the form of surveys. When you look at the sample sizes given in the small print – and I assume it must be a legal requirement to provide these, because I’m sure the advertisers would rather not – they are always pathetically small. If you take enough samples of a few tens of people, you will eventually be able to find one where 79% of them like whatever it is that you’re pushing.
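If you doubt how easy this is, here is a minimal simulation, with all the numbers invented for illustration. Even when the true approval rate is only 60%, a marketer who keeps commissioning tiny surveys will sooner or later get one that “shows” 79% or better:

```python
# Hypothetical illustration: run lots of small surveys drawn from a population
# where 60% of people genuinely like the product, and see how often a single
# survey happens to clear the 79% bar we'd like to print in the advert.
import random

random.seed(1)

TRUE_APPROVAL = 0.60   # assumed real approval rate
SAMPLE_SIZE = 24       # "a few tens of people"
TARGET = 0.79          # the headline figure we want to quote
N_SURVEYS = 200        # how many small surveys we are prepared to commission

def one_survey() -> float:
    """Simulate one survey and return the observed approval rate."""
    approvals = sum(random.random() < TRUE_APPROVAL for _ in range(SAMPLE_SIZE))
    return approvals / SAMPLE_SIZE

results = [one_survey() for _ in range(N_SURVEYS)]
hits = sum(1 for r in results if r >= TARGET)

print(f"Surveys run: {N_SURVEYS}, sample size: {SAMPLE_SIZE}")
print(f"Surveys reporting at least {TARGET:.0%} approval: {hits}")
print(f"Best single survey: {max(results):.0%}")
```

The smaller the sample, the wilder the swings, and only the flattering survey ever makes it into the small print.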

There was an old ad campaign I remember from my childhood which demonstrated this perfectly. These days they’ve learned to be a bit more subtle. It was so long ago that it was a TV advertising campaign for cigarettes, something that has been illegal in the UK since 1965. I’m quoting from memory, but the ad was really just some pictures of moderately cool and sexy-looking people going about their lives with a voice-over that went somewhat as follows:

People.

People like you.

People like you are changing.

People like you are changing to Players Number 6.

Disclaimer: I am not endorsing this product and I don’t think they still make them anyway.

Now I wouldn’t have been their target market at that age, but this must have been plastered all over the TV for me to have remembered it, ahem, many years later. The 1960s was self-consciously a decade of change, so it made sense for the advertisers to latch onto that. It worked, too: Players Number 6 was the best-selling brand of fag well into the 1970s.

What we’re also being sold here – and many, many advertisers and persuaders in general play this game – is that it is better to belong to the majority, or (as in this case) the group that will be the majority soon. This is a value most of us share, at least to some extent, and it is of course straight out of the social primate playbook. It also has many practical advantages, which is why we follow it.

Some of us, though, imagine we are immune to this. We are outsiders, romantics, rebels. Advertisers have this covered too. I have never understood how consuming a mass-produced good can be a statement of one’s individuality, but you’ll find this claim being made – not, of course, so starkly.

Remember the Apple Mac commercial that riffed on Nineteen Eighty-Four? (It’s here if you need to refresh your memory.) The idea was that buying their mass-produced thing would mark you out as a special, non-conformist freedom fighter. So many people have now bought their mass-produced things that having an Apple product is now a mark of conformity. Nor is this an unintended irony: Apple paid for that advert precisely in order that this should be so, because they have made an awful lot of money out of it, and continue to do so.

There’s an alternative, of course, which may appeal more to your values, although you won’t find it being suggested by advertisers, politicians, or anyone else; apart from occasional statements by the Pope, who enjoys pretty good job security. If you really want to be a radical outsider in industrial society, you’ll try to avoid buying mass-produced consumer goods altogether.

This is practically impossible to do for everything in our society, particularly when it comes to things like underwear where buying second-hand isn’t an appealing option for many. Your discretionary spending will get you a lot less if you choose not to take advantage of the “fact” that it is cheaper to buy something made in China and shipped half-way around the world than it is to buy the equivalent product from a local small-scale maker, assuming you can even find one. But the choice is there to be made, at least some of the time.

I have already suggested in my essay on food that there are practical advantages to be had if you can eat fresh locally-produced food purchased directly from the producer. A point I didn’t make there is that every pound/dollar/euro you give to that producer is a pound/dollar/euro that isn’t going to Big Ag or to the supermarkets. Ultimately this is the only kind of activism that such entities will pay attention to.

Consider also joining a local trading scheme such as LETS (this is a UK-specific site, but similar things no doubt exist elsewhere). This is the ultimate decoupling of price from value, because no money changes hands at all. Instead, you have a local credit economy – there’s more on the theory behind it here and here if you’re interested. My brother once got himself a second-hand car via his local LETS, so it’s quite a serious proposition.
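For anyone curious about the mechanics, here is a minimal sketch of how a LETS-style mutual credit scheme keeps score; the members and trades are invented for illustration. The point to notice is that no money exists anywhere in the system: every trade credits one member and debits another, so the balances always sum to zero.

```python
# A toy mutual-credit ledger. Members start at zero; providing goods or
# services earns credit, receiving them incurs a corresponding debit.
from collections import defaultdict

balances = defaultdict(int)  # units of local credit per member

def record_trade(provider: str, receiver: str, units: int, description: str) -> None:
    """Credit the provider and debit the receiver by the agreed number of units."""
    balances[provider] += units
    balances[receiver] -= units
    print(f"{provider} -> {receiver}: {units} units for {description}")

record_trade("Alice", "Bob", 30, "an afternoon of hedge-trimming")
record_trade("Bob", "Carol", 45, "repairing a bicycle")
record_trade("Carol", "Alice", 20, "a box of vegetables")

print(dict(balances))
print("Total credit in existence:", sum(balances.values()))  # always zero
```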

All this does have the disadvantage of moving you out of the dominant majority group, as things stand today. But as things stand tomorrow, I suspect it will become more attractive to more people, and may indeed end up as a necessity. Your call, of course, but remember you read it here first.

Comments are welcome, but I do pre-moderate them to make sure they comply with the house rules.

On value

Nowadays people know the price of everything and the value of nothing.

Oscar Wilde, The Picture of Dorian Gray

Following on from last week’s discussion of wealth, I want to investigate the related notion of value. As Oscar Wilde pointed out, we often tend to conflate value and price, but these things are distinct in important ways and confusing them blinds us to much that we need to be aware of. I want to tease out some of these distinctions and some of the things we miss by ignoring them.

A price is a number that is supposed to correlate with or describe value. It is expressed in monetary units, which are an abstract representation of value. Prices are supposed to be determined by markets, which will be getting their own blog post in due course, and these determinations are considered to be infallible.

Now confusing the representation of a thing with the thing itself is a fundamental error. (The philosopher Gilbert Ryle coined the term “category-mistake” for this kind of thing.) Ordnance Survey sheet 158 is not the town of Newbury. It is a description of Newbury, which necessarily leaves a great deal out. Nobody lives in Ordnance Survey sheet 158. It has no MP, pays no taxes, and is only 1/50,000th the size of the real thing. Even in these times of economic difficulty, I don’t think you could buy the town of Newbury for £8.99. Clearly nobody would confuse the map with the actual town. But we make essentially the same basic mistake when we confuse price with value.

I mentioned the price of the map because that is supposed to stand for its value. But of course its value is not a constant thing. Ordnance Survey charge £8.99 because they hope they can sell enough at that price to cover the cost of producing it and make some profit on top. That’s its value to them (and even so, they charge the same for all of their 1:50,000 scale maps, and no doubt some sell much more than others).

But what is its value to you? Unless you live in the Newbury area, or are planning to go there, probably not much. Even if you do fall into that category, is it worth £8.99 of your hard-earned money? To answer that question, you must compare two values: the value of the map, and the value of £8.99.

For the value of £8.99 is not fixed. If you are a multi-millionaire, it is to all practical purposes zero. If you are a rough sleeper with no source of income, it represents a small fortune. You are probably somewhere in between those two extremes, but you will still have a sense of what £8.99 is worth to you.

This brings out the point that although we express prices arithmetically, they are not absolute in the way that arithmetical values are. The number 42 is always and everywhere 42. It is never 43 or 41. My 42 is the same as your 42. Some thinkers would indeed argue that 42 has its own existence, independent of there actually being 42 of anything, but be that as it may we can I think all agree that 42 is always the same thing.

But £42 (or $42 or €42) is clearly not always the same thing to all people. In England in the early fourteenth century, for example, £42 would have represented twenty-one years’ wages for a labourer. In 1900, according to one source, £42 would have been equivalent to £3,609.26 in 2021 terms. And of course the value of £42 to someone in a non-sterling country is subject to the further vagaries of foreign exchange. So price looks as if it expresses some eternal mathematical truth, especially to economists, but of course it doesn’t.
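For anyone who wants to check the arithmetic implied by those figures (the historical numbers come from the sources mentioned above, not from me):

```python
# Rough arithmetic only: what the quoted figures imply about £42.
pounds = 42

# Early fourteenth-century England: £42 ~ 21 years of a labourer's wages,
# which implies an annual wage of about £2.
implied_annual_wage_1300s = pounds / 21              # ~£2 per year

# 1900 -> 2021: £42 said to be worth £3,609.26, i.e. its purchasing power
# has shrunk by a factor of roughly 86.
implied_multiplier_1900_to_2021 = 3609.26 / pounds   # ~85.9

print(f"Implied 14th-century labourer's wage: £{implied_annual_wage_1300s:.2f} per year")
print(f"Implied 1900-to-2021 inflation multiple: x{implied_multiplier_1900_to_2021:.1f}")
```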

Economists, of course, will riposte that the eternal mathematical truth in question is not price as such but the law of supply and demand. That is to say, a good or service is worth what someone will pay for it. The market determines price. On Planet Economics, we all go around making free contracts with one another on the basis of perfect information, and thus we invariably arrive at the correct and fair price for everything. It’s marvellous.

Of course, this rosy picture is far removed from reality. It has the advantage of being a lot easier to model mathematically than reality is, but that’s about all that can be said for it. If you are an economist and your job consists of building mathematical models then this will be sufficient reason to adopt this notion, but the rest of us would probably prefer something more realistic.

Prices are rarely correct or fair. We all recognise this when we say that something is cheap or expensive; we’re saying in effect that the price is lower or higher than the value it represents. And in a world of perfect information, there would be no such thing as arbitrage, let alone insider trading.

The economist’s definition does have one virtue, though, in that it reminds us that a price applies to a specific transaction at a specific time between a buyer and a seller. If nobody wants to buy a good or service, then it has no actual price. (The vendor can offer it at a price, but if nobody’s buying then it’s meaningless.) Likewise, if nobody is selling a good or service, it doesn’t have a price either. You can offer me as much as you like for my first-born child; it will avail you nothing, not least because I have no children.

Now, money in the sense of an abstract representation of value is quite a recent development in human history. Value is both logically and historically prior to price, and distinct from it. Many things don’t fall into the category of things that can be bought and sold, and some of those things are the most valuable of all.

In the industrial world, we have tried to address this by trying to bring as many things as possible into the marketplace. Consider if you will the almost religious awe inspired by Gross Domestic Product. This is an entirely monetary measure and has much less basis in reality than is commonly supposed. Yet GDP growth is the only thing we seem to care about. When GDP goes up, things are assumed to be going well; if it goes down, things are going badly. But it ain’t necessarily so.

For example: consider a couple with a young child. In Scenario A, one of them goes out to work and the other provides unpaid child-care at home. (It doesn’t matter which of them it is, which gender they may be, or whether the couple is gay, straight or what have you.) In Scenario B, they both go out to work, and pay some proportion of what they earn to a third party for child-care. In monetary terms, Scenario B is to be preferred, because more money changes hands and GDP goes up. In terms of quality of life, though, and arguably the best outcome for the child, Scenario A is better, even though GDP isn’t increased. But we can only see the numbers.
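A crude worked example, with entirely hypothetical figures, shows what the statistic sees:

```python
# Hypothetical household, illustrative numbers only. The same child is cared
# for in both scenarios, but only paid transactions register in GDP.
SALARY = 30_000          # assumed annual salary per working adult
CHILDCARE_FEES = 10_000  # assumed annual fee paid to a third-party carer

# Scenario A: one earner, one unpaid carer at home.
measured_output_a = SALARY

# Scenario B: two earners, childcare bought in (the carer's fee is itself
# someone's income, so it too counts towards GDP).
measured_output_b = 2 * SALARY + CHILDCARE_FEES

print(f"Scenario A measured contribution: £{measured_output_a:,}")
print(f"Scenario B measured contribution: £{measured_output_b:,}")
# The unpaid care in Scenario A contributes precisely £0 to the statistic.
```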

An even more egregious example of this thinking is the (in)famous 2013 report which valued the planet’s natural assets at $7.3 trillion US. Now as we have seen over the last year, US dollars can be conjured from nothing in arbitrary quantities – $3.5 trillion or so already – so it is quite conceivable that the Federal Reserve could come up with $7.3 trillion to buy another planet’s worth of resources. The only slight problem with this wheeze, of course, is finding a vendor.

Prices are supposed to perform the miracle of measuring the relationship between incommensurable things. On Planet Economics, it’s supposed to go like this. I have a sack of potatoes. You have an electric toaster. I want the toaster but you are allergic to potatoes. What can I do? Well, by turning my potatoes into money – i.e. selling them to someone else – I will have something to give you for your toaster that you are bound to want. (They’ll need to be pretty expensive potatoes to pay for a toaster, but that’s another discussion.) I give you some nice pictures of Her Majesty Queen Elizabeth the Second, you give me the toaster, and everyone goes on their way rejoicing, with the possible exception of the sucker who paid twenty quid for a sack of spuds.

This kind of story goes back all the way to Adam Smith. It is of course nonsense; Adam Smith himself knew it to be nonsense; even economists must realise it’s nonsense, but they’re still telling it today to explain the origin of money. In reality, as the late David Graeber showed in Chapter 2 of his excellent book Debt: The First 5,000 Years (Melville House, 2012), this has been a non-problem for almost everyone throughout history, because this kind of transaction is the exception, not the rule. As with the map of Newbury, this story leaves a lot of things out.

What sort of person, for example, only has potatoes? I’ll tell you: someone at a market with a potato stall. (And when have you ever seen a market stall that only sold potatoes?) The rest of us have other things, and we meet in other settings than the marketplace. You have a toaster that I want. We can have a conversation about what goods and/or services you would be willing to exchange for it. After all, in the real world, we probably know one another. (It’s an artefact of industrial civilisation, and another historically recent development, that so many of us now live surrounded by strangers.) Maybe I have a pregnant cow. I might consider giving you the calf, provided you throw in that comfy armchair and a bottle of your home-made vodka. And because you know me, you’ll be willing to give me the toaster now on the strength of the future calf, and we trust one another enough that if the cow miscarries we’ll sort it out.

We may not even discuss the bottle of vodka or the chair. We understand that a calf is a lot more valuable than an electric toaster, even if it can do bagels. When I give you the calf, you’ll owe me… something. Or perhaps you won’t, because we already have a history of mutual credit, and I already owe you… something else. We’ll work it out.

We need prices as a stand-in for value because we live in a commoditised, impersonal society that is in love with numbers and abstractions. This is a highly abnormal, even perverse, way to live. It only seems normal because we grew up with it and it is everywhere. Like so much in our industrial civilisation, however, it gives us a distorted view of how things really are.

Remember that the next time you buy a toaster.

Comments are welcome, but I do pre-moderate them to make sure they comply with the house rules.

On wealth

There is no wealth but life.

John Ruskin, Unto This Last

I’m going to start an occasional series of essays enquiring into economics by examining the notion of wealth. Adam Smith himself defined economics as “the science of wealth” and his most famous book is entitled An Inquiry into the Nature and Causes of the Wealth of Nations. We often hear of a mysterious process called “wealth creation”.

Let us begin by distinguishing wealth from money. The Pharaoh Tutankhamun, was, we can all agree, a wealthy man. His mask (shown above) is of solid gold and weighs in at 321.5 oz; at the time of writing, just the metal would set you back a tad over £469,110, even if we ignore the work that went into it and its artistic value. However, the Pharaoh Tutankhamun had no money, either in the form of coinage or of credit. Coinage would not be invented for a good six centuries after he and his mask were interred in the Valley of the Kings, and as pharaoh he owed nobody anything. His wealth consisted in owning all the land in Egypt, and taxes were paid to him in kind from its produce.

So his wealth consisted of physical goods and services. Anyone claiming to be a “wealth creator” in that kind of economy is going to find themselves up against the laws of physics, since physical stuff cannot be conjured out of nothing – which has the interesting consequence that the wealth such people deal in must be largely non-physical. So what exactly might that wealth be?

I would characterise wealth as comprising the following:

  • access to breathable air – without which one is dead; Tutankhamun may have been wealthy when he was alive, but I wouldn’t want to claim that he still is. Of course this isn’t a very exclusive requirement, but clearly people who are obliged to breathe the heavily polluted air of some urban environments would be considered less wealthy than those who are not.
  • access to drinkable water – again, basic stuff, but plenty of people don’t have it and those people are definitely not wealthy.
  • access to food – not just sufficient to keep one alive, but in more than sufficient quantity and of more than merely tolerable quality; Fortnum & Mason rather than the Trussell Trust.
  • access to shelter – and again more than the bare minimum required for survival. Bill Gates does not live in a tent.
  • access to medical care – within the limits of what may be available in your particular circumstances, of course.
  • access to community – we are social creatures, and we need one another. If the Covid-19 pandemic has taught us nothing else, it has taught us this.
  • access to luxuries – whatever those might be for us. Exactly what that means is always contingent on time, place, cultural values and personal tastes, but as the judge said of pornography, you know a luxury when you see it.
  • security of tenure – we need to have some confidence that all of this won’t be taken from us at a moment’s notice. This is a relative thing, of course, as we are all ultimately going to end up the same as Tutankhamun, who doubtless expected to enjoy his wealth for rather longer than he did.

Looking at this list, we can see that there is indeed a mixture of both material and non-material things, and some of them may have quite elastic definitions. Even luxuries may be non-material. For my part, if I were to win the lottery, I would certainly not buy a Lamborghini, but I would still enjoy the luxury of never having to worry about money. This is not a tangible thing.

Again, security of tenure has both material and non-material aspects. If I did own a Lamborghini and someone took it away from me, I would be reliant both on the legal concept of ownership (a non-material thing) and on the police force (very much part of the material world) to make good my loss. Wealth is therefore not simply a matter of physical stuff, although physical stuff is always involved at some level.

This is where money re-enters the story. It may surprise you – it certainly surprised me when I started looking into it – that there is no universally accepted definition of what money is. David Graeber discusses this in some depth in his excellent Debt: The First 5,000 Years (Melville House, 2012), especially Chapter Three, but for our purposes I’m going to define money as an abstraction of value. That is to say, it is a tool for making quantifiable and commensurable the values of different and potentially incommensurable things, as when a company pays someone £10.00 an hour.

We take this so much for granted that we often confuse money with value itself. But of course the worth of money consists entirely in its acceptability. I remember travelling to the Netherlands in the days before the Euro and being given a bunch of guilder notes at the Foreign Exchange desk.

I simply couldn’t take them seriously as money. Nevertheless, Dutch people all seemed quite happy to accept them and give me goods and services in return.

Nor is this just a characteristic of paper money. Try taking a bag of sestertii down your local supermarket and see how far you get. Of course you could sell your coins to a collector but then you would be exchanging them for your locally acceptable currency, effectively turning them into money even though they were minted as money in the first place.

None of this was an issue for our friend Tutankhamun, who as noted above had no money, although he had plenty of access to value. And the foundation for that was an intangible thing: his prestige as the divine ruler of all Egypt.

This gives us some context for understanding the magical phrase “adding value” which is what wealth creators claim to be doing. Again, I think we are looking at a mixture of physical and non-physical things, which I am going to call transformation and pixie dust.

A basic example of transformation would be to take a chunk of flint and turn it into a hand-axe. You can do a lot of things with a hand-axe that you can’t do with a lump of unworked rock; this is the added value. Another example would be to cut down a tree and use the wood to make a chair. This is less clear-cut, because the tree had value which it has now lost – you won’t find many birds nesting in a chair, nor is it a CO2 sink. You are exchanging one bunch of value for another. In other words, there are often trade-offs with this kind of thing, and not all of them are going to be obvious.

Where you can really cash in, though, is by adding pixie dust. This is what brands are, fundamentally. You can have two functionally equivalent items – T-shirts, say – and by the simple act of attaching a designer label to one of them you can charge far more money for it. This only works, of course, to the extent that people buy into it, which is why we have the advertising industry. Naomi Klein’s book No Logo (Fourth Estate, 2010) goes into this in depressing detail.

Another popular way to apply pixie dust is to add pointless features. Cars are an excellent example of this. At this point, we know how to do cars. There are no longer any killer features to distinguish one from another; they all cover the same essential bases. Therefore the two avenues to added value are branding (of course) and adding extra bells and whistles. Do you really need a powered cup-holder? Probably not, but you’re going to be offered one if the manufacturer thinks they can charge you a bit more for it. All of these extra bits and bobs are of course more things to go wrong.

The trouble with pixie dust, though, is it has no substance to it. If I need to go down to the shops, I don’t need a Lamborghini, and I’m going to need a lot of persuading that I do, especially as there isn’t much room for your groceries in the back of one.

So what is the foundation of true wealth? I think there’s a clue in the fact that all of the things on my list depend on the co-operation of others. (Including breathable air – remember the tree that was cut down to make a chair?) We are social creatures, and our well-being depends on social factors – for example, some kind of arrangement that ensures reasonable security of access to one’s needs. This doesn’t have to look like our current property laws, by the way, but that is the purpose they are intended to serve.

Now I’m not putting forward any kind of legislative programme, and I doubt any of my readers are in a position to do so. What I would encourage is the development by each of us of local, personal networks that can provide mutual, practical aid and support, because we’re all going to need it. But if we have that, maybe we can not just weather the forthcoming storms but even prosper.

Comments are welcome, but I do pre-moderate them to make sure they comply with the house rules.

On slogans

War is peace.

Freedom is slavery.

Ignorance is strength.

George Orwell, Nineteen Eighty-Four

The word slogan comes to us from Scottish Gaelic, and its original meaning was battle-cry. As such, its basic function is a declaration of group identity. The meaning of what is being yelled is less important than the fact that all of us over here are yelling it together. My intention here, however, is to unpack a few slogans as if the meaning of the slogan mattered.

I’m doing this because the slogan is one of the favourite rhetorical devices of our age, and as such it is used to persuade. By encapsulating what one desires the audience to believe in a slogan, one bypasses their critical faculties, because the content of a slogan is not usually examined. Instead it is swallowed whole.

My first example is a golden oldie that will be familiar to UK readers, originating with the Women’s Institute (although it has since evolved into its own charity):

Keep Britain tidy.

I chose this because it seems quite innocent on the face of it – nobody suspects the WI of dark ulterior motives – but it packs a surprising number of assumptions into three short words.

Firstly, it implies that Britain is already tidy, in the teeth of the evidence; or at least that tidiness is somehow Britain’s default mode of being. In its original context, the aim of this slogan was to get people to pick up litter, or at least not to drop it, so this is a little surprising.

Secondly, it assumes that tidiness is a good thing, in and of itself. I would argue that like many things it’s a good thing up to a certain point. My point here is that the slogan elides any discussion of how tidy we want Britain to be, or indeed what tidiness is or should be.

Thirdly – and this is something we find in many slogans – it is expressed as an imperative. The WI is a fine body, but it has absolutely no authority to command anyone to do anything. Advertisers love this. They often command us to buy whatever it is they’re selling, presumably because it works, even though they have even less authority than the WI, which at least has some claim to the moral high ground.

Apart from telling you to buy their product, advertisers are reluctant to make definitive statements even in their slogans. My favourite example of this, which again is an old one, is:

Ford gives you more.

Four words, but so many begged questions. What is it exactly that Ford gives you more of? One could perfectly well interpret this as “Ford gives you more trouble,” but presumably that wasn’t what they had in mind. And more of it than what? I assume we are supposed to insert the name of Brand X here. Notice how they are careful not to say anything that could be objectively tested. Had they said, for example, “Ford gives you more miles between services than Peugeot,” we would be able to look at the facts and decide whether it be true or false. But that would cease to be a slogan and become a claim that invites verification.

What we have instead is a vacuous form of words which, if repeated endlessly, will leave you with a vague warm fuzzy feeling towards Ford, possibly to the point that you end up buying one of their cars. This sort of thing is very popular in car advertising, because there isn’t really all that much to choose between different makes of car. Hence the saying – I’d even call it a well-known fact – “You are what you drive,” a maxim that renders me non-existent but which many people appear to believe.

This characteristic vagueness on the part of the motor industry shows up in our next example:

You can in a Nissan.

What exactly is it you can do in a Nissan that you can’t do in some other make of car? The slogan prudently refrains from telling us, because the answer is: absolutely nothing. It does, however, suggest an ill-defined notion of empowerment. After all, the main thing you can do in a Nissan – as you can equally well do in a Fiat, an Audi or a Hyundai – is to drive from one place to another. The motor industry has spent decades trying to instil in us all the notion that this is the true meaning of freedom. And freedom is good, right? You’re probably lacking much other freedom in your life, what with all the time and effort you put into making enough money to pay for your car, amongst other things. Ivan Illich went so far as to argue that when this time was taken into account, the actual speed of a car was around walking pace.
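Illich’s arithmetic is easy to reproduce in outline. Here is a minimal sketch with entirely hypothetical numbers; his own estimate, which counted rather more of the hidden time than I do here, came out at around walking pace:

```python
# Hypothetical figures for a single motorist, illustrative only.
miles_per_year = 8_000          # assumed annual mileage
hours_driving = 350             # assumed hours actually behind the wheel
annual_motoring_cost = 4_000    # assumed total: purchase, fuel, insurance, repairs
net_hourly_wage = 12.50         # assumed take-home pay per hour

# Time spent earning the money that the car consumes.
hours_earning_for_car = annual_motoring_cost / net_hourly_wage

# Effective speed: miles covered per hour of life devoted to the car.
effective_speed = miles_per_year / (hours_driving + hours_earning_for_car)

print(f"Hours worked just to pay for the car: {hours_earning_for_car:.0f}")
print(f"Effective speed: {effective_speed:.1f} mph")
# Add congestion, parking, and the taxes that pay for roads, and the figure
# falls further still.
```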

But the queen of all “empowering” slogans must surely be:

Because you’re worth it.

Devised back in 1973 on behalf of L’Oréal, a company which markets cosmetics to women, this is a beautiful instance of the personal being political. What, after all, is the point of cosmetics? Surely to make yourself more attractive to others. And why should you want to do that? This was a hot topic in 1973, when second wave feminism was in full swing. L’Oréal’s business proposition could easily be characterised as: “Hey, fish, would you like some help getting a bicycle?” Not an easy sell.

Previously, cosmetics advertising played quite straightforwardly on the insecurity of women, with headlines like “How to Bring Your Husband Straight Home at Night.” The genius of this slogan is that it continues to do so while appearing to do the opposite. Clearly there is an underlying sense of worthlessness which women are assumed to have; by seeming to affirm the opposite, L’Oréal tacitly acknowledges its existence. You will feel better about yourself if you use our products, it suggests, and you deserve to feel better, therefore you should give us your money.

The unspoken corollary of this is that if you don’t buy our products it is because of a sense of self-hatred. Once this assertion is exposed to the daylight, of course, it becomes self-evidently ridiculous. But of course the art of the slogan is to conceal its underlying assumptions under a plausible surface.

Which brings me to my most recent – and controversial – example:

Black lives matter.

On the face of it, this is an entirely reasonable statement. It was coined in response to racially-motivated police brutality in the US, which as far as I can tell – and I live many thousands of miles from the US – is a real and appalling issue. Nevertheless, it can be read to imply things that are far from reasonable, as became apparent when an alternative version was proposed: “All lives matter.”

This would seem even more reasonable than the original, but was vehemently rejected by the BLM campaign. Which leads us with inexorable logic to the unpleasant conclusion that what is really meant by saying “Black lives matter” is that “Some lives don’t matter.”

Which lives? And why not? It seems to me that these questions need to be brought out into the open and honestly discussed.

At the back of this is a curious notion of virtue which has somehow evolved in the recesses of US academia. It has long been considered a truism that two wrongs don’t make a right, but we are now supposed to see wrongs as the only genuine source of right, and that the more oppressed a person is, the better and righter they are. I confess I struggle to understand how anyone thinks this could work, but there it is. Statistically, some not very nice people must surely have perished in the Holocaust. That doesn’t justify what was done to them, but it seems weird to pretend they must all have been angels.

When those who have been persecuted become persecutors in turn, they do not get a free pass. Consider the history of Christianity. For the first three centuries of its existence, its followers were subjected to various penalties, sometimes very severe. (How severe tended to fluctuate from one emperor to the next, but it could certainly include being thrown to the proverbial lions.) As soon as the Church was established as a branch of government, however, the burning of heretics could begin. Arguably, a similar pattern is visible in the Israeli state’s treatment of Palestinians.

People have an unlovely tendency to identify other groups of people as evil, on racial or religious or otherwise arbitrary grounds. Sometimes they maltreat or even attempt to exterminate these groups of people. It’s never justified, regardless of who does it or who is on the receiving end. It was bad when the USA passed the Chinese Exclusion Act because of racist feeling against the Chinese, and it is bad when the Chinese persecute the Uighurs on religious (and possibly racial) grounds. Again, nobody gets a free pass.

So the questions that a slogan begs can be deep and sometimes disturbing. Next time you encounter one – and you won’t have to search far – take a moment to unpack it, and see what’s underneath. You might be surprised, and disturbed, at what you find.

Comments are welcome, but I do pre-moderate them to make sure they comply with the house rules.

On hope

Hope is being able to see the light despite all of the darkness.

Desmond Tutu

The turn of the year is a season when people naturally turn their thoughts to the future, as well as looking back on the year just passed. (Hence the appropriateness of naming January for Janus, the two-faced god of boundaries.) This year is particularly dramatic. It has of course been dominated by the Covid-19 pandemic, which is very much still with us. Here in the UK, we are also looking forward (with varying degrees of trepidation) to Brexit. Early in the year, that was all that the news media seemed to talk about, and now at the eleventh hour it has crept back onto the news agenda.

Under the circumstances, hope seems frankly irrational. But then that is the nature of the beast; if one had certain knowledge of the future, it wouldn’t be hope. Think of all those tombstones bearing the phrase “in the sure and certain hope of the Resurrection” – if it were sure and certain, it wouldn’t be hope. As the ecclesiastical historian Owen Chadwick said in another context, faith would not be faith if it were knowledge, and hope is a close cousin to faith.

There is nevertheless great strength to be had from this kind of irrationality. I can personally vouch for its usefulness in getting through bad times, and if these aren’t bad times they’ll do until bad times come along. Sometimes pig-headed persistence is all there is. It worked for us in 1940, after all.

It also worked for Barack Obama, who won two terms as US President on the basis of hope. He even wrote a book entitled The Audacity of Hope (Canongate Press, 2008). The fact that, for example, he failed to close down Guantanamo Bay despite this being one of his initial election pledges and having eight years in which to do it, proves that hope springs eternal in the bosom of the electorate. No doubt the current inmates of Camp X-Ray are hoping that Joe Biden will come through.

There is a saying – I don’t know its ultimate origin – that “hope is hopeless.” I rather like this as a counter-balance to the fetishisation of hope in and of itself which is so prevalent in our culture. Hope is important and necessary, but in itself it is not a substitute for positive action. We are often prone to forget this.

You will often find people of a vaguely New-Agey cast blathering on about the law of attraction, which Wikipedia helpfully summarises as “the belief that positive or negative thoughts bring positive or negative experiences into a person’s life”. Now there is something in this, insofar as your thoughts influence your behaviour, and your behaviour influences your life. But behaviour is about action. Winston Churchill did not simply light a candle and trust the Universe: on the contrary, he took vigorous action to maintain the struggle with Nazi Germany in the face of apparently impossible odds.

We should also bear in mind that apparently impossible odds usually are indeed impossible. “Wizards know,” says Terry Pratchett, “that million to one chances come up nine times out of ten.” This is funny because it’s only true in the kind of fantasy universe that Pratchett is affectionately satirising. Yes, it always happens in the movies; in real life not so much. We remember and celebrate Churchill in 1940 because it was improbable. The previous year, the Polish President Ignacy Mościcki had also faced apparently impossible odds, and that didn’t turn out so well.

Yet even if the light which Desmond Tutu speaks of is not actually there, we are still well-advised to hope. It may be that no action we can take will avert disaster. In many areas, I would say that is clearly the case; we are not going to “fix” climate change, for example, however much pious hot air politicians may contribute. We may however be able to mitigate disaster, or at least to adapt to it. (Jem Bendell’s notion of “deep adaptation” is relevant here.) Crossing your fingers is definitely not going to help, and neither is giving way to mere despair.

How then to sustain and nourish hope? Archbishop Tutu has the consolation of a strong personal religious faith, and if you have one of those I strongly advise you to make the most of it, unless your strong personal religious faith is atheism, which may not help much. It does seem to me that faith of some sort is going to become more important to many people in the future, if only in the sense of there being no atheists in foxholes. Stoicism is certainly an approach that will be of help to many and is entirely compatible with atheism, for that matter.

For my part, I take comfort in the larger view that life on this planet is incredibly resilient. It has been through much worse things than we can throw at it: the Permian extinction, for example, and before that the Great Oxidation Event. The grass will still grow, albeit at slightly higher latitudes.

I remain impressed and encouraged by the vision of the future outlined by Chris Smaje in his book A Small Farm Future, which I recently reviewed. There are people all over the place doing good and useful work: off the top of my head I can think of Incredible Edible, the Agroforestry Research Trust, Joel Salatin and Gabe Brown in the US, and that’s just talking about food. Every day the penny drops for more people that we can’t go on like this, and at least some of those people are starting to take action. That which is unsustainable will not be sustained, after all.

It’s easy to get too fixated on current events, the froth on the surface, and overlook the deeper currents. Things will be rough, certainly, but they won’t stay that way for ever. These upheavals may be what we need in order to bring about necessary changes. We need to cling onto that hope if we are to notice the opportunities that may emerge; but we also need to seize those opportunities and make something of them.

At any rate, those are my thoughts. Let me know yours in the comments.

Comments are welcome, but I do pre-moderate them to make sure they comply with the house rules.

Predictions for 2021

It’s tough to make predictions, especially about the future.

Yogi Berra

It is the season for pundits everywhere to make predictions about the coming year, most of which will turn out to be complete bobbins. I hereby present to you five suggestions for things I think are likely to happen in 2021. It will be interesting to revisit this post this time next year. Some of these are quite specific, others are less so, but it should be clear enough in each case how far off the mark I am.

(1) Julian Assange to be extradited to the USA

This is of course pretty basic realpolitik, and given that we are already half-way through the extradition hearings and that the UK is even more desperate than usual to curry favour with its transatlantic overlords this one is pretty obvious. I only mention it because the Assange case isn’t getting much airtime these days, and he is unlikely to see daylight again once the Americans get their hands on him. He made them look like idiots, which is the surest way to irritate the powerful.

In the unlikely event that the judicial route is unsuccessful, he may well suffer some unfortunate “accident”. I’ll still count that as half a hit. However, if the end of 2021 finds Mr Assange enjoying a pina colada on some tropical beach, I’ll have missed this one. To be honest, I would prefer that outcome, but it isn’t up to me.

(2) Boris Johnson to leave office

If this happens, I don’t think many people outside the UK will miss him. It might seem unlikely, given that his government has a huge majority, but it will be his own party that does for him. They have plenty of form in this area: Conservative leaders who don’t cut the mustard are traditionally disposed of without mercy. Even Mrs Thatcher got the chop from her own side rather than the electorate.

The motivation for this will be Brexit. At some point in 2021, and sooner rather than later, it will become apparent that this was not the masterstroke which we were assured that it would be. We don’t need to get to the point where there are food riots for this to become an issue. The obvious move to limit the political damage, from the Conservative Party’s point of view, is to blame it on Boris. Brexit would have been marvellous, the line will be, if only this bumbling incompetent hadn’t been in charge.

Now I hold no brief for Mr Johnson. He is indeed a bumbling incompetent, as has been shown multiple times throughout his career. I will shed no tears for him if he is bundled out of Downing Street. But I don’t think you can pin all of it on Boris. Still, this is going to be the best option available, and I expect the Conservatives to give it a go.

It may be wondered who will succeed him, given the startling assembly of third-rate no-hopers whom he has gathered into his Cabinet. History tells us, though, that being a third-rate no-hoper is no bar to leading the Conservative Party. They had Iain Duncan Smith in charge not so long ago. Those with longer memories may recall, with some effort, John Major. As for David Cameron, least said soonest mended.

So the office of Prime Minister will be filled by someone or other. I don’t expect them to be a spectacular improvement on Mr Johnson, although they can’t be much worse. Unless we get Gavin Williamson. Or Priti Patel. Or… I’ll stop now before it gets too depressing.

(3) The USA to suffer its Suez moment

The Suez Crisis of 1956 is generally thought of as the moment when the UK was obliged to recognise that it was no longer as big a force in the world as it had been. Essentially, the Egyptians had nationalised the Suez Canal – which was an entirely legal act – and we decided that we would relieve them of it, with the assistance of France and Israel. We failed to get the permission of the USA to do this, and were forced to desist.

For a very long time, the USA has been accustomed to throwing its weight around in foreign affairs, replacing national governments as it saw fit. After the collapse of the USSR, there didn’t seem to be anyone who could stop them doing whatever they wanted. This led to a somewhat euphoric period, summed up in Karl Rove‘s notorious declaration: “We’re an empire now, and when we act, we create our own reality.”

Those who create their own reality sooner or later collide with some solid object that disillusions them, and this is what I think will happen to the USA next year. Foreign adventures are the traditional way for a precarious regime to cement national solidarity – this was why Argentina invaded the Falkland Islands in 1982, for instance. It is pretty generally agreed that the USA’s internal affairs are in a somewhat parlous state right now. They are likely to try and push their luck, and my guess is they will embarrass themselves.

What form will that take? There are plenty of hot-spots in the world where Uncle Sam might choose to plant his size nines. The South China Sea suggests itself. Perhaps one or other of the Gulf States might implode. But the USA will try and cross a line, and either Russia or China – or Russia and China acting together – will tell them no. And they will find themselves having to take no for an answer.

This will come as no surprise to anyone outside the USA and will cause complete bafflement and consternation within it, as 9/11 did. Of course I could be wrong about this happening in 2021, but I don’t think I’m wrong about it happening some time soon. We shall see.

(4) Covid-19 to rise again after victory has been declared

Governments everywhere will be keen to claim that the whole Covid-19 thing is now under control, thanks to their brilliant handling of it, and that we can all get back to the serious business of creating shareholder value. At least one of them is bound to declare this prematurely, and another major outbreak will ensue.

I won’t state categorically that it will be the UK government that does this, but I wouldn’t bet against it either on current form. It’s likely to occur in a country that has suffered heavily from the virus, because that’s where the most points can be scored for “defeating” it: the US, China, Spain, France and Italy are all candidates.

I am not saying that the pandemic will go on forever. Pandemics don’t. Nobody developed a revolutionary vaccine against the Black Death, but it’s no longer a major problem. This is more about some government claiming to have overcome the virus and then being proved embarrassingly wrong.

When I come to assess this one next year, much will turn on the vague phrase “major outbreak” – I’m sure there will be at least one unambiguous claim of victory.

(5) Another major global financial crisis will hit

This is a matter of when, not if, since little was done to address the systemic issues in the global financial system that were so cruelly exposed in 2007-8. Which domino will fall first is anyone’s guess. The Italian banking system has been a disaster waiting to happen for some time, and I doubt that the pandemic has helped the situation. Deutsche Bank is also in less than perfect health.

We could also be looking at a currency crisis: sterling, or (heaven forfend) the US dollar could come under pressure. Money is being created hand over fist to prop up industrial economies in the face of the pandemic. Certainly the UK government has been throwing it around like a sailor on shore leave, having apparently discovered the elusive magic money tree. The Federal Reserve in the US has also put in eye-wateringly vast sums.

Plenty of major national economies have been flying on one engine for a while. Brazil is in serious trouble. China might be, as nobody really believes the official government statistics. Lord knows what will happen to the UK, but it’s not going to be pretty. If even a second-tier economy has to default on its international debt obligations, something somewhere in the financial system is likely to break.

Nobody really knows how long all this can keep going. It’s entirely possible that I’m wrong about the timing and 2021 will not turn out to be the moment Wile E. Coyote finds out he has run out of cliff. But if not next year, soon enough.


So there you have it: my five cheerful prognostications for 2021. I don’t expect to be right on all of them. Let’s reconvene in twelve months’ time and see. Tell me what you think in the comments!

Comments are welcome, but I do pre-moderate them to make sure they comply with the house rules.

On festivity

“If Winter comes, can Spring be far behind?”

Percy Bysshe Shelley, “Ode to the West Wind”

The day before this post was published, up here in the northern hemisphere we had the 2020 Winter Solstice, the shortest day of the year. At these latitudes, the day doesn’t get noticeably longer until December 25th, which not coincidentally is also the birthday of Sol Invictus. He doesn’t get much airtime these days, but he was Constantine the Great’s deity of choice until he eventually decided to swap over to the other guy with the same birthday. (Or rather, the other guy who was assigned the same birthday; the historical Jesus probably wasn’t actually born on 25th December.) This is why we have Christmas, although I always raise a glass to Sol Invictus.

Christmas is one of the few fixed points of celebration left in the calendar, at least here in the UK. Back in the day, there were so many that where we would specify a date by the day of the month, people would simply reference the nearest feast-day, as for example in the title of Keats’ poem “The Eve of St Agnes” (which by my reckoning would refer to the 20th January). But these days, thanks partly to the Reformation, very few of these days are celebrated or even remembered. There’s Bonfire Night, Remembrance Day and Easter – which isn’t all that fixed – and apart from some lukewarm awareness of the various national saints’ days that’s about it.

I’ve mentioned the Reformation, but even without its help I think we would have seen this effect at some point as industrial culture developed. One of the tenets of that culture, which is so deeply ingrained that it’s hard to notice, is that things come in standard interchangeable units. This works pretty well for screws, but it gets applied to everything else, including time. One day is therefore to be the same as every other day. This is why heroic efforts are made by the food industry to erase seasonal eating, and you can buy strawberries in December, or at least something which is botanically a strawberry. This is why we have lavish artificial lighting, so that the length of the day is barely noticeable. When we do have a feast-day that can’t be ignored, its time-period is ludicrously extended so that it ceases to be a special day and becomes more like six months. As the saying goes, you can tell it’s Christmas because the shops are full of Easter eggs.

Part of this, of course, is just to sell us more stuff. Presumably there are some people who want to buy tinsel in July, or who can be convinced that they do. But I wonder if there’s something more insidious at work.

For reasons that may well be material for another post, I often find myself comparing life in the UK to life in Spain. Spain, of course, missed out on the Reformation, but large parts of it are also comparatively unscathed by the Industrial Revolution. There are notoriously many public holidays in Spain, many of them saints’ days, and many of them local. The smallest political unit in Spain is the parish, and of course every parish has a patron saint, and that saint’s day will be the occasion of a fiesta. In some places, that can be a major affair that lasts for several days.

I don’t think it’s a coincidence that the Spanish, in my experience, are much more community-minded than the British. A Spaniard is never just a Spaniard. They have a place and a heritage and an identity beyond that. They will have opinions – not necessarily complimentary – about Spaniards of a different place and heritage. Spanish car registration numbers used to include a code indicating the province of issue; when the government changed over to a new scheme without them, people demanded to have them back again. In the UK, car registration numbers also used to include a code showing where the car was registered; this scheme went away in 2001, and I had to look that up because it happened without a murmur.

So a sense of time can also be a sense of place, and neither of those conforms easily to the uniformity of industrial culture. These things tie us back to the natural order, to the passage of the seasons. They can also lead us into a dangerous localism where things like food are concerned. As Charles de Gaulle is said to have lamented, “How can you govern a country which has 246 varieties of cheese?” (I suspect this is a considerable underestimate of the number of French cheeses.) If today is different from yesterday, and here is different from there, and one person is different from another, suddenly it becomes extremely complicated and difficult to manage the world.

(It’s another deeply-ingrained assumption that everything needs to be managed, but that is definitely matter for another post.)

Awareness of the seasons is something everyone always had, at least in those parts of the world such as this which have seasons. Of course different places divide the seasons differently – there are at least three recognised seasonal schemes just amongst Aboriginal Australians – but the key thing is that time is not uniform and that people need to be aware of that.

To every thing there is a season, and a time to every purpose under the heaven:

A time to be born, and a time to die; a time to plant, and a time to pluck up that which is planted;

A time to kill, and a time to heal; a time to break down, and a time to build up;

A time to weep, and a time to laugh; a time to mourn, and a time to dance;

A time to cast away stones, and a time to gather stones together; a time to embrace, and a time to refrain from embracing;

A time to get, and a time to lose; a time to keep, and a time to cast away;

A time to rend, and a time to sew; a time to keep silence, and a time to speak;

A time to love, and a time to hate; a time of war, and a time of peace.

Ecclesiastes 3 i-viii (King James Version)

Traditionally, this is a time to close ranks, to defy the cold and the dark, to affirm life at the time that people have always been most likely to die. (In Old English, people’s ages were usually expressed as so many winters.) The days are getting longer, the sun is getting stronger, and spring is on its way. Feasting in the middle of winter is an act of collective bravado when you are completely dependent on the food you have managed to store up. It’s a morale-booster at the time that morale is likely to be lowest. People need this kind of thing.

Christmas has of course been hugely commercialised. We now have the secular feast of Black Friday, also known as Buy Nothing Day amongst those who object to such commercialisation. I have some sympathy with that view. Personally I don’t have enough disposable income to wallow in consumerism, and wouldn’t do so if I could, but I am still going to mark the season as best I can.

I find it an encouraging thought – and goodness knows they’re in short enough supply nowadays – that although we can conceal the march of the seasons from ourselves with central heating and artificial light, it still goes on despite our best efforts. The world is bigger than us, bigger even than Walmart, and it’s going to keep on doing what it does.

Whatever you’re doing, or not doing, for Christmas, spare a thought for Sol Invictus, the Unconquered Sun. It really will be spring again, eventually.

Comments are welcome, but I do pre-moderate them to make sure they comply with the house rules.

On food

Eat food. Not too much. Mostly plants.

Michael Pollan, In Defense of Food: An Eater’s Manifesto

Before we start, I want to make one thing clear: this is not a post recommending that you should become vegan. I have no problem with people being vegan if that’s how they want to eat and they can source a healthy diet from local and sustainable ingredients. If that describes you, great. It is not, however, something I would recommend to everyone. Generally there are very few things I would recommend to everyone, because individuals are different.

I reference food a lot on this blog because it’s something we all have in common. Everything that lives requires nourishment. There are two aspects to this that I want to discuss:

  • quantity – you need access to enough food to keep you alive, and
  • quality – you need your food to nourish you so that it keeps you healthy.

The first point is pretty obvious. There are various recommendations as to how many calories a person needs per day, depending on how energetic their lifestyle is (and many lifestyles are likely to become a good deal more energetic in the future). But the second point is that not all calories are created equal.

This is also fairly uncontroversial. If we imagine two people who get exactly the same number of calories, but one of them is getting their calories from a balanced diet while the other one only eats chocolate-chip cookies, I don’t think we need to be Nostradamus to forecast which of them will be healthier.

On the other hand, there’s a fair amount of wiggle-room in that happy phrase “a balanced diet.” Human beings are omnivores, and they can adapt to a surprising range of diets. The Inuit, for example, contrive to be healthy while eating heroic quantities of fat and hardly a vegetable of any kind – not surprising when you consider their environment. Then you have the Maasai of Kenya, who traditionally live off their herds of zebu and consume a lot of milk, blood and fat. They also thrive on it, in spite of eating levels of cholesterol that would give the average dietitian a panic attack.

Now there is probably a genetic component to some of this, but that reinforces my earlier point that individuals are different and one size will not fit all. I don’t know if there are any vegan Inuit, but if there are they may well struggle with such an alien – to them – diet. (Do get in touch if you are, or know, a vegan Inuit; I’d be interested to know if my guess is correct.) But you don’t need to be pre-adapted to an extreme diet to have particular dietary needs.

Consider the various allergies and food intolerances that plague so many people today. This would appear to be a comparatively recent phenomenon, and it’s hard to resist linking it to the rise of industrial food production. We know from the history of infant formula, for example, that what was supposed to be a scientifically correct replacement for a natural food turned out to be seriously deficient (missing essential vitamins in this case). I would be wary of believing that we have a comprehensive grasp of human nutritional needs even today, especially given the degree of individual variation that I’ve already mentioned.

I would love to see research done on the incidence of gluten intolerance in those who eat traditionally baked and thoroughly leavened bread compared to eaters of the curious bread-like substance produced via the Chorleywood process (that’s your usual supermarket loaf). There are already widespread concerns about its nutritional value. If you really want to be put off processed foods, I can highly recommend Joanna Blythman’s book Swallow This (Fourth Estate, 2015). After all, the phrase “processed food” itself means “food to which something or other has been done, which we aren’t telling you about” and if that doesn’t bother you, well, it probably should.

There’s also evidence that our food has become less nutritious over time, due to the depletion of agricultural soils. Plants are extraordinary things, but you can’t expect them to conjure nutrients out of nothing. I’ve read that there are some (US-grown) oranges which contain no vitamin C at all. This can’t be a healthy development.

In the last century, a Canadian dentist called Weston Price did some research into traditional diets versus the typical Western diet of the time. Originally he was interested in the effect on dental health but he later widened his focus to consider overall health. He was able to correct for genetic variability by comparing members of the same tribe – even siblings, in some cases – some of whom had moved into town and were eating a modern diet and some of whom were still eating the traditional way, whatever that meant for them.

His conclusion was that people did far better on the traditional diet, widely variable as that might be. Those who moved to the Western diet tended to suffer from chronic health problems that had previously been rare or unknown amongst them, such as diabetes, hypertension and heart disease. These problems are of course rampant throughout the industrialised world. It’s hard to see this as a coincidence.

Now if you suffer from a chronic health condition (full disclosure: I am myself a Type 2 diabetic) you will be dependent to some extent on medication. The pharmaceutical industry probably won’t vanish overnight, but it’s still a potential issue in an uncertain and probably troubled future. If you could control your condition through diet, that issue would go away. As Hippocrates is supposed to have said (presumably in Greek): “Let food be thy medicine, and let medicine be thy food.”

This is all well and good, but what can we do about it in practice? Well, let’s return to the quotation from Michael Pollan at the head of this article:

  • Eat food – what Pollan means by this, as he explains in the book, is to avoid what he calls “edible food-like substances”, i.e. most of the stuff you will find in the food aisles of your local supermarket. I would expand this advice to suggest eating food with the fewest possible steps between you and the producer: grow it yourself if you can, or get it direct from the grower (proper farmers’ markets are always worth patronising), or at least get it from a local greengrocer or butcher where you have some chance of finding out where it came from. When we lived in Kent, we used to shop, when we could, at a small independent butcher with their own slaughterhouse; they claimed that all of their meat came from within a one-mile radius of the shop. Such places are all too rare, but they exist. Seek them out.
  • Not too much – prioritise quality over quantity. You will spend more on food, but you will also get more for your money. There is an obsession with cheap food, especially in the UK. If you can buy a chicken for £3.50, a lot of corners have been cut in the production of that chicken. Buy less other stuff that you don’t need, and spend the money you save on food. It will change your life.
  • Mostly plants – if you have to eat industrially produced food, you will do yourself less harm with fruit and veg than with factory-farmed meat. I was a vegetarian for many years, and it’s a lot easier now than it used to be. If you can’t afford to eat meat every day, because you’re buying the good stuff, well, you don’t have to.

Being conscious of what you eat will also lead to enjoying it more. Food can be one of life’s great pleasures. We treat it as if it were an inconvenience. If you don’t know how to cook, learn. I recommend concentrating on the food eaten by peasants, in whatever cuisine you fancy; that will give you cheap, simple, but nutritious meals. If you have good quality ingredients, you really don’t need to do much to them to create a good meal.

Moreover, getting closer to the practicalities of how food is produced – ideally growing a bit yourself – will have you in a much better place in the event that you need to provide for yourself in the future. There’s a hilarious scene in the film Withnail and I in which two clueless young men are faced with trying to turn a live chicken into dinner. If you find yourself in that position, you really don’t want to be those guys. There are plenty of books – John Seymour’s The New Complete Book of Self-Sufficiency is the classic – but there’s no substitute for practical experience.

Now I’m not saying this is necessarily easy. The way we live now is not set up to support this kind of eating, and that’s not an accident. Where I live now, my food-buying choices are two supermarkets and a small weekly market which sells some fruit and vegetables of unknown origin. A lot of people are in that position. In modern society, people tend to have either time or money, but rarely both together. You’re going to have to work around that in whatever way you can, with the resources you have where you are.

But food is not a luxury. Fifty years ago, people in the UK used to spend a far greater proportion of their income on food than they do now. I was alive then, and there definitely weren’t food riots. It’s a question of priorities. At the very least, be informed about what you eat, and make your choices deliberately. You’ll learn a lot, and maybe it will change the way you see the world.

Comments are welcome, but I do pre-moderate them to make sure they comply with the house rules.

On putting people into boxes

Either you are with us, or you are with the terrorists.

President George W. Bush

There is much talk nowadays about polarisation, in politics and in society more generally. This is often blamed on social media, and the algorithms that tend to show you more of the same rather than anything likely to challenge your preconceptions, and there is undoubtedly some truth in this. If you never have to engage with someone who disagrees with you, or even has a different perspective on the world than you do, then you are unlikely to change your mind about anything.

I suspect, however, that it goes deeper than this. Industrial culture is built upon the Abrahamic religions, and they tend to be so saturated in notions of good versus evil that it is almost invisible to us. It isn’t at all necessary to see the world in this way; classical civilisation didn’t, for instance. Aristotle saw virtues as character traits rather than abstract things in themselves, and he defined each virtue as the mean between two vices. Courage, for example, he takes to be not the opposite of cowardice but the mid-point between cowardice and recklessness.

We don’t think like this. Instead, we think of virtues and vices (if we think of them at all) as pairs of opposites. When we define a thing, we often do so in terms of what it isn’t. We like nice clean edges around things. For example, we like to define languages by compiling grammars and dictionaries, even though in practice languages are always changing and always more complicated than any fixed definition. We all use words that don’t yet appear in any dictionary, and yet language continues to function perfectly well. We can still understand what Yoda says even though his word-order is quite different from regular English (a language in which word order is generally pretty significant, compared to Latin, say). Nevertheless, we all pretend that language is a fixed thing, and we have solemn debates every year when some new word appears in a dictionary as to whether it should do so or not.

In the quotation at the head of this post, Bush was invoking the familiar trope of good versus evil, us versus them. Of course, as social primates we naturally tend to divide into the in-group and the out-group, but that doesn’t necessarily have an ethical implication. The ancient Greeks, for example, designated all non-Greek speakers as barbarians, but that didn’t stop them from treating them as human beings and (at least in the case of Egyptians) admiring them. The good-and-evil thing is an extra layer on top of basic primate behaviour.

Bush also initially described his “war on terror” as a crusade (I suspect someone quickly took him to one side and explained how that would go down in the Middle East). The original crusades were very much framed as a campaign of the good Christians against the evil Saracens, explicitly carrying out the will of God. This kind of thinking is by no means confined to the medieval world; God was claimed to be with both sides in the First World War, for example.

In his remarkable and fascinating book The Master and His Emissary (Yale University Press, 2009), Iain McGilchrist explores the right and left hemispheres of the brain and how they collaborate to give us our picture of the world around us. Many of us have a vague notion of how this works, but McGilchrist presents a more nuanced view grounded in both current neuroscience and philosophy. I won’t attempt to summarise it here – I’d urge you to read the entire book – but I’d like to quote a few choice passages from his conclusion:

Let us try to imagine what the world would look like if the left hemisphere became so far dominant that, at the phenomenological level, it managed more or less to suppress the right hemisphere’s world altogether. What would that be like?

We could expect, for a start, that there would be a loss of the broader picture, and a substitution of a more narrowly focussed, restricted, but detailed view of the world, making it perhaps difficult to maintain a coherent overview. The broader picture would in any case be disregarded, because it would lack the appearance of clarity and certainty which the left hemisphere craves….

Expertise, which is what actually makes an expert (Latin expertus, ‘one who is experienced’), would be replaced by ‘expert’ knowledge that would in fact have to be based on theory….

… The world as a whole would become more virtualised, and our experience of it would be increasingly through meta-representations of one kind or another….

Numbers, which the left hemisphere is familiar with and excellent at manipulating (though… it is less good at understanding what they mean), would come to replace the response to individuals, whether people, places, things or circumstances, which the right hemisphere would have distinguished. ‘Either/or’ would tend to be substituted for matters of degree, and a certain inflexibility would result.

… There would be a preoccupation, which might even reach to an obsession, with certainty and security….

Reasonableness would be replaced by rationality, and perhaps the very concept of reasonableness might become unintelligible. … Anger and aggressive behaviour would become more evident in our social interactions…. There would be a rise in intolerance and inflexibility, an unwillingness to change track or change one’s mind.

McGilchrist, op. cit., pp. 428ff.

As you might guess, his view is that this is where we are collectively headed, if we haven’t arrived there already, and it’s hard to argue with this. Certainly a world in which it is considered sensible to declare war on an abstract noun, as Bush famously did, is well along that path.

Industrial society is a highly technical and specialised society, and it relies on highly technical and specialised people. The left-hemisphere mindset, in which one focuses on a small tightly-defined area of knowledge, is rewarded. Of course, life being what it is, one has to deal with lots of stuff outside this area of knowledge, and the smaller and more tightly-defined (and abstract) one’s area of knowledge, the more such stuff there is left over.

We therefore depend more and more on experts in those other domains, assuming that such people exist, have the knowledge that we need, and can be trusted to give it to us. We also like this information to be provided in the form of a sweeping generalisation, because that encompasses the maximum amount of stuff with the minimum of mental effort on our part. If we believe, for example, that all Jewish people are evil, we are absolved from any need to deal with the particularities of any individual Jewish person we may happen to encounter. This is quite the time-saver, although the downsides are obviously severe.

Politically, this leads to the much-lamented situation in which people of differing views – and in this vein of thinking, these will be taken as opposing and mutually exclusive views – no longer speak to one another. After all, there is the One True View – Deus lo vult! – and all the others are stupid and/or evil, and even to converse with the heretics is to become contaminated. We know where this kind of thing can take us, because Maoist China has already shown us.

As far as I know, there has never been a time or place in which people have categorised one another so elaborately. Of course, there have been things like the caste system in India and elsewhere, and the concept of the three estates in medieval Europe, but nowadays we have an entire scheme based on generations whereby everyone born between two dates is Generation Foo and therefore can apparently be treated as a unit, along the lines of the Chinese zodiac. There are vast enterprises whose business is the analysis of online data to determine who we are. Your personality can be classified on various axes – there are online tests that will place you on the Myers-Briggs index, to take an example almost at random. Pollsters state, with greater confidence than accuracy, how people will vote. The failure of Hillary Clinton’s presidential election campaign in 2016 was a particularly farcical example of blind faith in statistics, although the Vietnam War offers another precedent.

Reality is never tidy, though, and the more you know about any particular person, place, thing or circumstance the less it will seem to fit neatly into any preconceived scheme. Consider left-handedness. There are plenty of sweeping generalisations about it, but I have no idea to what extent any of them apply to me, because I am left-handed at some things (writing, eating with a spoon) but right-handed at others (playing the guitar, eating with a knife and fork). I’m sure you can multiply examples from your own experience.

A person is more than a collection of facts. Indeed, it’s hard to think of anything worth knowing about that is just a collection of facts. We get irritated with computer databases precisely because they are collections of (supposed) facts, and just those facts and no others. All bureaucracies necessarily partake of this species of idiocy. If we can free ourselves from this tendency to put other people into boxes, it will be the first step to having meaningful conversations with them, and who knows where that might take us?

Comments are welcome, but I do pre-moderate them to make sure they comply with the house rules.