Once upon a time, our ancestors lived almost entirely in the moment. The evidence for this is simply the fact that they had descendants, and thus you and I are alive today. The early hominids who were too busy thinking beautiful thoughts to notice the leopard in that tree over there tended not to last very long. Darwinism in action, you might say.
Of course, the “almost entirely” bit is the interesting part. In his magisterial book The Master and His Emissary (Yale University Press, 2019), Iain McGilchrist writes at length about the two modes which the human brain has for understanding the world: a highly structured ‘close-up’ view, mostly associated with the left hemisphere, and a broader ‘big-picture’ view, involving the whole brain but largely driven by the right hemisphere.
In order to achieve the ‘close-up’ view we need to discard a lot of information. In much the same way that a map omits details of the territory it depicts, we adopt mental models that abstract away a lot of the world’s complexity. Which parts we abstract away, of course, will depend on what we are trying to do: maps made for different purposes will show different things. A road atlas will give you quite a different view of London than a map of the Underground system, for example, and a map showing average rainfall in England will be different again.
You can get quite a long way by looking at the world close up, and I’m not just talking about inscribing the Lord’s Prayer onto a grain of rice. The scientific method, in its purest form, is an example of this approach. “Controlling a variable” in an experiment is really a means of making that variable go away. Ideally, you want to be studying only one thing at a time. This is why well-conducted clinical trials of drugs are so expensive (and hence why they tend not to be well-conducted in practice); you need to control for lots of variables – age, sex, ethnic group, and of course the placebo effect. Only when you have eliminated all the possible confounding factors can you be confident that drug X has effect Y, and also doesn’t kill more than Z% of patients.
As the saying goes, all models are wrong but some models are useful. Newton’s laws of motion are “wrong” according to our current notions of physics, but they’re good enough to put men on the moon. The map of the London Underground will reliably steer you from Knightsbridge to King’s Cross, although it won’t show you your height above sea level, or any water-courses smaller than the Thames. And if all you care about is getting from Knightsbridge to King’s Cross that’s fine. What that map is not, however, is a complete representation of London as it actually is. No map could be. That’s the point.
The industrial world is very much built around these abstractions. A sign of this is how obsessed we all are with statistics. Government spokespersons are forever releasing mounds of them, on everything from juvenile crime to foreign trade to the cost of living. There’s something fitting in the fact that our latest and greatest artificial intelligence systems are statistically based, which is why, for example, chat-bots trained on social media posts turn out to be rude and bigoted. (Who knew there were lots of rude and bigoted posts on social media? Thanks, science!)
Another example of this tendency is our increasing reliance on technology to mediate our experience of the world. Many people seem to be unable to believe they are seeing something unless they are filming it on their phone. Viewing the film of it afterwards, and/or posting on the Internet, is somehow more real to them than the thing itself. A sane society would consider this to be pathological.
I wonder how such people will cope when the “cloud” where they fondly imagine all this stuff is stored forever goes away – as it will, because the ethereal-sounding “cloud” is in reality a very earthly set of data-centres, which use an awful lot of valuable raw materials and electricity, and at some point it will be decided that those raw materials and electricity would be better used for something else. Future historians will no doubt consider this another Dark Age, for much the same reason as the previous one was so called, namely that not much got written down.
Furthermore, as McGilchrist points out, systematically throwing away the bulk of the available information is likely to bite you in a tender part of your anatomy. That leopard from the first paragraph might not be required for your model of the world, but you could easily find out the hard way that perhaps it should have been. It is pretty obvious that our inability to see the bigger picture is at least in part responsible for the fact that the bigger picture is currently looking so bad.
Take carbon emissions. If a nation decides to offshore the bulk of its industrial production to China and/or the Third World, as many of the industrialised western nations have, then it can claim to have lowered its carbon emissions. But from the atmosphere’s point of view, it is completely immaterial whether a given molecule of CO2 was emitted in Belgium or Bangladesh, Wuhan or Washington. The physical effect will be the same.
But the average Joe filling up his SUV isn’t thinking about that. We’re all making decisions, all the time, based on what we see immediately in front of us. I don’t personally know the twelve-year-old girl on the other side of the world who made these trainers. What I can see is the price-tag.
And we’re always trying to simplify the world to the point where we can understand it. If you live in a city of ten million people, probably the best you can do is to put almost all of them into one stereotyped box or another. In doing so, of course, you essentially stop treating them as if they were people. As with all abstractions, it gets between us and our own lives. Life is a set of interactions with the world or it is nothing.
The provocative subtitle of E.F. Schumacher’s classic book Small is Beautiful is: “A Study of Economics As If People Mattered.” Industrial civilisation has reached the point, which was perhaps inevitable, where people don’t matter. They can’t matter. There are just too many of the blighters. You can treat them as data-points and feed them into a statistical model, and that will give you some results, which may or may not correspond to reality.
Chances are, though, that that statistical model doesn’t allow for leopards.
Comments are welcome, but I do pre-moderate them to make sure they comply with the house rules.