Gaslighting the Woke Cancel Culture

If you’re trying to work out what “gaslighting the woke cancel culture” means, you can stop now: it’s just a bunch of words I threw together that sound as though they ought to mean something but actually don’t.

People have been complaining that political discourse has become increasingly meaningless ever since George Orwell wrote “Politics and the English Language”; in fact, I’m sure that back in the Roman Republic, senators were complaining that res publica had drifted away from its original meaning (“the public thingummy”, which just goes to show that original meanings aren’t necessarily more precise). Words have a kind of half-life: gradually the meaning leaches out of them and they become milder versions of themselves, turn into semantically empty place-holders, or, like uranium turning to thorium, transform into something different altogether, as “naughty” did in changing from “poor” to “nihilistic” to “mischievous”. Some words, particularly grammatical words like “the” and words for everyday objects like “house”, have very long half-lives; political words, however, decay much more rapidly, and social media has accelerated this.

Gaslighting

“Gaslighting” is an example of a once-useful term that rapidly shed meaning as it became more generally used. It refers to the film noir classic, Gaslight, in which a man convinces his wife that she is going insane by dimming the house lights whenever he’s away (his aim being to institutionalise her and steal her money). The catch is that this, along with the mysterious noises in the attic that he also arranges, does actually damage her mental health. Gaslighting thus originally referred, in the words of the APA, to “manipulation so extreme as to induce mental illness or to justify commitment of the gaslighted person to a psychiatric institution.”

In the early 2010s, the word broke onto social media and morphed accordingly. It started to be used to refer to more general social propaganda designed to make oppressed groups doubt the reality of their oppression and attribute their distress to personal failings or even mental illness. Now this, I think, is a clever metaphorical extension of the term that actually makes it more useful. While few people actually try to get their family members incarcerated in mental hospitals, powerful people often try to convince the less fortunate that there is really nothing wrong, and if it doesn’t look like that to them, they must be weak, lazy, or crazy. Unlike the Soviet Union, Western countries can’t lock up dissidents in mental hospitals, but they can do their best to make out that people are unhappy because of their “mental health issues” rather than because, say, they are working long hours in poor conditions for wages that don’t allow them to feed their children properly. It’s like a twenty-first century take on the old idea of the feckless poor merged with the “hysterical woman” trope of early psychiatry and the post-war tendency to view any unusual behaviour as pathological.

Of course such a useful concept was bound to have a short half-life, and before long it was being applied to any kind of propaganda and even to sincere differences of opinion. Somebody who genuinely believes that capitalism is the best way to make everyone wealthy or that traditional families are ordained by God is not gaslighting you; they’re arguing with you, and you would do well to argue back rather than accusing them of trying to make you doubt your soundness of mind. By the second half of the decade, the word was not only being used to describe propaganda for the status quo, it was being hurled at progressives who dared to say that things were not quite as awful as they looked. (Many on the left are stuck in the Christian/Marxist attitude that things have to get worse before they can get better, but that’s something for another article.) For me the turning point was 2016, when I saw a tweet from someone who complained that a friend was gaslighting her by saying “It’s going to be OK” when Trump got elected. This was not an attempt to convince the tweeter that everything was peachy and she was mentally ill if she didn’t think so, but rather a clumsy attempt at reassurance that backfired. “Don’t gaslight me!” is becoming a cypher for “Don’t burst my bubble of gloom.” And this points to a deeper, and probably insoluble problem, which is that we often don’t know whether a problem lies in the world or within us (or both). Just as we shouldn’t blame ourselves for the world’s problems, we shouldn’t warp our view of the world to fit our own problems.

Woke

Historical linguists like to observe the processes whereby good words go bad and bad words are rehabilitated; it’s a kind of semantic criminology. I mentioned an example of the latter case, known as “melioration”: when a Shakespearian character is described as “a naughty man”, it means that he has no moral principles whatsoever (the word comes from “naught”, like “nihilistic” comes from “nihil”, i.e. “nothing”), but now “naughty but nice” is a cliché. The former process, known as “pejoration” (think “pejorative”), can be seen with the word “woke”.

Waking up as a metaphor for becoming aware goes way back — think of the way “awakening” is used as a synonym for enlightenment, or of Bach’s wonderful “Sleepers Awake” cantata, a setting of a hymn based on the parable of the ten virgins. In the nineteenth century it got more political; one of the many versions of the Internationale starts “Arise ye workers from your slumbers.” And in the mid-twentieth century, “woke” was used by American trade unionists and civil rights activists to mean aware of oppression, and racism in particular.

So far, so good. But then it was taken up as a slur by the right and its semantic content just dribbled out onto the floor, in much the same way as happened to the word “fascist” in the 1960s. Now even people I know who aren’t particularly right-wing are using “woke” (usually in scare quotes) to mean something like what “politically correct” meant a few years back. The difference is that although the term “politically correct” originated on the left, it was pretty much always used as a joke — I suppose there may have been a few Stalinists who used it with a straight face back in the day, but it was mainly just us lefties poking fun at ourselves and each other. I guess conservatives didn’t get the joke.

The odd thing is that when people throw around words like “woke” or “politically correct”, they often have no idea what they are talking about, but in spite of this they are often trying to grasp at something which is quite real. There is no “wokeness” or “wokism” in the sense that rabid right-wingers use the word (i.e., a huge conspiracy by feminists, cultural Marxists, transsexuals and tofu-eaters to destroy Western civilisation), but there is a tendency among progressives to focus more on form than substance, and to enhance their own standing by catching others failing to observe those forms. Progressives don’t do that because they’re progressives; they do it because they are human, but it’s still annoying, and we could do with a word for it. I rather liked Tim Ferriss’s term “bigoteer”, but it never took off.

Cancel Culture

Back in the 1980s, I went to a meeting of the Leeds University Debating Society where the Holocaust-denying historian David Irving had been invited to speak. Things were about to get ugly, with our generation’s version of Antifa wanting to throw him out, and the Young Conservatives (who were a considerable force on campus then) wanting to throw the protestors out. Then the president of the Student Union came up with a clever idea. According to the constitution of the Debating Society, all members of the Student Union (i.e., all students) were members of the Debating Society, and therefore had the right to uninvite anyone that they had invited, so we did. And lo and behold, I became part of cancel culture, decades before the term arose.

Now you can argue the pros and cons of what we did to the poor little Hitler apologist (especially since no one knew quite what a Nazi he was then), but it’s a debate that has been going on for a long time. If we’re talking about the law, I go along with John Stuart Mill, who argued that no opinion should be suppressed except where it would lead to violence (and it is pretty clear that Nazi propaganda leads to violence, which is why Irving eventually wound up in an Austrian prison). In contrast, private organisations and companies can give or deny a platform to anyone they choose. There are, however, intermediate cases like public institutions and oligopolies where things get muddy. Unlike some of my friends on the left, I do think there is a phenomenon that could reasonably be called “cancel culture”, in which unpopular views are suppressed rather than debated and attempts are made to remove the people expressing them. When well-known figures on the left like Noam Chomsky and Margaret Atwood warn you, it’s worth sitting up and paying attention even though you may not agree with everything they say. If someone insults or blocks me on social media for being, in their mind, a transphobic radical feminist or a misogynistic trans activist (both of which have happened), I just laugh, but when this happens in a university to someone who could lose their job as a result, it gets more serious. Related to the bigoteering mentioned earlier, there is a real problem with people making social and political capital out of accusing others of bigotry, and sometimes this has more serious consequences than people blocking each other on Instagram. (Note that this is not just a problem on the left — if you want to see real cancel culture in action, check out the House Un-American Activities Committee.)

So yes, “cancel culture” was briefly a useful term. But of course conservatives have to spoil everything, and in the blink of an eye it was being applied to anyone calling anyone out for anything. The idea that it is usually better to debate opposing views than suppress them has morphed into the idea that we are obliged to listen to people spouting abuse. The move to defend Donald Trump’s “freedom of speech” (i.e., his Twitter account) is a case in point. The man incited a riot that led to deaths, which is not the same as soberly expressing a view that there may be a problem with the voting system. To return to John Stuart Mill: “An opinion that corn-dealers are starvers of the poor, or that private property is robbery, ought to be unmolested when simply circulated through the press, but may justly incur punishment when delivered orally to an excited mob assembled before the house of a corn-dealer.”


So there you go, we’ve gaslit the woke cancel culture, or something like that. At different times and in different groups, the same words can mean different things, or nothing much at all. As Wittgenstein said, “Don’t ask for the meaning; look for the use.”

What I’ve Been Up To

What I’ve Been Listening To

I’ve been listening to a lot of Alevi music recently. For those of you who aren’t familiar with Alevism, it’s a Turkish version of Islam, so named because of its followers’ devotion to Ali, the son-in-law of the Prophet Mohammed and fourth Caliph according to Sunni Muslims, and the first Imam according to Shia Muslims. Personally I am not into any caliphates or imamates, but I’ve always had a soft spot for Alevis – they’re like Muslim Social Justice Warriors, and I mean that in a good way. They also mix in a lot of Asiatic Turkish culture — if you look at their ceremonies (semah) they look decidedly shamanic.

[Note: This is far too deep a topic to go into here — one Alevi girl I talked to said there was very little in common between Alevism and its Syrian neighbour Alawism, while some scholars have claimed that the term “Alevism” was a nineteenth-century invention, and that before that there were only a variety of movements clustered around various spiritual leaders (pirs and dedes).]

Anyway, last week I had a weird dream set in a postapocalyptic Anatolia where a bunch of Alevis turned up and escorted me away from an impending battle between Christians and Muslims – it was like Mad Max meets the crusades. I’m not sure what that is supposed to mean, but here’s some music from the compilation Pirler ve Dedeler.

For some reason, this one from the Ahura Ritim Topluluğu always brings tears to my eyes.

What I’ve Been Watching

I’ve never found Wittgenstein easy, but then who has? I tried reading the Tractatus Logico-Philosophicus in my twenties and it was as indigestible as you’d expect a book with a title like that to be. Much later, I read Philosophical Investigations because it tied in with my MA dissertation (which was in linguistics, not philosophy, but there is a considerable overlap in the case of Wittgenstein) and I’ve also quoted the famous passage about the impossibility of defining games in several papers and blog posts. But Culture and Value, compiled posthumously from his personal notes, is the only one of his works that I can say I enjoyed, and it is quoted extensively in this TV programme I stumbled upon on YouTube:

As it turns out, Wittgenstein’s views on religion, to the extent that I understand them, seem similar to my own (as expressed in “Harry Potter and the Spitting Haredim”). In line with his general advice of “Do not ask for the meaning, look for the use,” Wittgenstein says religious statements are part of a different “language game” from everyday conversation, and also from science. As he puts it in Culture and Value, “I believe that is a German plane” does something different from “I believe in the Father, the Son and the Holy Ghost.” Anyway, I won’t bore you by trying to explain that; just watch the video, which is the clearest explanation I’ve come across so far.

In a lighter vein, I’m enjoying Netflix’s animated series Cyberpunk: Edgerunners. I was won over to the cyberpunk genre when my friend Rodney Orpheus read out the first line of Gibson’s Neuromancer — “The sky above the port was the color of television, tuned to a dead channel” — exclaiming “That is pure Raymond Chandler!” That also shows how quickly cyberpunk dates — who watches television any more? — but it’s a genre that keeps renewing itself. This new series is based loosely on the game Cyberpunk 2077 and is cyberpunk to the core — not just the tech but the characters, the setting and, most importantly, the aesthetic.

If this is not safe for work, you need a new job.

What I’ve Been Learning

For the past month or so I’ve been teaching myself Python. It’s a simple but quite powerful programming language that is as suitable for complete beginners as it is for old dogs like me who want to learn new tricks. I started by working through the excellent book Automate the Boring Stuff With Python, which left me with a basic knowledge and a script for generating random passwords that I use several times every work day (since a lot of my job is resetting students’ passwords). Now I’ve moved on to more fun stuff with Earsketch, a web app designed to help people learn coding and create music at the same time. I’m doing it through Coursera, but really there’s enough instructional material on the site and its YouTube channel to let you plunge straight in. I’m not sure how far I can take this in terms of making music, but like most basic music software, it’s good if you just want to create something that’s built from a bunch of samples like Lego bricks.
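If you’re wondering what such a script might look like, here’s a minimal sketch — not my actual script, and the sixteen-character default length and the character set are just arbitrary choices — using Python’s standard secrets and string modules:

```python
# Minimal password-generator sketch (illustrative, not the script mentioned above).
import secrets
import string


def generate_password(length: int = 16) -> str:
    """Return a random password built from letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


if __name__ == "__main__":
    print(generate_password())
```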

What I’m Up To

What I’m Listening To

After reading a lot of nonsense about “wokeism”, it occurred to me that the same people who complain about people being “woke” were not so long ago saying “Wake up, sheeple!” so do they want us to wake up or not? I’ll leave the answer with J.S. Bach, in this wonderful performance of Wachet auf, ruft uns die Stimme by the Netherlands Bach Society.

What I’m Reading

Future Crunch compiles the best of the world’s encouraging news into an email newsletter that always lifts my mood when it pops into my inbox. The current issue features good news on COVID, AIDS and malaria, falling crime rates (again!), new conservation zones, and much more. Contrary to what you’d expect from following both mainstream and social media, the world really is getting better; if we don’t heat or nuke ourselves into extinction, a wonderful future is possible.

What I’m Playing

Despite having taught courses on games for years, I don’t actually spend a lot of time playing games, simply because I don’t have a lot of time. I try not to miss our regular RPG sessions on Fantasy Grounds, but generally rather than games as such, I focus on gamified apps that help me with my goals, like Duolingo and Habitica. However, I was intrigued by the news that Guild Wars 2 is now free to play on Steam. I’d never played Guild Wars 2 but spent quite a lot of time in Nightfall, which is the most visually beautiful MMORPG I’ve played. GW 2 is similarly pretty but has the same problem that when you’re soloing, you spend a lot of time just walking up to random people and clobbering them while their friends seem not to notice you. It’s like everyone in the Guildverse is totally stoned, so you have to go right up to them and say “Hey dude, I’m gonna fight you!” “Oh, whu … OK, then.” OTOH GW is great for group play, so I’m hoping to set aside some time for that later. The other thing I couldn’t resist was the 2K Megahits offer from Humble Bundle (which unfortunately seems to have finished) that included Civilization VI. I lost the better part of a summer to Civ II – let’s hope this doesn’t eat into my life too much, especially since I’m trying to get in two hours of meditation and martial arts practice a day.

Where I’ve Been

Went to visit our Birmingham campus last Wednesday:

It’s in Digbeth, an interesting area about 20 minutes’ walk from the train station — one of those old industrial areas that got run down then revived rather patchily, so it’s a mixture of light industry, pubs’n’clubs, trendy cafes, schools and arts centres, with some great graffiti.

Streaks

This morning I lost a streak.

I woke up in a good mood, looking forward to a nice laid-back weekend, then I looked at my phone and shit, there it was. A 139-day streak gone.

Scary Duolingo memes are all over the web

Streaks as a way to track and motivate good habits are attributed to Jerry Seinfeld. He claims that he never thought of the idea and never even used the method, but it got into an early issue of Lifehacker and that was that: it’s even known as the “Seinfeld Strategy” now. Because we didn’t have apps in those days, faux-Seinfeld had a big wall calendar and would add a red X for every day in a row that he did whatever it was that he was trying to turn into a habit. He called it a “chain”, but because of the sports analogy, it soon became known as a “streak”, and now it is in thousands of gamified apps, most notoriously Duolingo.

I got into the streaks thing with an Android app called simply Streaks about ten years ago, but have mainly been using Habitica, which is a bit more sophisticated, combining daily habit streaks with more intermittent activities (good and bad) and a to-do list, all wrapped up as a role-playing game with retro bitmap graphics.

The current state of Habitica

As you can see in the screenshot, streaks can be habit-forming – which is after all the idea – to the point of obsession. That there is a 2,385-day streak for “Morning routine”. This is just a few exercises I do on waking, so it’s actually no big deal, but the thought of losing that after so long is kind of terrifying. Probably I’ll keep it going until I hit 2,555, or seven years, then decide enough is enough. (I just caught myself thinking “Wait, I also need to factor in leap years”, which shows I am sliding down the slippery slope to the Quantified Self.) The one I missed is “TQM 60”, meaning a minimum of 60 minutes of taijiquan, qigong and/or meditation, though fortunately its less ambitious sibling, TQM 30, is still going.

Outside this lifehacker geekery, the interesting question is not whether streaks are useful in developing and maintaining good habits — they obviously are — but what to do when you lose a streak (assuming Duo doesn’t come round to kill you and your entire family). Obviously a certain amount of disappointment is in order, something between “Pfft” and burning your house down, but as Tim Ferriss says, “there’s a thin line between doing the work and beating yourself up.”

Could use these for positive reinforcement

So I didn’t actually let myself be miserable for more than a few minutes but instead sat down with a cup of cocoa (my new habit being cocoa with coconut oil for breakfast) to work out a strategy. Should I (a) just start again, (b) give up tracking this habit, (c) start again with a lower target or (d) start again with a higher target? All of these work in some situations. If it’s a typical streak where you go for about twenty days then lose it, it’s best to just start again and not think about it too much. If it’s become pretty automatic but you just lost it because of something unusual, you might want to stop tracking it because actually you’ve succeeded in creating a habit — I just did this with flossing, for example. If you keep failing after a few days, you want to set the bar lower – I recently did this with language learning, where I stopped timing it and just settled for doing a lesson a day to keep the owl at bay. But in this case, I think I’ll go for the counterintuitive one, which is to actually raise the bar. If I keep aiming for 60 minutes a day, then it’s going to be four months before I get back to where I was. I could increase to 90 minutes, but I did that before and managed to clock up 500 days (because pandemic), so what the hell, I thought, let’s go for two hours. Now I probably won’t get very far with this, what with holding down a job and all, but if I can just manage a month, that’ll be something. I’m also going to change the name from “taijiquan, qigong and meditation” to “meditation and martial arts”, partly because I want to add some capoeira drills and partly because it will shorten to “M&Ms”, which will make me smile when I look at my phone.

Time to post this and practice some German before the owl gets me.

What I’m Up To

Here’s what I’ve been doing of late …

What I’m Listening To

Yamma Ensemble are a group who sing in Hebrew and Ladino, the language developed by the Sephardic Jews of the Iberian peninsula (a language I first encountered in Turkey, where many “Sefarat” went when they were expelled from Spain). I find the music eerie, moving and both exotic and familiar (the familiarity largely because of the Turkish influence). Here they are singing in Biblical Hebrew: the hymn to King David from the Book of Samuel:

What I’m Reading

My current bedtime reading is Mirror Gate by Jeff Wheeler, but as it is the second in his Harbinger series, I really ought to talk about the first, Storm Glass. That tells the story of Cettie, an orphan from the slums who is taken in, Oliver Twist style, by a rich and powerful lord. There is also a princess, an evil housekeeper, scheming politicians, ghosts and magical devices. It’s basically a steampunk fantasy, with the lower classes leading Dickensian lives on the ground while their overlords live it up in floating estates that are kept airborne by magic (though liable to crash to the ground if the owner goes bankrupt). A lot of the fun comes from watching how the author mashes tropes from different genres together.

What I’m Watching

Last week was all about The Rings of Power, and I’ve blogged about it already here, so I won’t go on about it again. Instead, I’ll recommend Netflix’s Inside the Mind of a Cat, which I also watched recently. The nice thing about watching cats is that no one gets worked up and makes half-hour YouTube videos about why they aren’t canonical or whether “cat” is a biological reality or a social construct. There again, I’ve always stayed clear of the world of cat breeders, which I suspect can get a little, well, catty.

What I’m Learning

I’ve just discovered a wonderful YouTube channel called Learn Italian with Songs. Since at the moment I’m staying on my ownsome in a detached house, I can belt these out at the top of my voice.

Some Thoughts on The Rings of Power

A few weeks ago, some students on my old fantasy literature course were kind enough to ask for my opinions on The Rings of Power. Now I have watched all four of the episodes currently available, I reckon it is time to answer the question.

When I saw Simon Tolkien’s name as an adviser in the closing credits, I thought, “OK, they need the name, but it would have been better if they’d chosen Tom Shippey.” Shippey is the number one Tolkien scholar and the obvious candidate for the job. Then I started reading the fan commentary, which I had been avoiding as much as possible before watching the show, and found they had hired Shippey, then fired him. Anti-fans say it’s because he objected to Amazon’s “polluting the lore” or their “woke agenda”; calmer voices say it’s because an interview he gave violated a non-disclosure agreement. Really, who knows, but I would have preferred more influence from Shippey in there, not just because he’s a cool guy and my former prof, and not just because his knowledge of the lore is unsurpassed by anyone now that Christopher Tolkien has passed away, but because more than anyone else, he understands where Tolkien was morally and philosophically. This is what I find lacking in the series so far, but as I said, I’m only four episodes into it, so I should cut them some slack.

In terms of recreating Middle-earth, they’ve done a great job visually, though I suspect Peter Jackson’s films provided a more direct influence than Tolkien’s words. With a few exceptions, it looks and feels very much like my idea of Middle-earth in the Second Age, just as Jackson’s films looked very like my idea of Middle-earth in the Third Age. I loved the way Numenor was a bigger, better version of Jackson’s Gondor, Lindon and Khazad-Dum look pretty much how you’d expect, and the proto-hobbits are great, even though there’s no canonical reason for them to be there.

That, however, brings us to the main problem the film-makers needed to surmount, which is that (as Shippey points out in the interview that maybe got him fired) they couldn’t possibly make it completely canonical because they didn’t have nearly enough material to work with. Amazon only bought the rights to the appendices to The Lord of the Rings, so really all they have is a timeline; all the juicy stuff in The Silmarillion and Christopher Tolkien’s novels (based on The Silmarillion and his father’s notes) is off limits. (Irreverent/irrelevant thought: HBO should counter Amazon’s move by buying the rights to Children of Hurin and getting the Game of Thrones team to make it.) They have to make up a lot of stuff, to the extent that it is basically Tolkien fanfic. There’s nothing wrong with fanfic, but it does create certain problems. How much extra material can you create? How much is your voice and how much is an attempt at channeling the original author? What happens when your ideas and values differ from the author’s? These questions aren’t so important if we’re dealing with Harry/Snape slashfic on some obscure subreddit, but when it’s a flagship production distributed by Amazon, they are crucial.

In the aforementioned interview, Shippey says “Amazon can answer these questions [about the holes in the history of the Second Age] by inventing the answers, since Tolkien did not describe it. But it must not contradict anything which Tolkien did say. That’s what Amazon has to watch out for.” This seems a good principle. So far I haven’t seen anything that flagrantly violates it, though I am really not an expert on the Second Age, and if I could be bothered to wade through all the anti-fan vitriol, I’m sure some cases would turn up. I am also sure that such violations would be fairly trivial, since the Tolkien Estate had a veto on content.

Having said that, let’s look at the two things that seem to have annoyed the most people.

  1. Action-Hero Galadriel

I don’t know whether the people who object to the sassy, sword-swinging Galadriel of TROP are offended because she isn’t like the Galadriel of The Silmarillion or because she isn’t like the Galadriel of the Jackson films. She certainly isn’t like my idea of Galadriel, but that doesn’t actually bother me too much. While Tolkien says that Galadriel was the most powerful of the elves that remained in Middle-earth, I’d always viewed this as magical power; when she threw down the walls of Dol Guldur, I assume she didn’t use her fists. On the other hand, her powers would have been lesser at the start of the series, since she didn’t yet have her ring, and if even a wizard like Gandalf uses a sword from time to time, it doesn’t seem unreasonable that a magical elf who fights in wars would do the same. It’s worth noting, by the way, that the reason Galadriel didn’t join the war against Morgoth was that she didn’t think he could be defeated, not that she didn’t want to get her hands bloody.

In terms of casting, I have a slight preference for Cate Blanchett over Morfydd Clark, perhaps because her hair is closer to Tolkien’s description (and for those who think that’s trivial, remember her name comes from her hair; it means “maiden crowned with a garland of bright radiance”). I also think they go too far in portraying a young, impetuous Galadriel given that she is already a few thousand years old at this point. But these are minor quibbles; overall, I’m happy with TROP’s Galadriel so far.

  2. The Race Thing

NOTE: I am aware that “race” is a term with a lot of different definitions and that it is a pretty arbitrary concept; I just don’t want to go into that right now. Just bear with me when I use the circular definition: “a bunch of people that are generally regarded as a race.”

Whenever someone inserts non-white people into a fantasy setting outside some stereotyped role (ninja, jungle lord, desert nomad) there are howls of protest. Most of these howls are hypocritical; for example, none of the people I have seen complaining about Black hobbits as non-canonical also complain about White hobbits as non-canonical, yet Tolkien clearly states that hobbits are brown-skinned. We don’t know how brown, but we can be sure that Jackson’s lily-white hobbits are as uncanonical as the African-Eriadoreans of TROP. What often lies under this invocation of “canon” or “folklore” is a xenophobic fear of infiltration by other races that has its origin in real-world politics.

That said, I’m going to go out on a limb and say that TROP’s casting is at best poorly thought-out, and at worst, American cultural hegemony, albeit of a well-meaning liberal kind. Starting with Star Trek, American fantasy and SF worlds have increasingly tended to portray an idealised America. You have a society where most people are kind-of-white, but there are plenty of other races thrown in, and everyone gets along fine. While this is a vast improvement on the all-white fantasy and SF of earlier years, it has a number of problems.

The first is, of course, that America becomes the prototype for all other worlds. There seems to be an assumption that because America is a multi-racial society, imaginary societies should be multi-racial along the same lines. Of course history has other examples, notably the Roman and Hellenistic empires, but in the distant past societies tended to be fairly homogenous; people moved around the globe but generally did so as groups rather than individuals, and the genes of individuals and smaller groups eventually got lost in the gene pools they joined (e.g., the so-called “Hunnic” haplogroup has been found as far afield as Ireland, but you don’t see many Irish people who look like they’re from Central Asia).

The default in these fantasy Americas, though, is still White. This isn’t such a problem with Middle-earth, which is supposed to be the Europe of a distant mythical past, but when it’s applied across the board, it becomes tokenism. “We are a White society, but see how good we are – we also accept Brown and Black people!” There are of course some noble exceptions that reflect the appearance of humans worldwide rather than just the USA, but TROP doesn’t quite make it into that category.

Most importantly, people come in different shapes and colours for a reason. America became a multiracial society because of conquest, slavery and immigration, and it still contains identifiable racial types partly because these events were comparatively recent (an eye-blink compared to the history of Middle-earth) and partly because of racism, both informal and institutional. (When I was born, mixed-race marriages were still illegal in several US states.) If you want to insert different modern human races into the same Tolkienian “race”, as they do in TROP, you need a compelling reason why (a) they evolved differently in the first place, and (b) they did not merge completely over the centuries. All things being equal, a society as introverted as the dwarves of Khazad-Dum would have reached genetic near-homogeneity a long time ago (which of course does not mean that they would be White) and the Harfoots, a nomadic people who live in small groups shunning all outsiders, would be totally inbred by now (incidentally, these are the brownest of the hobbits according to Tolkien, so again, it’s the White ones you should be objecting to if you want to be canonical). As an aside, if you are writing science fiction, the opposite applies, since the default for a future society would be a glorious mish-mash of human variability with no easily identifiable racial groups; it is homogenous societies that would need some special explanation.

This Tatar girl might make a good elf

Now I’m not saying that you shouldn’t have dark-skinned elves. I’m saying that if you have them, they should be more than tokens dropped in to make a diversity quota. There are three major divisions and different sub-groups within Tolkien’s elves, and he didn’t describe all of them (or any of them in much detail). Since they not only got around quite a bit but also stayed apart from each other for a long time, there’s no reason why they shouldn’t look different. There again, why should these differences correspond to the most familiar racial types? Even in our world, there are plenty of ethnic groups who don’t fit into the arbitrary American categories of Black, Caucasian, Hispanic etc. – think of Tatars, who tend to be pale-skinned with blue or green eyes, and even sometimes blond hair, but have Central/East Asian facial features. (You can have a good giggle reading the comments on Quora when someone – almost always American – asks a question like “Is Dua Lipa White?” or “Are Turks White or Asian?”)

So all in all, they could have handled the race thing a lot better. That is not a reason to damn the series. Overall, TROP is enjoyable epic fantasy. It’s not the faithful rendition of the Second Age I’d have hoped for, and so far it lacks the moral depth of Tolkien, but neither is it the travesty I had feared. And it’s certainly nothing to get into a tizzy about.

The Boundaries of Fantasy

If you look at all the different things that are called fantasy — and I mean just in the context of fantasy literature, not sexual fantasies etc. — you start to wonder if there is anything that they all have in common or whether, in Wittgenstein’s words, they simply “form a family.” Fantasy is one of those things that gets harder to define the closer you look at it, like gender or — as Wittgenstein famously pointed out — games. In “Philosophy and Fantasy”, Laurence Gagnon gives us a definition which seems appealing at first sight: “any story might justifiably be called ‘a fantasy’ which gives us some explicit indication of the personality of one or more of the characters and which is also about a world that is conceivable but physically impossible.” However, there is the difficulty of saying exactly what is physically impossible; given what we know of physics, dragons are a far more likely possibility than faster-than-light travel, yet the latter is a staple of science fiction, not fantasy. Moreover, as Gagnon admits, the term “fantasy” is here used “in a very general way such that some writings called ‘fairy-tales’, some labeled ‘science-fiction’, and, perhaps, some designated ‘dream-stories’ will fall under the concept of fantasy.” Since Gagnon is interested in fantasy as a philosophical tool, that will do for him, but if we are interested in fantasy literature, we need something that will explain why The Lord of the Rings is definitely fantasy, The Day of the Triffids definitely isn’t and Star Wars and Twilight are on the fuzzy borders with SF and horror respectively. I recently saw Twilight described as “urban fantasy”, which is silly considering that it takes place in a village, but does at least note that fangs do not a horror film make. Twilight could fairly be described as low fantasy (i.e. a tale where fantastic elements are found in the normal world, as opposed to high fantasy, which has a world all of its own). But if that is true, then should we say the same of Dracula?

It could be that the distinction between fantasy and horror is of a different kind than the distinction between fantasy and science fiction. Horror is like comedy or pornography, in that it is a genre defined by the feelings it is designed to arouse, whereas fantasy, like westerns, is defined by the kind of things it describes. That is why such disparate creations as Saw and The Omen can both be called horror films, and why when you reduce the scare quotient in a lot of so-called horror, you see that it is fantasy or SF. (Of course there are people who are genuinely scared of the vampires in Twilight, but they’re just wusses.) The categories of fantasy and horror overlap, not because of Wittgensteinian vagueness, but because they should. If you have an overlap between the sets of plants and animals, you assume that the concepts “plant” and “animal” are a bit fuzzy, but there is nothing surprising about an overlap between the set of plants and the set of edible things.

Coming to the more notorious overlap between fantasy and science fiction, we therefore need to ask which kind of overlap it is: is it a plant/animal or a plant/edible overlap? Both the fantasy and science fiction genres are defined largely in terms of what they describe, and both involve describing things which we are fairly sure do not exist and have never existed. They are also the kind of things which not only do not exist but would surprise us if they were to exist. If a connoisseur of nineteenth-century fiction were to read in the Times Literary Supplement that Madame Bovary was actually a real person, he might put down his teacup and murmur “Well I never!” This is probably not how we would react if it were proved that Sauron was a real person.

Both fantasy and science fiction, then, deal with things that make us go “wow!” They are “astounding tales,” and in this respect, the genres are also a little like horror, in that their definition includes the feelings they are designed to evoke. A novel set in a world which was exactly the same as ours with the addition of toast that always falls with the buttered side up would fit Gagnon’s definition of fantasy, but would not be fantastic; neither would it make for interesting science fiction. But is the “wow” of fantasy the same as the “wow” of science fiction? If that were the case, fantasy would be decidedly less impressive, as Ryan Somma argues in a fictitious dialogue between “fanboy” and “scientist”: for every impressive fantasy creature, device or journey, science fiction has something bigger, stronger, faster or whatever. Shadowfax may carry Gandalf faster than any horse, but that’s still well below the speed of light … or even the speed of a family car. But this is not how it works: the “wow” of fantasy is subtly different from the “wow” of SF. As I said, dragons are a much more feasible proposition than faster-than-light travel, but dragons strike us as more magical and mysterious.

Let us imagine, then, a science fictional account of dragons (something Anne McCaffrey comes close to in the Pern books). Someone, somewhere, messes with the genes of birds to make them very big, featherless and scaly (in other words, to make dinosaurs). Then they work on the digestive system so that the creature produces methane which can then be ignited in its mouth. Voila, a dragon, which can then make the story interesting by escaping and laying waste to cities. We’re talking something between Jurassic Park and Godzilla here.

This would make passable, if rather unoriginal, science fiction, but despite the presence of dragons it definitely wouldn’t be fantasy. The fact that the dragons’ genesis is explained identifies it as SF, but this is not the most important point; it is a side-effect of an essential feature of science fiction, which is that it follows, or at least claims to follow, the rules of our universe. It may bend them, as with FTL travel or telepathy, but it cannot flout them. If an SF novel has spaceships travelling faster than light, it doesn’t give a satisfactory explanation of how they do it; anyone who could provide one would already have a Nobel prize. It may have explanations of a kind (“tachyon drives”, “wormholes” etc.) but this is just a way of saying “This is happening in our universe, according to the laws of that universe.” It is most definitely not saying “Faster than light travel is physically impossible, but our hero can do it because he has a magic spaceship.” That would be fantasy. What fantasy does is not to bend or even flout the rules; it says “The rules here are different.” Not only are we not in Kansas any more, we aren’t even in hyper-Kansas. This may be what makes the “wow” of fantasy different from the “wow” of SF. When Shadowfax gallops at the speed of a Citroen, we aren’t saying “Wow, that’s fast!” We’re saying “Wow, a magic horse!”

This makes sense for high fantasy, where the author has created a world that has different rules from ours. But what about low fantasy, where strange things happen in our world? I think there are two ways to look at this. One is that our world is basically the world as we know it, but beings from other worlds have entered it (low fantasy is also tellingly called “intrusive fantasy”). In this view, low fantasy is like portal fantasy — where our heroes enter another world through some kind of gateway, as in The Lion, the Witch and the Wardrobe — but in reverse. Buffy the Vampire Slayer uses this idea, with the Hellmouth being a kind of portal allowing various kinds of nasties to congregate in an otherwise normal American town. In other types of low fantasy, however, the supernatural creatures are very much part of our world. Here the “wow” factor comes not from the idea that there are magical worlds, or that magical creatures can enter our world, but that our world is itself magical, and we just need to wake up to the fact. As Vampire Bill puts it: “You think that it’s not magic that keeps you alive? Just ’cause you understand the mechanics of how something works, doesn’t make it any less of a miracle … which is just another word for magic. We’re all kept alive by magic, Sookie. My magic’s just a little different from yours, that’s all.”

If it is true that what makes fantasy is the idea of different rules, then that would explain why Star Wars sits so uncomfortably (but effectively) on the fence between fantasy and SF. It has all the trappings of space opera, but we are in no doubt that we are being told a fairy tale. When we see those words “A long time ago, in a galaxy far, far away …” we don’t think “Hang on, all galaxies are far away. I mean the nearest galaxy to us is Andromeda, and that’s 2,500,000 light years away.” What we think is “Once upon a time …” and what we understand is “The rules are different here.”


Magic Numbers #3: 10,000

I have an ambivalent relationship with Malcolm Gladwell (a relationship of which he is naturally unaware). On the one hand, I love his ability to seize on apparently insignificant details (Goliath’s myopia, varieties of spaghetti sauce, the Norden bombsight) and draw interesting conclusions from them. On the other hand, it’s not a good thing that such a skillful writer and charismatic speaker manages to get things wrong in ways that a little critical reading of the data could have prevented. Gladwell attracted some criticism for his attribution of New York’s falling crime rate to Giuliani’s “broken windows” policing when in fact New York’s crime rate fell largely because crime across the whole of the developed world was falling. (To his credit, Gladwell admitted he “oversold” the idea.) What intrigued me more was that his famous “10,000 hour rule” was deflated by Anders Ericsson, the very person he got the idea from.

Gladwell at Pop!Tech 2008 (photo by Kris Krüg)

As a naturally lazy person, I was inclined to be skeptical of the 10,000 hour rule as soon as I heard about it. I was also reminded of my time as a music student. Gladwell’s claim was based on research by Anders Ericsson, Ralf Krampe and Clemens Tesch-Römer that found the best students at a music school in Berlin had, on average, put in 10,000 hours of practice by the time they were twenty. What I observed, though, was that beyond a certain point, the amount of practice my fellow students put in didn’t correlate particularly strongly with their performance. Some people were just good, and only had to practice enough to stop their technique from getting rusty. Some people could practice all day and would never be more than competent, because actually being a great musician isn’t primarily about technique; it’s about feeling. (I’m speaking here as a less-than-great musician; as my teacher at the time put it, “You’re playing virtuoso material, but you’re not a virtuoso yet.” Ironically, I was one of the ones who might have benefited from those 10,000 hours of practice.)

Only 9,998 hours to go.

Another person I admire (but also take with a grain of salt) is Tim Ferriss. In The Four-Hour Chef, he points out the problems with the idea that 10,000 hours of practice are necessary to master any skill. Firstly, we can’t say anything on the subject without a clear idea of what we mean by “master”. Is your standard for mastering golf the best player at your local course, a national champion, or Tiger Woods? (Ferriss takes being in the top 20% world-wide, which I think is reasonable.) Secondly, the amount of effort, practice or talent necessary to master a skill varies according to the skill being practised. As Anders Ericsson himself points out, “Steve Faloon, the subject of an early experiment on improving memory, became better at memorizing strings of digits than any other person in history after only about two hundred hours of practice.” At a more modest level, I’ve used YouTube to relearn several skills, from peeling a banana to tying my shoelaces. I can say with some confidence that I’ve mastered them (except for folding fitted sheets — I’ve got a way to go there) but it certainly didn’t take 10,000 hours of practice. Usually it didn’t even take one.

Most importantly, Ferriss queries the cause-effect relationship. Remember that the data come from intensely competitive fields, as Ericsson says: “The reason that you must put in ten thousand or more hours of practice to become one of the world’s best violinists or chess players or golfers is that the people you are being compared to or competing with have themselves put in ten thousand or more hours of practice.” But as Ferriss notes, if you’re in a highly competitive field where everyone is practising like crazy, you are likely to practise like crazy too, regardless of how much practice you actually need to do. Maybe it’s not just that practice makes perfect but also that perfectionism makes you practice.

So why did the “10,000 hour rule” become popular? Firstly, people immediately ignored what Gladwell actually wrote and assumed that 10,000 hours of practice was not only necessary to master a skill but was also sufficient. If you want to become a brilliant painter, computer scientist, athlete or opera singer, just put in 10,000 hours of practice and you’ll get there. Naturally this can-do mentality is popular, but a moment’s thought will reveal that it is absurd. For those who think enough practice is all it takes to be an opera singer, I suggest watching Florence Foster Jenkins. As well as natural talent, circumstances determine not only whether 10,000 hours of practice produce the intended results but whether it is possible to put them in at all. Although Chapter 2 of Outliers is called “The 10,000 Hour Rule”, the examples Gladwell chose are also intended to illustrate the importance of luck: Bill Joy and Bill Gates both had access to computers at a time when they were scarce and the industry was taking off; the Beatles were lucky enough to get a regular gig in Hamburg.

Only another 99,998 hours to go

Misplaced optimism aside, another reason for the popularity of the 10,000 hour rule is that 10,000 is a magic number. It’s big, but not impossibly big. Think about the Duke of York — not the one who likes teenage girls, but the grand old one with ten thousand men. If he’d marched ten men up to the top of the hill and down again, that would be unremarkable. If he’d marched a million men, it would have been absurd. Ten thousand men sounds about right. It’s the same with practice: we like to think in orders of magnitude. One thousand hours sounds like something anyone could do — only about 6 months of full-time study. 100,000 hours sounds impossible — more like fifty years of full-time study, which is Zen-level mastery. 10,000 sounds just right, as Goldilocks might say.

Of course the thing you apply this magic number to has itself to be in that order of magnitude. People are fond of telling us how many glasses of water we should drink a day, but 10,000 obviously isn’t going to work. On the other hand, 10,000 paces a day sounds just right. “Sounds” is the key word, though. The 10,000 figure comes from the father of the first popular pedometer, Y. Hatano, who called it manpo-kei, meaning “10,000 steps meter”. He published research to support the figure (coincidentally in 1993, the same year that Ericsson’s formulation of the 10,000 hour rule was published) but nobody really knows how many paces you should walk a day for optimal health; most recently, a range of 7–8,000 has been suggested. Whatever the science behind it, the product was taken up and marketed by Yamasa Tokei Keiki, and the 10,000 figure stuck because it sounds cool in Japanese.

But Japanese, like several languages, has a single word for 10,000 (man), and thus does not lead us to expect precision. After all, when the Chinese classics talk about “the ten thousand things” (wan wu), they do not mean that there are actually that number of things in the world. In contrast, when we write “10,000”, those zeros give the number a misleading scientific aura. Sometimes it is more accurate to be less precise. The next time you hear someone say “ten thousand”, just translate it as “oodles”, or “a bazillion”.

Harry Potter and the Spitting Haredim:

How religion can make us moral or submoral

Note: this is a mash-up of two posts that originally appeared on livejournal.com in 2012.

I hate forms that ask you to tick a box for your religion. Apart from the fact that there’s never a box for “Wittgensteinian fideist”, or the fact that putting people into boxes according to religion is a first step to putting them in concentration camps, focussing on what religion people are distracts from what I think is really important, which is what their religion does. My militant atheist friends would say that is simple: religion makes people stupid and obedient at best and turns them into crazed killers at worst. But we could say the same about money, and that is hardly the essence of what money does. Like money, religion is a kind of universal motivator: it can make us moral, immoral or — and this is what really interests me — submoral.

Not In Harry’s Name: How Religion Makes Us Moral

I once quipped that religious wars were like The Lord of the Rings fans beating up the Harry Potter fans. Glib though that may be, I think examining the relationship between religion and fantasy may help us understand both of them a bit better.

Some years ago, my attention was caught by a picture of a chocolate bar bearing the slogan “Not In Harry’s Name.” It turned out to come from a successful campaign to get Warner Brothers to only use Fair Trade chocolate in their Harry Potter merchandising. As a result, I found myself joining the Harry Potter Alliance, “an army of fans, activists, nerdfighters, teenagers, wizards and muggles dedicated to fighting for social justice with the greatest weapon we have — love.” Sweet, aren’t they? Believe it or not, the HPA can give us an insight into religion, and specifically how religion encourages moral behaviour.

If you’re not a whack-job fundamentalist who thinks Harry Potter is a Satanist, you probably don’t think that the Harry Potter books promote any kind of religion, and you would be right. But the books, while not being Holy Scripture, operate in a similar way to religion. In fact, it’s tempting to say that they keep the good parts of religion while getting rid of the bad bits, like terrorism and child abuse, but that might be taking the argument too far.

Let us think of religion as a combination of three essential elements. The first is a moral vision. Every religion has some idea of what human beings are ideally like, including their relationship to each other and to the non-human world (Nature plus any gods, spirits etc. you may happen to believe in). Secondly, it has a set of practices which are thought to be helpful in realising that vision: prayer, fasting, meditation, holy war, church jumble sales etc. Finally, it has what I call its mythos. I originally used the term “supportive fantasy”, coined by Pete Carroll to refer to a magical belief that, regardless of whether you think it is literally true, is there to help produce a result. However, this risks confusing religion with literary fantasy, which relies on the convention that both writer and reader regard the work as not only untrue, but impossible. Religious beliefs, on the other hand, are regarded by their believers as true — literally if you’re a fundamentalist, figuratively if you incline more to liberal theology. Nevertheless, what makes them religion rather than poetry or bad science is their function, not their truth value. Take the role of Buddhism and Taoism in China, for example. Buddhists hold that life is full of suffering, and you’ll have to repeat it in countless incarnations unless you curb your desires, live a blameless life and meditate a lot. Taoists believe that life is just dandy, so to attain immortality (or at least longevity) you should curb your desires, live a blameless life and meditate a lot. Hmm.

Leaving aside the question of whether any particular religious belief is true in the sense that Boyle’s Law is true, it seems clear that when religion works well, it is like the Harry Potter Alliance on steroids, or whatever illegal performance enhancers kids at Hogwarts take. The HPA uses a popular fantasy as a way of creating a sense of community, providing fictional role models and generally motivating people to do good. Just imagine how much more powerful that would be if people not only enjoyed the fantasy but believed Harry Potter was a real person. OK, they would be stark raving bonkers, but they would be a potent force for good (so long as they could conceal the fact that they were stark raving bonkers).

Is religion, then, a kind of controlled insanity which — when it is not doing indescribable evil — can be harnessed as a force for good? Not quite, and not just because a belief in gods or spirits is not as obviously nutty as a belief in, say, horcruxes. I would say it was rather more like a placebo. You take the big red pill that you believe is a powerful medicine, so you get better. The pill is not a real medicine because you only get well because of the placebo effect. But if the placebo effect means you get better because you took the big red pill, then the pill really is medicine. And the fact that it is big and red is important; studies show that big red pills work better than small blue pills. It may sound like I’m just playing with words here, but I think there’s an analogy with religion. (And more than an analogy with magic; for all practical purposes, the placebo effect is magic.) If my faith in some god lets me work miracles, then is it justified? The pill-as-object and the pill-as-healing-agent are different, and we believe in them in different ways, or as Wittgenstein might put it, in different language games.

This leads us to the problem with liberal theology. While it is better than illiberal theology (largely because it doesn’t kill people) it is weaker. Kelly McGonigal informs me that you can be cured by a placebo even if you know it’s a placebo, but I would assume the effect would be less potent. This doesn’t seem to apply so much to fantasy/religion because, as we have seen, people can still be motivated by the Harry Potter books even though they don’t believe in their literal truth, but liberal religion still packs much less of a punch than literalism. It’s the difference between saying “Well Harry Potter represents some noble qualities of the human soul, such as courage, compassion and a sense of justice, so it’s ironic that people are using him to sell chocolate produced in an exploitative way” and “Harry Potter is real and HE’S REALLY ANGRY with Warner Brothers!” The first one is just so C. of E. The best we can hope for, I suppose, is something like this headline from the HPA website:

DID YOU EVER WISH THAT HARRY POTTER WAS REAL? WELL IT KIND OF IS.

Spitting On Schoolgirls: How Religion Makes Us Submoral

In 2011, an eight-year-old Jewish girl, Na’ama Margolese, was shouted at and spat on while entering her primary school. Not just once, but every day. The mob hurling abuse at her were not Neo-Nazis; they were Haredim — not a nation of Middle-earth but Ultra-Orthodox Jews. They thought she was immodestly dressed, even though the girl was herself from an Orthodox family, this was a religious school, and her dress would make Amish girls look like sluts. I am used to the idea that certain people of a religious persuasion are overly concerned with how much skin their neighbours reveal (motes in eyes and all that) but this was so extreme as to be not just comic but creepy. We are talking here about an eight-year-old girl being castigated as a whore. Does that mean Ultra-Orthodox Jews are a bunch of paedophiles? I think not, but in that case, why do they demand that little children cover themselves up as though they could wreck marriages just by hanging around the school gates?

We have seen, thanks to Harry Potter, how religion can make us more moral. A brief look at history can also show us how religion makes us capable of monstrosities in the name of morality. But this is something different. The people spitting on schoolgirls are not, I think, in the mold of Torquemada, who tortured and killed from a fervent moral conviction. This seems to be more submoral, a word which has been used in various ways, but which I take to mean the following: a submoral person is one who, while having moral intuitions and being capable of moral reasoning, elects to let them atrophy in favour of a set of quasi-moral principles for behaviour. Of course we all do this a lot of the time because thought is hard; we might even argue that it is the normal state of tradition-bound societies. What is interesting here is that the people concerned have thought very carefully about their religion; they are just refusing to reason about the moral basis of their actions. What we have is religious kitsch, by which I mean not plastic Virgin Mary table lamps or glow-in-the-dark crucifixes but a certain attitude to religion which is in a way a failed version of the Harry Potter Alliance. To explain this, we need three tools: moral reasoning and intuition, the idea of kitsch, and the view of religion I put forward earlier.

Philosophers and neurologists may debate endlessly about the nature and validity of moral intuitions, but it is plain that nearly all of us have them; not to have feelings that certain things are right or wrong is pathological. We react at a gut level against murder, incest, and torturing cute puppies. We also have moral reasoning, by which we argue from general principles to specific cases and strive (usually unsuccessfully) for consistency in our moral judgments. This is what enables us to decide that gay people have rights even if we may personally find the idea of gay sex totally icky, or that it is as bad to torture your enemies as it is to torture cute puppies. Again, everyone has this capacity, albeit in varying degrees. Both moral intuition and moral reasoning can be wrong, but we are generally better off with them. To be submoral, then, is to refuse to use both of these moral faculties. The submoral person may even do the right thing, but only by chance, because the particular moral code they have adopted (for non-moral reasons) happens to prescribe it.

How kitsch applies to morality and religion is less straightforward. I’m using it in a broader sense than just tacky art, of course. I actually started thinking about it in this wider sense while proof-reading a book written by my friend Ulrich Steinvorth. Steinvorth examines the idea of kitsch put forward by Milan Kundera in The Unbearable Lightness of Being, in a long passage which I’d skimmed over at the time because I find polemics in novels irritating, and besides, I was more interested in seeing how far things would go with Tereza and Sabina. The key idea is where Sabina describes the reaction to a sentimental painting:

Kitsch causes two tears to flow in quick succession. The first tear says: How nice to see children running on the grass! The second tear says: How nice to be moved, together with all mankind, by children running on the grass! It is the second tear that makes kitsch kitsch.

Steinvorth expands this idea:

First, we may wonder why the picture of fluffy kittens or a sunset is kitsch while real fluffy kittens or a sunset that look exactly like the pictures are not. Similarly, a Gothic cathedral is often great art, but the same cathedral rebuilt in our time is felt as kitschy. The reason is the picture or copy is made to trigger not so much a first emotion as a second one that indulges in our agreement with what we consider all mankind’s love of kittens or the love of Gothic cathedrals by all people of our ilk.

(draft of The Metaphysics of Modernity. What Makes Societies Thrive)

Kitsch is a lot more than tacky art. Steinvorth argues that when we descend into kitsch, we stop doing things for their own sake (which is the theme of his book) and start doing them for our sake. When we go “awww” at a fluffy kitten, our focus is on the kitten; when we hang a picture of fluffy kittens on the wall, our focus is on ourselves. We are not saying “Look, a fluffy kitten!” but “Look, a person who thinks fluffy kittens are adorable!”

To return momentarily to Hogwarts, the epitome of kitsch is Dolores Umbridge, with her cat-themed china collection. And as J.K. Rowling herself points out, “a taste for the ineffably twee can go hand-in-hand with a distinctly uncharitable outlook on the world.”

To apply this to religion, let’s recap the pragmatic view I described earlier: a religion combines a vision of what people should be, a set of practices designed to bring us closer to this, and a mythos; i.e., a system of beliefs which motivate and provide meaning to the first two elements. When it works like this, whether in its liberal or literalist forms, the practice of religion may often be wrong, but it is not kitsch. The focus is on the belief, but with the aim of becoming a better person, however the religion defines “better”. Torquemada may have been totally evil and depraved, but he was not in the least kitschy; his problem lay in his view that torturing people over theological niceties fell within the parameters of being a good person. Religious kitsch leads to more mundane, but much more widespread, badness; it is what we could call “religiosity”.

Artistic kitsch focuses on the feeling of satisfaction that we get from having certain feelings, which is what leads to its disregard of aesthetic standards. Similarly, religious kitsch moves the focus from moral or spiritual behaviour to the feeling of satisfaction we get at feeling like a moral or spiritual person. The Voodoo syncretist who puts a plastic Virgin Mary lamp on their altar because they think it works magic is not being kitschy; the good Catholic who puts it on their bedside table may well be. The lamp says “See, I am a good Catholic who loves the Virgin Mary!” but this is not mere show because we say it to ourselves as much as to others. We all do this to some extent, but when it becomes the main focus, then we become submoral, because we have abandoned moral reasoning, and even perhaps moral intuition, in favour of feeling moral about being moral.

This is why people can spit at schoolgirls. They probably are not demented Torquemada types who, after consulting with their conscience, really, truly think an eight-year-old is the Whore of Babylon. They are simply being spiritually kitschy.

Magic Numbers #2: Dunbar’s Numbar

There is a phenomenon in anthropology known as Dunbar’s number, after its originator, Robin Dunbar, who speculated that there was a correlation between neocortex size and the maximum number of regular social contacts that a primate could maintain. Since social contact in primates is maintained primarily by grooming, I would have thought the crucial variable would be manual dexterity rather than neocortex size, but then I’m not a primatologist, so what do I know? Anyway, the idea caught on like lice on an ungroomed primate, and Dunbar proposed a number for humans based on the data from other primates. This number is 150, which sounds like a reasonable estimate. I mean, could you handle more than 150 friends on Facebook or whatever? (This, by the way, is the reason why so many IT movers and shakers are interested in Dunbar.) There again, could you handle more than 100? There’s the problem: Dunbar’s number is actually 148, with a 95% confidence interval of 100 to 230. So we can predict with a fair degree of confidence that if a group grows to have a hundred members, it will either start to have trouble cohering and begin to fragment, or continue growing to as much as double its current size.

In other words, Dunbar’s number tells us nothing that common sense doesn’t. Meanwhile, other anthropologists have come up with some different numbers: Russell Bernard and Peter Killworth reckon the maximum could be a hefty 230 or 290 (depending on whether you take the median or the mean). But, as Wikipedia notes, “the Bernard-Killworth number has not been popularized as widely as Dunbar’s,” despite its being replicated in a variety of studies. To explain this, I propose Dunbar’s Law: “Where there are two hypotheses to explain the same data, the one with the cooler name will be adopted.” “Dunbar’s number” beats “the Bernard-Killworth number” by sheer assonance.

Magic Numbers #1: The Tragic Yet Inspiring Story of Vilfredo Pareto

With those wild staring eyes and bushy beard, he just had to be a visionary.

Pareto’s Law, Pareto’s Principle or, more popularly, the 80/20 rule, is all over the place these days, yet few give much thought to the man who created it, Vilfredo Pareto. This is probably just as well. Pareto was a classic case of a scientist who stumbles across an interesting fact, elevates it to a universal law, speculates increasingly wildly, and eventually descends into absurdity. Pareto started with the rather obvious observation that 80% of the land in Italy was owned by 20% of the population. He then studied income distribution in a few different countries and found the same ratio. This discovery prompted him to propose the 80/20 ratio as an immutable law no different from the laws of physics, implying that well-meaning attempts to reduce economic inequality were doomed to failure. Thus armed, he embarked on a crusade for extreme laissez-faire economics before becoming infatuated with Mussolini and generally disgracing himself.

Even Pareto’s one great idea is not as universal as one might think; after all, the 19th-century Western societies (Italy, Britain, Prussia, Saxony, Ireland, and Peru) he studied were not remarkably different from each other, and I imagine he would have found a very different pattern of wealth distribution if he’d studied the Kalahari Bushmen, for example. Even in modern capitalist societies, inequality is not a constant: in the USA the top 20% did indeed own 80% of the wealth in 1983, but by 2010 it was over 95%. That’s a big difference if you’re in the bottom 80%.

Given the status of economics as “the dismal science”, it’s not surprising that Pareto’s spectacular real-world failure should make him revered among economists, but why has he been so influential elsewhere? As it happens, the ubiquity of Pareto’s 80/20 ratio comes down to a fellow called Joseph Juran, who noticed that even if it wasn’t an immutable economic law, it was a really handy way to say that for x and y, a little x gets you a lot of y. 80% of sales are generated by 20% of customers. 80% of everyday speech uses only 20% of a language’s vocabulary. 20% of workers do 80% of the work, and everyone counts themselves as part of the 20%. Tim Ferriss has also done great things with this magic number in his “4-Hour” series of books (The 4-Hour Workweek, The 4-Hour Body and The 4-Hour Chef). The important thing is that it’s a magic number. This is inspiration, not hard science, so don’t quibble about whether the real ratio is 80/20 or 70/30 or 95/5.
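If you do feel like quibbling, a toy model shows just how un-magical the number is. Here is a minimal sketch (assuming an idealised Pareto, i.e. power-law, wealth distribution; the function name and the alpha values are mine, purely for illustration) of how the share held by the top 20% swings with the distribution’s shape parameter:

```python
# A toy illustration, not real economics: for an idealised Pareto (power-law)
# wealth distribution, the share of total wealth held by the richest fraction p
# follows from the Lorenz curve and depends entirely on the shape parameter alpha.
def top_share(p: float, alpha: float) -> float:
    """Share of total wealth held by the richest fraction p (requires alpha > 1)."""
    return p ** ((alpha - 1) / alpha)

if __name__ == "__main__":
    for alpha in (1.05, 1.16, 1.5, 2.0):
        print(f"alpha = {alpha}: top 20% hold {top_share(0.2, alpha):.0%} of the wealth")
    # alpha = 1.16 reproduces the classic 80/20 split; the other values give
    # anything from roughly 93% down to 45%.
```

Only one particular value of the parameter gives the famous 80/20; nudge it a little and you get 70/30 or 95/5 instead, which is rather the point.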

Attack of the MOMS

Or: Why the Malaise of Modern Society is a Myth

[This article is compiled from a series of posts I wrote on LiveJournal in 2009. To those who say it sounds like recycled Steven Pinker, I’d like to point out that it was written well before The Better Angels of Our Nature came out, let alone Enlightenment Now, but I’m reprinting it (with a few minor edits) because it’s part of the same debate.]

Part 1: The Non-existent Crime Wave

In the past few days I’ve read around sixty exam papers dealing with the question of restorative versus retributive justice. Of these, I’d guess around twenty start off with a sentence like “All over the world, crime rates are soaring.” This piqued my curiosity. All of these students are studying social sciences, so we might expect them to know that in most developed countries, crime rates, while continually fluctuating, have in general fallen over the last two decades (crime rates in Third World countries vary wildly because there are so many factors involved, from endemic corruption to civil war). I am not saying that this is something to be jubilant about: the US homicide rate is still higher than it was in 1960 while in some European countries, crime in general has fallen but violent crime has risen. In Japan, street crime is now widespread; a common scenario is for an elderly person to approach a group of street-toughs to ask for directions, only to find that they give him the wrong directions. Moreover, reasons for the fall are obscure; even decreased lead levels in the atmosphere have been credited. But whatever the reasons, one thing is clear: crime rates are not soaring. The interesting part is why people believe that they are.

The simplest explanation is just that it takes a while for information to spread; by the time most people have noticed the fall in crime, crime will probably have started rising again. However, I shall lay Occam’s razor to one side for a moment in order to contemplate another hypothesis, which I call the MOMS syndrome, MOMS here standing for “malaise of modern society”. The idea that crime is increasing is attractive not just because for a while it did increase, but because increasing crime is part of the MOMS: modern society has a high crime rate because modern society is fundamentally flawed. You can choose one or more of many aspects of the malaise to explain crime: decline in religious belief, rampant consumerism, single mothers … take your pick. Any of these can be pulled in to say why, for about thirty years, crime rose to almost nineteenth-century levels. Ah yes, that’s the problem. The murder rate in Britain was 1.7 per 100,000 in 1850. By 1900, that had dropped to 0.8, whence it fell slowly to an all-time low of 0.7. It’s true that it then climbed quickly to a high of 1.4 in 1990, but — and this is the part people forget — it then started to fall again. (Figures for the USA are similar but higher overall, and have a spike around 1920–1930 because of Prohibition.)
Furthermore, when we look at crime on the basis of centuries rather than decades, crime is not soaring but plummeting (if you can talk about something plummeting over centuries). As I mentioned previously, the murder rate in thirteenth-century England was 20 per 100,000, which is around four times what it was in 1700 and fourteen times the last peak of 1990. Whatever disadvantages modernity may have brought in its wake, crime is not one of them.

Part 2: Diseases of Affluence — Why Cholesterol is Better Than Cholera

Having established that modern society not only is not more violent than traditional societies, but is actually much less violent, I would like to talk a bit about health. I’ve always been a fellow-traveller of the alternative health movement. The blame (or credit) for this goes partially to my parents and even more so to my grandfather (an antiquarian book dealer), who exposed me to books written during an earlier wave of alternative health movements: the first yogis, vegetarians, hikers, carrot juice addicts, naturists, practitioners of eurhythmics and advocates of perfect eyesight without glasses. Add my hippie friends in the 1970s and you have wholefoods, t’ai chi, orgone accumulators and all the other holistic razzamatazz. These days I’m in Turkey watching the process of health anarchy repeat itself, and jumping in with a peculiar mixture of nostalgia and scepticism. Anyway, from all of this, you’d think I’d be highly critical of modern society with its “diseases of affluence” and modern medicine with its magic bullets, invasive surgery and other military metaphors.

In fact, I’m not. I admit that there are diseases of affluence. Americans raised on junk food are in just as bad a position as eighteenth-century aristocrats with their gout brought on by pheasants and port, or Roman patricians suffering the ill-effects of gorging on larks’ tongues in aspic. And that, dear readers, is the wonderful thing. At the beginning of the twenty-first century, in the developed world, those considered poor have diseases that for most of history were limited to a privileged few. Now I admit that this in itself would not be a powerful argument in favour of modernity, but bear with me. Millions of people in rich countries suffer from obesity, heart disease and so forth because of their diet and lifestyle. Given that they have the alternative of living healthily, this is obviously not good, but let’s not forget two things. First, they do have the alternative of living more healthily. Second, they have these diseases of affluence because they are affluent, and believe me, diseases of affluence are better than diseases of poverty. I live in a country on the fringes of the developed world, which means it still has some pretty undeveloped bits. I know people from these undeveloped bits, and the first thing I notice about them is that they’re short, they look older than they really are, and a lot of them aren’t too bright. There are several reasons for this, but an important one is childhood malnutrition. Peasants on the Aegean coast of Turkey generally enjoy good health because the geography favours fruit, vegetables, olives and fish; the same goes for the Black Sea coast, which is the world’s main producer of hazelnuts (and also has plenty of fish), but move into central Anatolia and we’re talking bread, onions, the odd legume, bread, meat on a good day, and more bread. This is also how most of Europe lived until recently.

Of course, there are many things wrong with the modern Western diet. However, we are making the mistake of comparing what the average person eats today with what a highly fortunate person (e.g. an Aegean peasant) ate a century or two ago. If we compare the diet of an industrial worker in Manchester today with that of an industrial worker in Manchester in 1850, then I think we have to admit that the evils of saturated fats, sugar, salt and esoteric food additives pale into insignificance when compared to a diet of bread and gin. And speaking of additives, we shouldn’t forget that two hundred years ago, people put chalk in flour to make it look whiter and sulphuric acid in beer to give it more of a tang.

Modern diseases may be bad, but would you really want to swap them for pre-modern diseases? I’ll admit that the onset of modernity — i.e., the industrial revolution — brought terrible diseases in its wake, such as cholera, or the influenza epidemic of 1918 that killed more people than the war that preceded it. But really, we’re over that now. Nowadays, people panic when a few hundred deaths happen as a result of some new kind of flu. If less than a thousand deaths is news, times are good.

Finally, there’s the fact that although many alternative therapies and lifestyle practices are based on traditional methods, the phenomenon as a whole is recent. Using Turkey as an example again, there have always been traditional spiritual healers, bonesetters and herbalists here, plus a wealth of health-related folklore. But until comparatively recently, that was all there was. Then along came modern medicine, first restricted to the urban elite, but now available throughout the country. Whatever the disadvantages of modern medicine, you have to admit it’s good for some things, such as painless, infection-free surgery, vaccinations, antibiotics and, in general, stopping people dying. Now, thanks to globalisation, Turks not only have modern medicine, but also the traditional therapies of other countries (plus traditional-modern hybrids), and are enthusiastically embracing yoga, acupuncture, reiki and Pilates. We could argue about whether this is a modern or a postmodern phenomenon, but whatever it is, it’s new.

Part 3: The Pursuit of Unhappiness

This leaves the big question: are we happier? And if so, is it real happiness or some Brave New Worldly pseudo-happiness?

Both questions are important, because those who believe in the Malaise of Modern Society argue both sides: some say that we are less happy than our ancestors; others lament the fact that our lives are so comfortable we have lost our sense of tragedy and succumbed to the anodyne, superflat happiness of contented pigs when we should be discontented Socrateses. Let us examine these one at a time.

It is hard to say whether we are more or less happy than our ancestors, given that in the Middle Ages, no one was wandering around with clipboards and microphones asking members of the public how happy they were. In fact, the sum of human happiness — meaning the ordinary happiness of ordinary people, not the beatitude of a few saints — wasn’t even much of an issue until the nineteenth century. My guess is that people were pretty happy in the Paleolithic era since they were living in the kind of environment they had evolved for, but even then, life probably wasn’t completely untroubled: watching your kids getting eaten by a sabre-toothed tiger can’t be much fun. I would also guess that people in nineteenth-century Europe were somewhat less happy than we are, but this is only a guess, based on the fact that the nineteenth century had a lot of the things that generally make people unhappy, such as poverty, high infant mortality, cholera etc. What we do know is that most people in developed countries today think of themselves as happy. We could tell them that they’re lying and in fact they’re totally miserable, but what would this achieve? You could just as well tell someone with depression that deep down, they’re positively joyful.

Speaking of depression, I am a little suspicious of all the talk about an “epidemic” of depression. Again, I’m not sure, but I suspect that this apparent mushrooming of misery is largely due to two factors: one, depression is now recognised as an illness, so people go to their doctor and get diagnosed with it; two, suicide carries less stigma in Western societies than it did in the past, so it is more likely to be reported as such. On the other hand, I have to admit that loneliness is a major factor in depression, and modern societies provide more opportunities to be lonely than, say, a medieval village. Try feeling lonely when you have to share a bed with three siblings. It is also possible for a society to have both a high rate of depression and a high average level of happiness (hence the famous Scandinavian suicide paradox). We should also not forget that there is considerable variation between developed nations: “Around a quarter of British people, and more than a quarter of Americans, experience mental problems in any given year, compared with fewer than 10 per cent in Japan, Germany, Sweden and Italy.”
So really, we can’t know for sure, but I’d still say that your chances of happiness are higher in modern society. If someone asked me if I’d be happier living at a time before universal suffrage, the welfare state, antibiotics, sexual freedom and painless dentistry, it wouldn’t take me long to make up my mind. For this reason, I’ll go with the assumption that we are, on the whole, at least as happy as, and probably a little happier than, our ancestors.

Part 4: My Rich Inner Life, Your Spiritual Malaise

What, then, of the objection that our supposed happiness is a fake? We’ve seen that telling someone that they don’t feel happy is absurd (unless we are simply accusing them of lying). However, it may not be absurd to tell them that even though they feel happy, this doesn’t mean that they really are happy. Certainly a fair number of philosophers would do just that, because the distinction between happiness as a feeling and happiness as living well has been around since Aristotle. There is plenty of empirical evidence to tell us that most people seem pretty happy, but it’s mainly based on variations on two types of question: “How often do you feel happy?” and “How happy are you with your life in general?” People could answer positively just because they have low standards. A heroin addict feels happy and may well be contented with their life so long as they have a reliable supply of heroin, but this is obviously not the kind of life we would recommend, and most people would not describe this state as “true happiness”.

The MOMS argument is that we have substituted feeling good for living well (or if you prefer the Greek, hedonia for eudaemonia). This is a harder claim to disprove, not least because there is less than complete agreement on what it means to live well. For Homer’s Greeks, living well meant, as Tad Williams wonderfully put it, “sticking a spear in you then writing a poem about it.” Aldous Huxley in Brave New World expects us to be horrified that Shakespeare is banned but thinks nothing of rewriting Oedipus Rex to give it a happy ending in Island. Attempting to answer the dual question “What is it to live well, and how well are we living?” smacks of hubris, but I’m going to try anyway.

When people say that there is an epidemic of violent crime, you can show them the statistics that show how crime has actually fallen. When they claim that people in the West are unhappy, you can show them statistics that show that in fact, most of them are pretty happy. But when they say that we have lost our spiritual bearings and are wallowing in a false contentment, then it’s not so easy to come up with a counterargument. We can’t quantify virtue or measure meaning.

Another reason why it is hard to oppose the idea of spiritual malaise is that the very fact that the question gets raised implies that something has gone wrong, especially when some of the people raising it are, by all accounts, very clever people. Something happened in the middle of the twentieth century to make some of the best minds of Europe and America decide that we had taken a wrong turning: T.S. Eliot, Karl Jaspers, Martin Heidegger, Carl Jung, Aldous Huxley, C.S. Lewis, Jean-Paul Sartre, Herbert Marcuse, Hermann Hesse, J.R.R. Tolkien … all very different people, but all united in a conviction that Western society had gone off the rails. Of course the twentieth century also had its complacent and self-congratulatory intellectuals, and modernism has famous apologists, but here I’m more interested in the nay-sayers. Why, after the optimism that opened the twentieth century, were so many intelligent people saying that everything was getting worse?

It is a feature of the best of times as well as the worst that a lot of people will think that things are getting worse, and in particular, that people are getting worse. But in the middle of the twentieth century, it really did seem like Western civilisation was destroying itself. No sooner had society started to recover from “the war to end all wars” than Europe started gearing up for another one. World War I threw the old values of patriotism and tradition into question in a way an army of nineteenth-century intellectuals could not; the rise of Nazism, though, threatened the new values of science and social progress. Some intellectuals rallied to their defence, some rushed headlong into Stalinism, some journeyed to the East, some returned to the Church, some even flirted with fascism. Like any period of rapid change, it was a mess, with more people cursing the dark than lighting candles. With half the world going crazy, it is hard to blame them.

If this intellectual nausée had been merely a reaction to the ugly state of Europe from 1914 to 1944, though, it would have dissipated, but if anything, it became stronger in the second half of the century. Whether the criticism comes from progressives or conservatives, mystics or existentialists, there is enough disillusionment with modern society to indicate that some things really did change in a serious way. Whether these changes are all that bad, though, is a matter of debate. I would say that the criticism focusses on four closely related areas: the decline of religion, estrangement from the natural world, consumerism and mass society.

Some critics of modernity, such as C.S. Lewis, T.S. Eliot and a host of Catholic converts, see the decline in religion as the main source of the malaise of modern society. Even those with no particular religious axe to grind have misgivings about the “disenchantment” (to borrow Weber’s term) that modernity brought. And yes, there has been a decline in religion in Europe (and to a lesser extent, America), and yes, this does mean we’ve lost some good things, like the way a shared faith can give purpose to a community. On the other hand, we shouldn’t forget that religion often gives us bad purposes, and if we want all the warm fuzzies that come with a strong religious community then perhaps we shouldn’t complain when they burn the occasional heretic. My personal view is that there’s a kind of ideal ratio of faith which occurs when a third of the population have definite religious beliefs, a third have some vague notions of spirituality and the rest are either committed atheists or apathetic agnostics. In any case, what we definitely do not have in today’s society is the “spiritual vacuum” that religious anti-modernists complain about. If modern society were so materialistic, how come books on spirituality sell so well? Usually what these people are complaining about is that people are getting interested in other people’s spiritual beliefs and practices. There’s a double standard involved: if someone goes to church, they are spiritually fulfilled, but if they go to a reiki class, they’re trying vainly to fill the spiritual vacuum inside them.

Estrangement from nature and the evils of industrialism have been a theme of anti-modernists on both the right and left since William Wordsworth and William Cobbett, and to be fair, they often make some very good points. The industrial revolution made life hell for a lot of people, and even after the living standards of the working class rocketed in the twentieth century, it left us with a lot of ugliness and the likelihood of a global catastrophe of Biblical proportions. (There again, we shouldn’t forget that they had Biblical catastrophes in Biblical times too, and they were much less equipped to deal with them.) It’s a major theme of my favourite anti-modernist, J.R.R. Tolkien, whose hobbits live in a rural utopia, while Saruman and Sauron go all out for industrialisation. Tolkien was influenced by his childhood, when he moved from a Hobbiton-like village to Birmingham, which in those days was a pretty good model for Isengard. Even my own childhood visits to Birmingham in the 1960s were enough to put me off big cities for a long time. But we shouldn’t forget two things: firstly, while English village life may be idyllic, it is only possible because it is (and was even before Tolkien’s day) supported by industry; secondly, technology has advanced a long way since the industrial revolution. The centre of Birmingham, which used to be black with soot, is now a pleasant place to wander around.

Even if we concede that technology can be made clean, comfortable and eco-friendly, though, will we ever regain the bond with Nature that the industrial revolution cut? Or did we really have such a bond? I’m quite happy to admit that hunter-gatherers have a relationship with their environment which is so spiritually intense I can’t really grasp it (in fact, it’s because I can’t grasp it that I’ll give them the benefit of the doubt on the spiritual side). I’m not so sure I’d grant such a holistic vision to a sixteenth-century peasant. Sure, my forebears spent a lot more time in the open air than I do (most of them were farmers, after all) but did they look at Nature any less instrumentally than we do? If anything, I’d say they had a more instrumental attitude, since for them, Nature was not something to commune with, but a way of making a living. They might not have been estranged from Nature in the way that modern city-dwellers are, but Olde England was no Findhorn. In those days, people who talked to nature spirits tended to come to nasty ends.

The loss of spiritual values and the disenchantment of Nature lead to a preoccupation with acquiring material goods. Well, that’s what we’ve been told, and I suppose it’s true to an extent. Somebody who has no spiritual life to speak of and doesn’t get off on daffodils is more likely to spend their time at the mall. But as I’ve said, the fact that we don’t all believe in the same things doesn’t mean we’ve all lost our spiritual values, nor is material greed unique to modern societies. People have probably been lamenting human acquisitiveness ever since the combination of agriculture and pottery gave us the means to acquire things. Take the Vikings, for example. Do you think they rampaged across Europe just to take in the scenery? They were after gold. OK, gold and slaves, who could later be sold for gold. And maybe a bit of amber, too. Henry VIII didn’t dissolve the monasteries because he was a pious Protestant, but because he was a greedy bastard. History is one long, sad story of people killing each other out of greed.

Now I dislike consumerism as much as the next lefty, but I have to ask myself if it is any different from normal human greed. We are told that it creates artificial desires for things we don’t really need, but then did Erik Skullsplitter really need that goblet he looted from Lindisfarne? Consumerism is bad because it places what may become unbearable strains on the planet’s resources, makes people work harder than is good for them and encourages exploitation of poorer countries. But the only thing that makes it new and different is that it allows almost all of us to do this. In the past, you really had to be somebody to pillage the world’s wealth; now any Joe Proletarian can go down to his local department store and do it. I admit I’m playing the Devil’s advocate here, but doesn’t part of the disdain for consumerism come from elitism? After all, it’s all about mass-produced junk for the masses, and the masses are never us.

This brings us neatly to the last cause of our supposed spiritual malaise: mass society, and with it, popular culture. I already (in the poetically titled “Mass Society, My Arse”) debunked the notion that the twentieth century saw the individual crushed by mass society. The forces of collectivism did their utmost to create a conformist mass society and failed; the twentieth century was the first time ever that ordinary people were able — and were sometimes even encouraged — to think for themselves. Even openly conformist mass movements like Fascism were only able to come to power because the masses had become a potent political force. Political propaganda and advertising, obnoxious though both of them may be sometimes, only arose because, again for the first time ever, ordinary people were forming their own opinions, and those opinions were making a big difference. Before the modern age, people didn’t need to manipulate the masses so much because the masses didn’t matter all that much — you might want to give the yeomanry a stirring speech before sending them to be mowed down by enemy cavalry or egg on a mob to lynch one of your political rivals, but you didn’t need a full-time propaganda machine because most of the time nobody gave a lark’s tongue what the plebs thought.

Does this render all criticism of mass society invalid? Not entirely, because even though it’s preferable for ruling classes to control the masses through TV than torture, manipulation is still not nice. But all too often, criticisms of mass society are really just criticisms of popular culture, which like criticisms of consumerism, often have elitist overtones. To say everything I want in defence of pop culture would take far too long, so I’ll just stick to one simple but often overlooked point. When people have a bone to pick with modern pop culture, they generally compare it with what they know of the culture of bygone days, so that they compare, for example, Madonna with Mozart. In other words, they are comparing someone who appeals to the masses with someone whose music was listened to by an aristocratic elite, and whose genius was recognised by a handful of them. If you want a fair comparison, you need to compare what ordinary people today listen to with what ordinary people listened to a few hundred years ago. Let no one complain that pop music is banal and repetitive until they’ve listened through all twenty-nine verses of Mattie Groves.

Taking all of the above points into consideration (as my students love to say) is there a malaise of modern society, or have we never had it so good? The people who claim that crime is skyrocketing, that we are less healthy than we were a hundred years ago or that people in the West are less happy than people in the East are simply wrong, and there is plenty of evidence to prove them wrong. The philosophical objections to the modern world are more complicated and less easily dismissed, though as I think I’ve shown here, all of them are problematic in one way or another. For me, though, the real clincher is that I am sitting at a computer writing about the malaise of modern society, and millions of people can (should they so wish) read what I write. My great grandmother couldn’t read anything anyone wrote, because she’d never learnt to read. And sure, Paradise Lost may be better written than The Da Vinci Code, but I’d still rather live in a world where the masses get to read Dan Brown than one where a handful get to read Milton.

Are E-Sports Really Sports?

The word “sport” has gone through a number of meanings. When Shakespeare said “As flies to wanton boys are we to the gods; they kill us for their sport,” he obviously wasn’t talking about sport in the same way as the International Olympic Committee, which as far as I know does not recognise fly-swatting as a sport. While we expect that kind of thing in Shakespeare (for whom “naughty” meant “nihilistic” and “punk” meant “prostitute”), some changes in meaning are comparatively recent. When a friend of mine was applying to Oxford, the octogenarian professor interviewing him said “I see you say you’re interested in sports — huntin’, shootin’ or fishin’?” When my friend replied that he actually meant football and cricket, the professor sighed and said “Ah, you mean games.” For an English gentleman of his era, a real sport had to end with the demise of an animal; propelling a ball around a field was a mere pastime. It is not surprising then, that as new activities get called sports, some will stand up and deny that they are really sports. This is nowhere truer than in the case of e-sports, which some critics seem to regard as a contradiction in terms.

A reasonable answer to the question “Are e-sports really sports?” would be “Well, kind of,” but that would take the fun out of it, and would make it hard for people to write academic papers on the subject. And write they do. This article is largely a response to Jim Parry’s paper “E-sports Are Not Sports,” from the journal Sport, Ethics and Philosophy, though I will also mention Seth Jenny’s counter-argument “Debate: are eSports sport?” and name-drop Bernard Suits, Aristotle and Wittgenstein for good measure. What makes it interesting is that not only did Parry get it wrong about e-sports, he also inadvertently revealed that the way we generally think about sports is extremely muddled. His argument against e-sports as sports rests on three characteristics he assumes are essential to sports, namely rules and institutions, direct human competition, and physical prowess. Of these, only the last is an essential feature of sports, and even that criterion is rather fuzzy.

Rules and Institutionalisation

Parry sees all sports as requiring rules. While rules are a defining characteristic of games, they are also generally found in sports; most sports are also games (e.g., football) and those that we might not count as games (e.g., mountaineering) will usually have some rules, if only for practical reasons. They may be safety rules rather than game rules in the strict sense detailed by Bernard Suits (i.e., rules that exist only to make the activity possible) but this is not an important point here. Since e-sports have a multitude of rules over and above the constraints embedded in the game’s code, that should satisfy those who insist on sports being rule-based. As Seth Jenny points out, ESL One has a “thirty-page rule book which covers event, player and game-specific regulations.”

A much bigger WTF moment comes with Parry’s assertion that a genuine sport must be “institutionalised”. Citing an article by Cem Abanzir, he claims that e-sports tournaments are organised by game publishers, who can thus impose arbitrary changes on the competition. They cannot be compared to Olympic sports, which are governed by the Olympic Committee, who sit on over a century of tradition and are at least in theory beholden to no commercial interests. As it happens, though, there are independent e-sports organisations. To quote Jenny again, “On the world’s stage, the International eSports Federation has been created while in the United Kingdom, the UK eSports Association and in South Korea the Korean Esports Association (KeSPA) have been created to standardize the sport in those respective countries. In the United States, this is being done by Major League Gaming (MLG) and ESports League (ESL).”

Jana Rodianov, famous for winning a world hula-hooping championship and having her head split open in a knife-throwing accident, which just shows that some things should not be sports

In any case, institutionalisation is an accidental rather than an essential characteristic of sports, as Aristotle would put it. Sports tend to generate sporting institutions; sporting institutions do not make sports. Parry claims that hula-hooping is not a sport because it has no regulatory body. This is not actually true (there are hula-hooping organisations who organise competitions and provide certification for instructors), but in any case it is irrelevant. Let us imagine that hula-hooping took off in a big way and was accepted as a competition by the International Olympic Committee. Would this mean that hula-hooping had suddenly transformed from a pastime to a sport? If so, does that mean that before the foundation of the Football Association in 1863, football was not a sport?

Direct Human Competition

Parry claims that while e-sports are highly competitive, they do not involve direct human-to-human competition, but are mediated by a computer system. Counter-Strike is thus no more a direct competition than a spelling bee is. This argument falls flat on two counts. Firstly, the fact that the competition is mediated is irrelevant. In motor sports, the competition is mediated by machines; drivers do not push cars around the track like Fred Flintstone but manipulate controls to power and steer their vehicles in much the same way that electronic gamers do. In response to this objection, Parry employs a No True Scotsman argument, saying, “This is motor sport, not (Olympic) sport.” Note the sneaky parentheses. Are we talking about sports or about Olympic sports? In the former case, enough people class Formula 1 racing and rally driving as sports to assume that either (a) motor sports really are sports, or (b) there is mass delusion going on. Given that Formula 1 is a sport, would there be a crucial difference between a Formula 1 race and an electronic simulation involving consoles which perfectly reproduced the feel and performance of an actual racing car? A game that simulated a sword fight by tracking players’ bodies and their manipulation of a fake sword would not be different in any important way from a fencing match (which already employs technology to determine whether a hit has occurred). The differences between simulation and reality are interesting, but irrelevant to the question of whether something is a sport.

Really not a good time to be tickled

More obviously, though, it only takes a moment’s thought to realise that there are many sports that do not involve direct human-to-human competition. Free-diving is competitive in that divers try to break each other’s records, but there is no direct competition — everyone dives alone, usually on separate occasions. It would be fun if divers could poke or tickle each other, but that would turn an already dangerous sport into a suicidal one.

Many traditional sports are non-competitive. The old professor talking about hunting, shooting and fishing may have enjoyed them but probably didn’t compete in them. Field sports (a.k.a. blood sports) can have competition grafted onto them, but it is not in their nature. The aim is not to out-perform a human, but to kill an animal. (And the days of human-animal competition are by and large over; salmon fishing in Scotland is not exactly Moby Dick.) Similarly, mountaineering may occasionally inspire competition (who gets to climb a mountain first) but it is not central to it. The important element in all these examples is challenge, not competition. What makes mountaineering a sport rather than mere exercise is the possibility of failure: you may be forced to turn back from the summit because of fatigue, injury or bad weather.

Parry is aware of this, and even mentions field-sports and mountaineering, but again his solution is not to revise his definition, but to claim that these activities are not sports. Now there are certainly times when we can and should say that something commonly called X is not in fact X. You can make a case that a party called “The Socialist Party” isn’t actually socialist, or that a language widely regarded as Altaic is actually Finno-Ugric. But the first implies a deception that needs to be exposed, while the second requires a misunderstanding that needs to be cleared up. Neither is the case with field sports or mountaineering; as with the motor sports example, these are well-known activities about which there can be no deception or misunderstanding. Neither are they historical curiosities, like gladiatorial “games”, which aren’t really games in the modern sense of the word. If a well-known sport doesn’t fit a certain definition of “sport” then it is probably the fault of the definition, not the sport. It is true that competitive sports have risen to prominence in the last century, but it is a mistake to regard competition as a defining feature of sport because it is a means to an end, and the end is challenge.

Physical Prowess

Pankration. Not porn. Honest.

Although Parry doesn’t use the word “prowess”, I will, partly because it’s a nice word and partly because it describes that combination of acquired skill and innate capacity we associate with success in sports. It is here that the detractors of e-sports are on their strongest ground. When we think of sports, we may think of Olympic athletes pushing the limits of the human body or heavily armoured American footballers slamming into each other. These are prototypical sports for the reason that sports began, and to a large extent continue, as ways for men to test their manly abilities. The original Greek games were tests not just of physical prowess but of specifically martial prowess, involving not only running and throwing things but also chariot racing and pankration, which was like MMA only naked. But while prototypes are useful in helping people place something in a category quickly, they are less useful at the boundaries of a category. We can tell immediately that rugby is a sport, we concur that rhythmic gymnastics is a sport, and we hmm and hah about whether darts is a sport. E-sports are right on the fuzzy boundary of the sports category.

Let’s examine the case against e-sports. Few people regard chess as a sport. It’s a game, and a very challenging one at that, but it is only physical in the very indirect sense that you need a certain amount of physical health for your brain to be in shape to compete in a chess tournament. The physical activity in chess — moving pieces around on a board — is completely unrelated to the game events. A player can move the piece with maximum efficiency to its intended destination and the effect is the same as if they had thrown it in the air, bounced it off their head and caught it before slamming it down on the square. (Note: some people actually do consider chess to be a sport, and the World Chess Federation is recognised by the International Olympic Committee, but counting chess as a sport would make the word “sport” almost meaningless.)

As with the mediated-competition argument, some critics argue that since there is no inherent relationship between physical acts (moving a mouse, hitting a key) and game events (running, shooting), computer games are like chess: there is a physical side, but it is not relevant to the game. This is true to the extent that there is usually no simple analogue relationship, but there usually is some relationship. Unlike in chess, the speed with which a player moves is crucial. No matter how hard I practised, I would never be much good at e-sports because at 57, I don’t have the reflexes of a twenty-something. Jenny again hits the nail on the head: “professional eSports players have been known to skillfully perform more than 300 keyboard or mouse actions a minute (some up to 10 per second).”

This is what a professional gamer looks like. (Alyson Bridge of WCG Ultimate Gamer, who now writes gluten-free cookbooks.)

The usual objection to this appeal to manual dexterity is that sports need gross motor skills, not just fine motor skills, or in other words, that the whole body must be used. Parry claims that this is true even for target shooting, but in that case it ought to be true also for shoot-’em-up games. Anyone who’s played a lot of these games will know that although the observable movement is all in your hands, the rest of your body is as important as it is in target shooting: you need to have good posture and relax if your hands are going to do their work freely. You also need to be pretty fit overall; the stereotype of the couch potato gamer certainly doesn’t fit e-sports competitors.

To be fair, it’s probably best to leave both target shooting and virtual shooting on the fuzzy boundary of the sports category, and admit we can’t make a firm decision. This is one case where institutionalisation has its uses; if the Olympic Committee decides Counter-Strike is an Olympic sport, then it is.

The Roots of the Problem

If we think more about the physicality problem, it becomes clear that it has two roots:

  1. lumping competitive computer games together under the heading of e-sports;
  2. a borderline case becoming prototypical.

Trends in sports and gaming are not overseen by the kind of people who agonise about definitions — that kind of thing comes much later, when activities become institutionalised. When people started calling activities as diverse as paragliding, mountain-biking and free-running “extreme sports”, they probably had no strict criterion of extremity in mind; it was just “stuff that could get you killed”. (Fun fact: the sport with the most fatalities is actually fishing.) Putting competitive computer games under the heading of e-sports probably seemed a good idea at the time, but “involving a computer” is not actually a good way to classify activities in an age where almost anything seems to involve a computer somewhere along the line.

Let’s look at two extremes on the physicality spectrum. No matter how competitively Civilization is played, and no matter how strictly such competitions are regulated, it should not be classed as a sport for the same reasons that Monopoly, poker or (the IOC notwithstanding) chess are not sports. It’s a turn-based strategy game in which physical prowess plays no part. Unlike many computer games, you do not need fast reflexes or good hand-eye coordination; you just need a good strategy. At the other extreme, Dance Dance Revolution is pretty damned physical, requiring not only gross motor skills but considerable stamina. It would be fair to call it a sporting activity because although it’s based on dance, the aim is not to dance aesthetically but to perform movements rapidly and accurately. (Incidentally, it has been officially recognised as a sport in Norway since 2004.)

This is how it all started — trying to build stuff before the Zerg stomped all over it.

In the middle, we have the games that are most popular in e-sports, like Starcraft, Counter-Strike and Dota 2. As we have seen, they involve physical prowess, but it is limited and not the main focus of the game or determinant of success. This is what I mean by the problem of taking a borderline case as prototypical. E-sports really got off the ground with Starcraft, a real-time strategy game that requires physical skills but is still first and foremost a strategy game. This results in the prototype for the e-sports category being far removed from the prototype of the sports category. This phenomenon is known in cognitive linguistics as a radial category: you have a base category with its own prototype (e.g., MOTHER) that then radiates other categories, each with their own prototype (e.g., SINGLE MOTHER, STEPMOTHER, BIRTH MOTHER). Whether you see your stepmother or your birth mother as your “real mother” depends largely on how you feel about them.

Conclusion: A Better Definition of (E)Sports

It should be clear by now that Parry’s definition of “sport” as “an institutionalised, rule-governed contest of human physical skill” is too restrictive. We have seen that institutions are an effect of an activity’s being a sport rather than a cause, and that competition is a means, not an end. We thus need a better definition of sport. I propose the following:

A sport is a structured activity performed primarily for the physical challenge it provides.

“Structured” is important to distinguish sports from spontaneous play, and is what results in all those rule books and committees. “Physical” is important to distinguish sports from cerebral pastimes like card games or crossword puzzles. “Challenge” can include but is not limited to competition. A physical challenge could be anything from kicking a ball into a net to climbing a mountain to shooting a target. Finally, the challenge should be a primary motivation. Performing a difficult physical activity to some other end is very different from performing it for its own sake; the former is what Suits calls “technical activity”, or in other words, work, in the broadest sense of the word. Running a race is different from running to escape a charging bull. Of course, people do sports in order to get fit, but if that is the only purpose, then it is just physical exercise that happens to resemble a sport, just like fit-boxing bears a resemblance to real boxing.

This is not a perfect definition because there is no perfect definition of sport, just as there is no perfect definition of many things (as Wittgenstein pointed out while talking about games). However, it seems to cover most things we call sports and exclude most things we don’t, which is as good as we can hope to get. At the moment, the only reasonable answer to the question “Are e-sports sports?” is another question: “Which e-sport are you talking about?”

Players or Fighters?

Taijiquan “master” about to go down

The martial corner of YouTube has been busy of late commenting on a match between two martial artists, the MMA (mixed martial arts) fighter Xu Xiaodong and taijiquan (t’ai chi) player Wei Lei. The fight was over in seconds, with Wei Lei going down almost immediately and getting pummeled on the floor until the referee ended it. Most of the online arguments were pretty silly since a practitioner of one martial art getting beaten by a practitioner of a different art says almost nothing about the relative strengths of those arts — after all, when MMA champion Conor McGregor was bested by Floyd Mayweather, no one said “See, MMA is useless, you should all learn proper boxing.” What I want to talk about here, though, is not whether taijiquan is an effective method of fighting (it is, but only if you train accordingly). The interesting thing for me, as someone who teaches English and writes about games (and sometimes writes about English and teaches games), is why we talk about taiji “players” and MMA “fighters” when taijiquan is not a game but MMA arguably is.

So what is a game? There is no definition that fits every use of the word “game” (as Wittgenstein famously pointed out) but the one most commonly used in ludology — the study of games — comes from the philosopher Bernard Suits: “to play a game is to engage in activity directed toward bringing about a specific state of affairs, using only means permitted by specific rules, where the means permitted by the rules are more limited in scope than they would be in the absence of the rules, and where the sole reason for accepting such limitation is to make possible such activity” (“What is a Game?”). In other words, a game has a goal and some rules that prevent you from using the most efficient means to reach that goal, and the rules only exist for the sake of the game, rather than for moral, legal or practical reasons. The example that is always quoted is golf: you have a specific state of affairs you want to achieve — getting a ball into a hole — and rules that limit how you can do it — you have to hit the ball with a stick rather than picking it up and putting it in the hole, which would be more efficient. Moreover, there is no reason to do it like this other than the fact that that is how you play golf. Let’s apply this definition to taijiquan and MMA and see how far it gets us.

Yang Zhengfu looking not at all playful.

Taijiquan is well known as a gentle form of exercise, kind of like yoga only not as stretchy. It’s also a martial art, though both in mainland China and in the West the combative side tends to be played down, hence the comments on YouTube about it only being for old folks in the park. Although it’s a leisure activity, it’s hard to see how taijiquan could be a game as Bernard Suits describes it. The bulk of taiji practice consists of doing the form: a series of slow movements based (often obscurely) on fighting techniques. There is no goal here; you aren’t trying to get to the end of the form by overcoming obstacles in your path. You’re not trying to finish it as quickly as possible, or even as slowly as possible. You just do it. And because there is no goal, there are no rules to limit the means you can use to get there.

There is one taiji activity, pushing hands (tuishou), that could meet the criteria: it has a “specific state of affairs” players try to achieve — getting their opponent on the ground or out of the ring — and rules which severely limit the means of doing it — no punching, kicking or prolonged grappling. The rules exist to make it an effective training exercise, but you could say they turn it into a kind of game, and indeed it is played competitively. (I nearly entered one of my students for one of the first pushing hands competitions in the UK, but unfortunately he had to go to a wedding that day.) But taijiquan as a whole is not a game. You can say its goal is health, spiritual development or kicking the crap out of people, but these are not what Suits called lusory goals. (Like “ludology”, the word comes from Latin ludus, meaning game or competition.) Like the rules of a game, the goal of a game is there to enable us to play the game. There’s no inherent value in getting a ball into a hole or a net; we do it because it lets us create an enjoyable game. This is not the case with taijiquan or other traditional martial arts.

Hard to think of this as a game (andriuXphoto)

Now let’s apply Suits’s definition to MMA, and in particular to UFC (Ultimate Fighting Championship) matches. Firstly, there is a specific state of affairs to be achieved, or rather a couple of possible states, since a match can be won either by knockout or on points, with judges “giving the most weight in scoring to effective striking, effective grappling, control of the fighting area and effective aggressiveness and defense” (Unified Rules). We can take this as a lusory goal because the contestants would not normally want to punch, kick or grapple, let alone knock each other out (Conor McGregor being a possible exception). There are also rules that limit the means you can use to achieve this goal. Fans like to think of UFC as “real fighting” (as opposed to pansy martial arts like judo or taekwondo, one presumes), but there are actually 31 different activities that are classed as fouls. When you think about it for a moment, that makes sense. In boxing, you are only allowed to hit your opponent with your fist, so there is no need for rules about kicking or grappling; MMA, by contrast, needs fouls like #12, “Kicking to the kidney with a heel”, and #9, “Small joint manipulation.” The interesting question is whether these rules are game rules. Kicking someone in the kidney with your heel is not something one is normally allowed to do anyway, and as Suits points out, moral and legal rules are not game rules: they exist for their own reasons, not because they enable a game to take place. In contrast, the offside rule in football and the en passant rule in chess are pure game rules; they were added specifically to make the game more interesting, and they have no meaning whatsoever outside the game. Rules #9 and #12, like the majority of the UFC rules, aren’t quite like these, though, since they were imposed for safety reasons. It’s not that the game is less enjoyable if you allow elbow strikes to the back of the head (fouls #10 and #11 combined); it’s that people die.

Nevertheless, there are some rules in MMA that mark it out as a game. Contestants are required to wear gloves, and although that may look like a safety measure, gloves came to be used in boxing to make matches more exciting, not safer. In bare-knuckle fights, contestants are very wary about punching each other because if you hit too hard it’s easy to break your hand. Punch a wall if you don’t believe me. Watching a couple of guys cautiously circling each other until they can get in a haymaker isn’t very entertaining. Then there’s foul #26, “Timidity, including, without limitation, avoiding contact with an opponent, intentionally or consistently dropping the mouthpiece or faking an injury.” There is nothing dangerous, immoral or illegal here; in fact, timidity is a pretty normal reaction in a situation where somebody keeps trying to punch you. But timidity spoils the game of MMA as much as aggression might spoil a game of pat-a-cake.

Strictly speaking, then, an MMA match is a game, and so is a match in any other combat sport, like boxing, wrestling or judo. So why don’t we talk about “MMA players” or “taekwondo players” just like we talk about “basketball players” or “Candy Crush players”? Why do we prefer to call them “fighters”?

The second question is easy to answer: we call MMA participants fighters because we call an MMA match a fight, and that’s because it looks like a fight. Whether it really is a fight depends on how strictly you define the word “fight”. If you see any kind of competition as a fight, then of course it’s a fight, but then so is a game of chess or a business takeover. If you see a fight as an all-out attempt to kill, injure or physically dominate an opponent, then it isn’t. It’s actually a play-fight, or if that sounds too much like something for kiddies, a fight simulation. Combat sports simulate combat just as flight simulators simulate flying or Civilization simulates history. The difference is that they’re much closer to the real thing, and of the combat sports, MMA is the closest.

The reason it sounds odd to call an MMA match a play-fight is the reason we don’t call MMA fighters players. English is perhaps unusual in having different words for “play” and “game”. In many languages, the verb and the noun are just different forms of the same word, like German spielen and Spiel, Italian giocare and gioco, or Turkish oynamak and oyun. What English doesn’t have is two different verbs for children’s play and playing a serious game. In Portuguese, for example, you can say:

As crianças estão brincando. (The children are playing.)
Os homens estão jogando cartas. (The men are playing cards.)

Similarly, ludologists sometimes exploit the distinction between Latin ludus, a serious game with rules, and Greek paidia, children’s play. This is why we talk about gladiatorial games, but we don’t think of gladiators as playing. In the same way, we don’t think of boxers or MMA fighters as playing because we don’t think of these as playful activities.

This leads me to another definition of “game” which I’ve used elsewhere: “A game is a structured activity designed to facilitate play.” This is designed to complement and refine, not replace, Suits’s definition, and although it doesn’t work for some peripheral games, like gladiatorial games or the games of game theory, it works well for most activities we’d call a game in everyday life, from tennis to Monopoly. In terms of the ludus/paidia distinction, Suits’s definition lays out the conditions for ludus, while mine adds the paidia aspect. This puts MMA right on the fringe of the “game” category, and explains why we don’t talk about “MMA players”.

That leaves us with the question of why we talk about “taijiquan players”, given that taijiquan is not a game. I suspect the main reason is just that it’s an over-literal translation from the Chinese. In China you can say of someone who does taijiquan, Tā wán tàijíquán — literally “He plays taijiquan” — but you could equally well say Tā wán yǒngchūnquán — “He plays Wing Chun.” There is nothing special about taijiquan here, and although wán translates as “play”, it’s used more like “practice”. Incidentally, the same person in the ring would be referred to as quánshǒu — literally “fist hand”, though normally translated as “boxer”. (Thanks to Steve Lee for the clarification.) However, I think the term “play” stuck not only because it translates the Chinese, but because people like the idea of “playing”. Taijiquan is full of Daoist paradoxes — speed coming from slowness, hardness from softness and so on — so it’s not surprising that the idea of fighting coming from playing would be popular.

Some years ago, I briefly studied hapkido, a crazy Korean martial art that’s like a really painful mixture of taekwondo, aikido and judo. After classes (which were usually one-on-one sessions where I got beaten up for a couple of hours) we’d go home and have drunken conversations about life, the universe and martial arts. During one of these, my teacher said, “When you study martial arts, you get to the point where you have to decide whether you’re a martial artist or you’re playing at martial arts.” My answer was “I don’t know if I’m a martial artist, but I’m definitely playing.”