As a voracious reader and great lover of language, C.S. Lewis was concerned about “verbicide,” what he called the “murder of words.” As Lewis describes in Studies in Words (7-8), verbicide happens in a number of ways:
- Inflation of a Word’s Value: “Inflation is one of the commonest; those who taught us to say awfully for ‘very’, tremendous for ‘great’, sadism for ‘cruelty’, and unthinkable for ‘undesirable’ were verbicides.”
- Fake Superlatives: “Another way is verbiage, by which I here mean the use of a word as a promise to pay which is never going to be kept. The use of significant as if it were an absolute, and with no intention of ever telling us what the thing is significant of, is an example. So is diametrically when it is used merely to put opposite into the superlative.”
- Politics and Advertising: “Men often commit verbicide because they want to snatch a word as a party banner, to appropriate its ‘selling quality’. Verbicide was committed when we exchanged Whig and Tory for Liberal and Conservative.”
- Show vs. Tell: “But the greatest cause of verbicide is the fact that most people are obviously far more anxious to express their approval and disapproval of things than to describe them. Hence the tendency of words to become less descriptive and more evaluative; then to become evaluative, while still retaining some hint of the sort of goodness or badness implied; and to end up by being purely evaluative—useless synonyms for good or for bad. We shall see this happening to the word villain in a later chapter. Rotten, paradoxically has become so completely a synonym for ‘bad’ that we now have to say bad when we mean ‘rotten’.”
We can see here that Lewis has some similar concerns as George Orwell in his “Politics and the English Language.” Words can be politicized or bent into the service of those who are peddling products or ideas. 2016 was particularly ripe as it was a deeply divisive political year (Brexit, Trump, ISIS, etc.).
Beyond the capital-P politics of the moment, though, is the social reality of a culture that is running out of effective superlatives. I find myself saying “super duper interesting.” How have I come this far? “Fine” is a loaded term, and “very” doesn’t do what we need it to do. And when we need a word because it is so very relevant–like “Trumpery,” or “truth,” or “evidence,” or “third way”–we find that it has died too, or has been nefariously co-opted.
It is not just a verbicidal age, but we are verbicides: we are word-killing maniacs wandering around the digital library of culture with guns for tongues.
Lewis warns us that we cannot recover these words by simply returning to the past, though there are some authors who have a nice way of helping good readers recover words. Wait for someone to mispronounce an uncommon word, and you will find a good reader who is courageously trying out a word or phrase in real life, never having heard it said out loud before.
Instead, Lewis suggests that we “resolve that we ourselves will never commit verbicide” (Studies in Words, 8). When we see words going bad–he mentioned “adolescent” as synonymous with “bad” and “contemporary” as synonymous with “good”–he suggests that “we should banish them from our vocabulary” (Studies in Words, 8). In so banishing words under societal threat, the best of these words might finally die and find new life (as his two examples, which are now more technical words). I’m suggesting, then, five words that are either on death row or being hunted by the hangman’s dogs.
I don’t know when “seriously” came into my mind with a Sweet Valley High accent, but when I hear the word “literally,” I now add that mindless, Mean Girls SoCal pain-streaked whine. “Literally?” Literally.
I suspect that “seriously” was McKidnapped in the mid-90s, but we have been killing “literally” for a very long time. In pop culture and politics, this word was decimated long ago, becoming a synonym for “actually” or just a mindless verbal tic. I’m hardly the first to notice this–see Slate, The Guardian, NPR, and Boston.com. They didn’t literally beat me to the punch, but they did so metaphorically.
But this word has been bastardized in a second way. Almost anyone who says these phrases–“I only read the Bible literally” or “We can’t take the Bible literally”–has no idea what they are talking about. Literally. Actually.
Note: the guys in the video stole my idea before I said it out loud. Those sensitive to crude language might want to quit after 60 seconds or so. And here is a guy great at Plinko and very bad at “literally.” Totally.
Some words are simply digital: they have an on/off relationship to language. While we might say, “she’s the chief mind in that organization,” we never say “she is the very chief mind….” Or we shouldn’t, because it is dumb.
Yet, as of late, I hear phrases like:
- sort of an absolute decision
- kind of the main thing
- that speech was utterly meaningless
- the car is very stationary
I suppose we could defend phrases like “nearly worthless” and “almost unanimous,” if we had to. But do we want to have a phrase like “sorta pregnant?” Pregnancy is digital, on/off, even if it sneaks up on you. Besides, how often do you want to go around asking women how pregnant they are? Bad plan. If you don’t know, you probably shouldn’t ask.
This is the case with “unique.” It is an incomparable adjective, and should be left alone as one. Something is either unique or commonplace to some degree. Now we hear, “very unique,” “kind of unique,” and, I’m afraid, “literally unique.”
Actually, that last one could work if people knew how to use “literally.” Sort of absolutely I guess.
I don’t know if this garbled tongueship is related to a culture that finds meaning in phrases like, “there is no absolute truth” or “language is a system of signs for which there is no ultimate meaning.” Or maybe we are just lazy. Either way, let’s banish “unique” and try to describe what we mean instead of just telling it. We may get this one back some day, but in a bigly world like ours, it is a poor thing to hope for.
Okay, I admit it. I’m soapboxing here a little bit. I get tired of people using the word “allegory” for almost any literature with symbolic layers.
The most tiresome–but most understandable–accusation of allegory comes against The Lion, the Witch, and the Wardrobe, but critics have said the same of The Lord of the Rings. Sauron’s Ring was suspected of being a secret representation of the nuclear bomb or the armies of Germany or the post-industrial technocracy that descended upon Europe. While Tolkien admitted that myth-making sometimes requires allegorical language (The Letters of J.R.R. Tolkien, 145), he disliked allegory and did not use it as a technique in LOTR.
Tolkien did use allegory in Leaf by Niggle, as Lewis used it in Pilgrim’s Regress. They knew how to use allegory, and knew where there were allegorical elements in their work. Lewis even wrote an academic treatise on the topic, and I argued in “Is Narnia an Allegory?” that he knew what he was talking about. We should consider listening to them.
However, people usually mean something a little different when they connect allegory to works like Narnia or LOTR–and more recently to Harry Potter, the works of Madeleine L’Engle and Ursula K. Le Guin, and, believe it or not, Margaret Atwood’s Handmaid’s Tale, now in a miniseries. Sometimes they just mean that there is “something going on” in the text. Father Time and Aslan are obviously meaning-filled characters in Narnia. Harry Potter is a Christ figure, and Charles Wallace, well–something’s off with that weird little dude. Le Guin is a feminist tale-teller, and Atwood brings all the history of abuse against women into a single post-apocalyptic regime. There’s something going on here.
If people want to call those things allegory, there’s not much we can do. Both Tolkien and Lewis joked that anyone who wants to find allegory in a text is bound to find it. At best it’s a literary face at the bottom of the well; at worst, it’s ignorance.
But sometimes people really mean “that book is bad” when they say “that book is an allegory.” I know that seems like a stretch, but the logic is clear:
- I don’t like allegory.
- I don’t like Book X (Narnia, LOTR, fantasy, feminist books, books with lots of words).
- Therefore, Book X is an allegory.
Seriously, literally, I heard someone say, “Animal Farm can’t be allegory because I loved that book.” Okay folks, let’s commit to only using “allegory” if we have a clue what we are talking about.
And for a chuckle, check this out.
Have you encountered a Truther lately? Usually, this refers to someone who passionately believes something despite public opinion or the most obvious evidence. Truthers are different from people who believe against evidence (e.g., that the Toronto Maple Leafs will ever win the Stanley Cup) or those who go against public opinion (e.g., those who believe that it is worth providing rural kids with a great education).
Truthers combine conspiracy with puzzling intellectual oddities. 9/11 Truthers were interesting, emerging both as a government-conspiracy theory and as an anti-Bush phenomenon. There is something that connects anti-vaxxers, birthers, the Obama-as-antichrist crowd, and the Sandy Hook conspiracy theorists. I don’t doubt they have a desire for the truth, but their truth isn’t related to the evidence that is in front of us.
These folks are different from those who peddle “truthiness.” The spin doctors of politicians and celebrities have done their work vivisecting the word “truth,” so that it is unclear whether it has any meaning left. So we now find ourselves in a world where President Trump can, on a Sunday, rip the comments of the grieving Mayor of London radically out of context, and we can’t expect any accountability for a president willing to speak with such committed ignorance and carelessness. Why should we? We live in a post-truth world. What’s evidence, fact, or even common decency got to do with it when you have the most powerful opinion and 154,231 twitter fans? This moment has not been helped by the Truth o’ Meter folks. And, honestly, the death of truth has been sped along by the media calling every misstatement in the last election a “lie” when they are against a candidate and an “untruth” when they are for them.
Since nuance is impossible–and since any culture watcher knows this all leads to some sort of catastrophe–I call for a ban on mistreating the word “truth.” No words like truthiness, truthicity, trutharama, truthopolis, truth-gate, post-truth, quasi-truth, truth o’ meter, truther, trutheses, truthpocalypse, or truthishness. We’re going to need that word at some point. I suppose, though, I am a loser for expecting the truth.
This one is quite dear to me, and we may not yet be at the point of needing complete banishment. We are certainly at a point where there is a hunger for data and statistics. I have contributed to this myself, posting blogging data (here and here) and my reading data (see 2016 here). In case someone accuses me of being a flip-flopper, I’ll admit that I love a flowchart, graph, or statistical chart as much as the next guy. So this one is a bit of a self-check, in case I too may be in danger of verbicide.
“Big data” has become a real factor in thinking about public policy, investment, higher education, and immigration reform. There are new reports daily about a million different questions, and I do my best to follow the trends as they pop up in government data, surveys, and other types of research. And I’m not alone: “data” was a 2016 buzzword on a number of lists.
Intriguingly, this is a trend that seems to run exactly counter to the post-truth/truthiness/truther deal. People want data to help them read the cultural moment, and to a certain degree data can be helpful. But I think there are three dangers.
- Danger the First: Data is Most Useful for Longterm Trends: As people clamour for data on new questions that pop up–such as what happens if DC goes bankrupt, Brazil’s economy globalizes, the UK leaves Europe, or 100,000 international students change their destination from the U.S. to other countries–they don’t always get that some of our questions just don’t have data that goes back very far. Reading data takes patience and the wisdom of time: if you don’t have these, you might as well just make things up. It is, after all, just the assertion of data that can get you put in a position of power.
- Danger the Second: Most People Misunderstand Data: This is particularly true of survey data. Why did the media (with exceptions) get the 2016 election and the Brexit vote wrong? Because they don’t understand how the numbers work. People should simply stop looking to data if they haven’t taken the time to understand it.
- Danger the Third: People Aren’t Data Points: This is the biggest danger. While data on Millennials tells us interesting things about a generation, strictly speaking no individual is a “Millennial”–the perfect example of the whole age. Trends are too big and people too individual for data to tell us what is happening in the human heart. I suspect that Brexit and the Trump election both come down to this single point: it felt to many that the liberal elite didn’t understand what everyday life is like for normal folk. It is pretty hard to predict what any one person will do, even if we can make some guesses in the aggregate. And sometimes it is the individual that matters.
Those are my five words that we should set out to pasture. What words have you had enough of? Or what words do you wish you had back? Let me know in the comments, on Twitter @BrentonDana, or on Facebook.