Writers Talk About Writing
Ten Grammar Myths, Debunked
Even though National Grammar Day is behind us, that's no reason to stop celebrating grammar — or overturning cherished assumptions about grammar. Every year for NGD, University of California, San Diego linguistics grad student Gabe Doyle compiles a list of grammar myths that require debunking. Here's his latest roundup.
As usual for National Grammar Day, I'm taking the opportunity to look back on some of the grammar myths that have been debunked here over the last year. But before I get to that, let's talk briefly about language change.
Language changes. There's no question about that — just look at anything Chaucer wrote and it's clear we're no longer speaking his language. These changes aren't limited to the periphery of the language; they reach its very core. Case markings that were once crucial have been lost, leaving us with subject/object distinctions only for pronouns (and even then, not all of them). Negation, tense marking, verbal moods: all of these have changed, and they continue to change now.
Some people take the stance that language change is in and of itself bad, that it represents a decline in the language. That's just silly; surely Modern English is no worse than Old English in any general sense.
Others take a very similar, though much more reasonable, stance: that language change is bad because consistency is good. We want people to be able to understand us in the future. (I'm thinking here of the introductory Shakespeare editions I read in high school, where outdated words and phrases were translated in footnotes.)
So yes, consistency is good — but isn't language change good, too? We weed out words that we no longer need (like trierarch, the commander of a trireme). We introduce new words that are necessary in the modern world (like byte or algorithm). We adapt words to new uses (like driving a car from driving animals). This doesn't mean that Modern English is inherently better than Old English, but I think it's hard to argue Modern English isn't the better choice for the modern world.
Many writers on language assume that the users of a language are brutes who are always trying to screw it up, but the truth is that we're not. Language users are trying to make the best language they can, according to their needs and usage. When language change happens, there's a reason behind it, even if it's only something seemingly silly like enlivening the language with new slang. So the big question is: is the motivation for consistency more or less valid than the motivation for the change?
I think we should err on the side of the change. Long-term consistency is nice, but it's not of primary importance. Outside of fiction and historical accounts, we generally don't need to be able to extract the subtle nuances from old writing. Hard though it may be to admit, there is very little that the future is going to need to learn from us directly; we're not losing too much if they find it a little harder to understand us.
Language change, though, can move us to a superior language. We see shortcomings in our native languages every time we think "I wish there was a way to say…" A language is probably improved by making it easier to say the things that people have to or want to say. When a language change appears, there's presumably a reason for it; when it's widely adopted, there's presumably a compelling reason for it.
The benefits of consistency are fairly clear, but the exact benefit or motivation for a change is more obscure. That's why I tend to give language change the benefit of the doubt.
Enough of my philosophizing. Here's the yearly clearinghouse of 10 busted grammar myths. (The statements below are the reality, not the myth.)
- Each other and one another are basically the same. You can forget any rule about using each other with two people and one another with more than two. English has never consistently imposed this restriction.
- There is nothing wrong with I'm good. Since I was knee-high to a bug's eye, I've had people tell me that one must never say "I'm good" when asked how one is doing. Well, here's an argument why that's nothing but hokum.
- The S-Series: Anyway(s), Backward(s), Toward(s), Beside(s). A four-part series on words that appear both with and without a final s. Which ones are standard, and where?
- Amount of is just fine with count nouns. Amount of with a count noun (e.g., amount of people) is at worst a bit informal. The combination is useful for suggesting that the pluralized count noun is best thought of as a mass or aggregation.
- Verbal can mean oral. In common usage, people tend to use verbal to describe spoken language, which sticklers insist is more properly described as oral. But outside of certain limited contexts where light ambiguity is intolerable, verbal is just fine.
- Twitter's hashtags aren't destroying English. I've never been entirely clear why, but many people insist that whatever the newest form of communication is, it's going to destroy the language. Whether it's the telegraph, the telegram, text messages, or Twitter, the next big thing is claimed to be the nail in English's coffin. And yet, English survives.
- Changing language is nothing at all like changing math. Sometimes people complain that allowing language to change due to common usage would be like letting the angles of a triangle sum to more than 180 degrees if enough people thought they did. This is bosh, and here's why.
And a few myths debunked by others:
- Whom is moribund and that's okay. (from Mike Pope) On rare occasions, I run across someone trying very hard to keep whom in the language, usually by berating people who haven't used it. But the truth is that it's going to leave the language, and there's no reason to worry. Mike Pope explains why.
- Uh, um, and other disfluencies aren't all bad. (from Michael Erard, at Slate) One of the most interesting psycholinguistic papers I read early in grad school was one on the idea that disfluencies were informative to the listener, by warning them of a complicated or unexpected continuation. Michael Erard discusses some recent research in this vein that suggests we ought not to purge the ums from our speech.
- Descriptivism and prescriptivism aren't directly opposed. (from Arrant Pedantry) At times, people suggest that educated linguists are hypocritical for holding a descriptivist stance on language while simultaneously knowing that some ways of saying things are better (e.g., clearer, more attractive) than others. Jonathon Owen shines some light on this by representing the two forces as orthogonal continua — much more light than I've shone on it with this summary.
- Some redundant stuff isn't really redundant. (from Arnold Zwicky, at Language Log) I'm cheating, because this is actually a post from more than five years ago, but I found it within the last year. (This is an eleventh myth anyway, so I'm bending rules left and right.) Looking at pilotless drones, Arnold Zwicky explains how an appositive reading of adjectives explains away some seeming redundancies. If pilotless drones comes from the non-restrictive relative clause "drones, which are pilotless", then there's no redundancy. A bit technical, but well worth it.
Gabe Doyle is a graduate student in Linguistics at the University of California, San Diego. His research uses computer models to better understand how people represent language in their minds. He also writes about grammar myths and misconceptions at the blog Motivated Grammar.