English usage lore is full of myths and hobgoblins. Some have the status of zombie rules, heeded by millions despite being bogus and illegitimate since forever (split infinitives, preposition-stranding). Other myths attach to particular words and make people unsure how to use them ‘properly’ (decimate, hopefully), leading in some cases to what linguists call ‘nervous cluelessness’ about language use.
These myths spread and survive for various reasons. On one side is the appeal of superiority. On the other is fear of embarrassment: We play it safe rather than risk ridicule and ‘correction’. We are (often to our detriment) a rule-loving species, uncomfortable with uncertainty and variation unless we resolve not to be. We defer to authority but are poor judges of what constitutes good varieties of it.*
So if a self-appointed expert on English asserts a rule, some will lap it up no matter its validity. The unedifying results are laid bare in reference works like the Merriam-Webster Dictionary of English Usage (MWDEU), which, with rigour and wit, summarises centuries of confusion and argument over whether A or B is correct when often both are or each is appropriate in a different variety of English.
Huge effort is wasted on such trivialities. So, as a quick exercise in myth-busting (and amusing myself), I posted an A to Z of English usage myths on Twitter last week. Reactions were mostly positive, but some items inevitably proved contentious, as we’ll see.
You can click through on this initial tweet for the full A–Z plus supplements on Twitter, or you can read the lightly edited version below, followed by extra notes and quotes now that the 140-character limit doesn’t apply.
A is for ALTERNATIVE. Peevers say you can’t have more than two alternatives, because Latin. This is the etymological fallacy.
B is for BEG THE QUESTION. Pedants say it can’t mean ‘raise the question’, but the data beg to differ.
C is for CONJUNCTIONS or COORDINATORS. Because sticklers say you can’t start a sentence with them. And that’s just plain silly.
D is for DECIMATE = ‘destroy’; DIFFERENT THAN; DOUBLE NEGATIVES; and DUE TO = ‘because of’. All are fine, if not always appropriate.
Anyone who says DECIMATE must mean ‘kill one in ten’ should also call October ‘December’. That’s the rule.
E is for ENTHUSE, a back-formation in long, useful existence. Some purists consider it ‘wrong’ and ‘stupid’. It’s neither.
F is for FLAT ADVERBS, like ‘drive SLOW’. Peevers think they’re adjectives used wrong(ly). Doubly false.
G is for GRAMMAR: applied unhelpfully to spelling, punctuation, diction and random bugbears, seldom – outside linguistics – to syntax and morphology.
H is for HOPEFULLY. Die-hards say it can only mean ‘in a hopeful manner’. Hopefully they’ll cop on, because frankly that’s daft.
I is for IRREGARDLESS. Haters say it’s not a word, because that’s their wish. It’s nonstandard, but it is a word. Look it up.
I is also for INVITE. Peevers irrationally hate nouning and verbing. This one’s unsuited to formal use, but colloquially it’s OK.
J is for JEOPARDIZE. Webster 1828 called it ‘useless’; a later critic found it ‘foolish and intolerable’, jeopardizing his cred.
K is for KEY = ‘vital, essential’. Prescriptivists even in the 21stC reject its status as an adjective. The usage dates to 1832.
L is for LESS. Pedants loathe its use with count nouns (‘less pedants’) despite 1000 years of regular use in literature and speech.
L is also for LITERALLY. Some hate its non-literal use, but it’s been around for literally centuries.
Also: ‘literal meaning’ is literally a contradiction in terms. The peeve is (figuratively) hoist with its own petard.
M is for MAN. Reactionaries say it’s fine as a generic term for people. No: it’s sexist and insidious.
N is for NONE. Peevers say it must take a singular verb, ignoring a thousand years of common use with singular and plural verbs.
O is for ONGOING. Haters hate it; style guides try (and fail) to ban it. There’s nothing wrong with it.
P is for PASSIVE VOICE. Said to be bad by peevers, who usually misidentify and always mischaracterize it.
Q is for QUOTE. Purists say it can’t be a noun. They confuse formality with correctness and overlook the benefits of brevity.
R is for REASON WHY. Sticklers hate its redundancy but not that of ‘place where’ or ‘time when’. It’s 800 years old and standard.
S is for SPLIT INFINITIVE. The bogus rule against it is based on Latin-fetishism and leads to mangled English.
T is for TRANSPIRE. Peevers say it can’t mean ‘occur’ – a usage centuries old and standard. There’s no sound basis for the objection.
U is for UNIQUE. Pedants insist that it’s absolute: that nothing can be ‘very unique’. Their argument is illogical and spurious.
V is for VERBING. Peevers hate it, but only when they think it’s new – they constantly use verbings that were established earlier.
W is for WHICH in restrictive clauses. Peevers insist on ‘that’. The so-called rule is a useless upstart.
W is also for WHOM. Pedants say it’s obligatory in object position (‘who to follow’), but normal English is much more flexible.
X is for XMAS. Traditionalists abhor this short form, unaware of its thoroughly traditional use. X for Christ is a millennium old.
Y is for YOU, singular, once decried the way singular THEY is nowadays. Why accept one and not the other?
Z is for ZOOM. Theodore Bernstein said that this term, being from aviation, should only mean ‘upward mobility’. English went ¯\_(ツ)_/¯
A few readers were happily on board until they reached a particular bugbear. That’s fair enough. I should say, for the record, that I had to remove my editor’s hat for certain items. I’m not advocating for irregardless, despite what some inferred: I’m just saying (uncontroversially and irrefutably, I’d have thought) that it is a word. Yet even this observation drew real hostility.
Nor am I saying these usages are all hunky-dory: some are unsuited to formal or standard English. This brings us to a central problem in popular conceptions of language use. The belief that standard English is ‘correct’ and ‘proper’ and other varieties are therefore substandard or illegitimate is widespread, wrong, and destructive. Different forms of language are appropriate in their own domains. Standard English is socially privileged, not linguistically superior to any other dialect.
Less for count nouns drew a mixed reaction. I’ll be addressing this item in a future post. Someone said they despise all flat adverbs. That must be awkward. The tweet about man provoked some defensive reactions (from men). Beg the question was bewailed and lamented. My post on it suggests solutions.
There were a couple of queries about the December thing. It was the tenth month of the old Roman calendar, and the word has the same root (Latin decem ‘ten’, from PIE *dekm-) that we find in decimate. The etymological fallacy produces no end of absurdity like this, but people resist information that contradicts their old beliefs; they’ll conjure up all kinds of nonsense to avoid changing their minds.
One reader balked at unique and found, in Oxford Dictionaries: ‘Being the only one of its kind; unlike anything else’. He concluded, prematurely: ‘That’s the definition.’ But the same page, same source, offers sense 1.2: ‘Particularly remarkable, special, or unusual.’ Here’s a tip for using dictionaries: Read past line 1.
Gradable unique cropped up in the comments here before, and recurs in collections of peeves. Purists assert that being unique is like being married or dead or pregnant: you either are or you aren’t. They make similar hyper-literalist claims about perfect, certain, equal, and other adjectives, even though all these words have been modified by great writers for centuries (a more perfect union, anyone?).
Steven Pinker, in The Sense of Style, defends gradable unique:
Here is the flaw in the purists’ logic. Uniqueness is not like pregnancy and marriage. It must be defined relative to some scale of measurement. I am told that all snowflakes are unique, and so they may be under a microscope, but frankly, they all look the same to me. Conversely, each of the proverbial two peas in a pod is unique if you squint hard enough through a magnifying glass. Does this mean that nothing is unique, or does it mean that everything is unique? The answer is neither: ‘uniqueness’ is meaningful only after you specify which qualities are of interest to you and which degree of resolution or grain size you’re applying. . . .
Calling something quite unique or very unique implies that the item differs from the others in an unusual number of qualities, that it differs from them to an unusual degree, or both. In other words, pick any scale or cutoff you want, and the item will still be unique. This ‘distinctive’ sense has coexisted with the ‘having no like or equal’ sense for as long as the word unique has been in common use.
MWDEU, after surveying the history of the word and the commentary thereon, concludes: ‘Those who insist that unique cannot be modified by such adverbs as more, most, and very are clearly wrong’. This is not to say that you should say ‘very unique’: neither Pinker nor M-W recommends this. But the purists’ claims reflect only their wishes, not the reality of English or the world it seeks to describe.
Intransigence over language use is a recipe for irritation, given how inexorably it keeps changing. Everything about language use changes: meanings, sounds, structures, spellings. Wishing this would stop is like wishing the clouds would freeze in place or the tide would leave your sandcastles alone. A lot of usage myths arise in the gap between the true nature of language and sticklers’ false beliefs about it.
Peevers who are not satisfied with grumbling to themselves become language police, ‘grammar’ cops playing inept games of gotcha with other people’s speech. They don’t stop to think they may be wrong and rude, or that their targets might use a different dialect or have learned English as a second language or have been blessed with less education. They forget their privilege and their manners.
Linguistic fault-finding gives language lovers a bad name. We end up characterised as intolerant fusspots obsessing over ‘correctness’ and pedantic niceties in every human interaction, silently judging our peers’ speech. On the contrary: linguists (and linguistic dabblers like me) love language in all its diverse wonder. We are infinitely more likely to be intrigued by error and delighted by variation than irritated by usage.
Usage myths, on the other hand – don’t get me started.
As I’ve written elsewhere, the aim of many of these myths is to maintain anachronistic shibboleths that allow in-group members to feel pleased about knowing them, to present as authority figures, and to stigmatize other groups. Thus they preserve – or imagine they preserve – their status and power in social hierarchies.
You don’t have to recognise it. Language has no ultimate authority except its users, and your vote is as important as anyone else’s.
[Previously: A to Z of linguistics in rhyming couplets.]
* Not just linguistically, you may have noticed.
Love this post! Especially near the end where you mention being “infinitely more likely to be intrigued by error and delighted by variation than irritated by usage.” A hearty amen.
Thanks, Sharon! It’s a healthier attitude, I find, and not remotely at odds with being a copy-editor.
I would modify your “I” a bit, by leaving out “and verbing”, which is the subject of “V”. In addition, calendar invite ‘email sent to be directly interpreted by the recipient’s calendar program’ is entirely formal usage, and actually has more ghits than calendar invitation.
Fair point about verbing. (Confession: I didn’t plan it all before I began tweeting it.) Calendar invite is not a term I’m too familiar with in any register, so that’s an interesting exception.
I would add, though, that Sheidlower is being disingenuous when he says that letters have no meaning and therefore literal meaning is an oxymoron. A single letter typically has no meaning, but a sequence of letters definitely does, and the opposition between literal and non-literal meaning is very old, dating back to Middle English and at least a century before in Latin. The OED’s earliest citation from 1450 says (spelling modernized): “To the literal understanding it [Jerusalem] signifieth an earthly city […] to allegory it signifieth holy church […] to moral understanding it signifieth a Christian soul.”
Are they not being *deliberately* spurious, to highlight the nonsense of “the literal meaning of literal”? The word “literally” as derived from “by the letter” makes perfect sense if you accept that it is itself figurative. It only falls apart if you insist that its meaning is, well, literal.
What is a myth is calling these things “myths”. These are preferences.
For instance, the matter of splitting an infinitive is not a rule; it is simply a stylistic guideline. This guideline should be followed not because some authority said so, but because adhering to it will usually result in the most aesthetically pleasing form of the sentence.
A writer may violate this or any other guideline if that writer so chooses, so long as that writer understands that doing so will produce writing that some other people will find ugly and unpleasant.
They are preferences and myths. You can, for example, decide to restrict your expression by using hopefully only to mean ‘in a hopeful manner’. But the widespread and false idea that this is the word’s only correct meaning is a myth.
Lexicographer John Ayto says the bogus rule against split infinitives ‘has been so consistently and energetically condemned by self-appointed guardians of English grammar that generations of speakers and especially writers have been terrorized into avoiding it’. If it was merely a guideline, it would not have attained such notoriety. But because it was pressed on impressionable minds with the imprimatur of grammatical necessity, it became the epitome of a zombie rule.
My post on split infinitives has numerous examples of how obeying the pseudo-rule can lead to awkward or downright confusing syntax. Your claim that it ‘will usually result in the most aesthetically pleasing form’ is difficult to square with the evidence.
More generally, if you argue for all manner of usages on the basis that “if lots of people do it, then it’s right” (which is the correct scientific way of doing linguistic analysis), then you must agree that a particular style preference (be it the avoidance of the split infinitive, or the retention of the distinction between “less” and “fewer”, or any other one) that is espoused by lots of people must be descriptively acknowledged as a functioning standard within the contemporary language, at least in the more formal registers.
What many who posit the “prescriptivism vs. descriptivism” narrative do not seem to understand is that even those people who pass judgement on others’ usage do not tend to believe that there is anything inherently valuable in the standards which they are defending. For example, they know that “fewer items” is no more logical than “less items”. However, the traditional standards function as markers of education; so the use of certain forms of the language is meant to signal a membership in the educated stratum of society. And the annoyance at other forms is simply another way of saying “You should have paid attention at school, mate”.
Finally, I’d like to point out that, when you express your disapproval for the use of “man” to mean “humanity” based on an ideological position (a position with which I wholly agree, I might add), while dismissing the relevance of people’s usage practices regarding that word, you are in fact engaging in an act of prescriptivism.
In sum, I will say that it is not meaningful to rail against “prescriptivism”. We all should be encouraged both to promote our desired usage standards (especially when they be educated standards) and to understand our responsibility to honestly observe the language community as a whole in order to determine what the overall norms are.
(And I hope that you noticed the split infinitive in the last sentence, as well as the avoidance of it in several other instances where it might have been employed. Sometimes splitting the infinitive is the better choice; most times it ain’t.)
The position ‘If lots of people do it, then it’s right’ goes too far, in my book. Lots of people put an apostrophe in possessive its, but by current norms it’s indisputably wrong. Avoiding split infinitives and observing the less/fewer distinction are different, obviously: both styles are standard, as are their alternatives, though problems can arise with either approach.
I’m not sure if you think I’m advancing the ‘prescriptivism vs. descriptivism narrative’. For the record, I think it’s a false dichotomy and have repeatedly said as much, here and elsewhere. I disagree that prescriptivists generally don’t think there’s anything ‘inherently valuable’ in the standards they espouse. Quite the contrary: their criticism frequently has an unpleasant moralistic or ideological flavour.
Of course I’m engaging in prescriptivism when I advise against generic man. Some prescriptions are useful, and I have no qualms about sharing them, or imposing them in my job as an editor. But it’s not helpful to consider prescriptivism monolithic. My beef is with ill-informed prescriptivism, of which there is an endless supply. (You may have missed my post on reconciling descriptivism with editing.)
PS: I deleted four paragraphs from your comment, as it was a bit excessive. My comments policy is here.
Oops! My prolixity strikes again, as I believe that a comment of mine had once before been truncated on your blog. I’m sorry for ignoring that rule.
Anyway, I realise that you tend not to engage in the over-simplified narrative of “prescriptivism vs. descriptivism”; my intent was to criticise that practice where it does exist.
And I think that the moralistic and ideological flavour that you notice in language critiques is connected to traditional usage’s being a marker of education, and is therefore a result of the defence of education rather than of a defence of the inherent value of the language forms themselves.
Ultimately, I cannot accept your characterisations of many of these stylistic preferences (such as the avoidance of “reason why”) as “myths”. Such an act of characterisation positions the maker as an impartial observer who can establish objective classifications. But the truth is that every speaker is a participant in the life of the language and — more important — is a partisan of and a proponent of his/her own particular style.
Your last line is well put. I would support it heartily except for the use of his/her. It’s not just a matter of this and similar forms being awkward: they also explicitly exclude people who don’t identify categorically as male or female. Singular they for the most part bypasses this problem and is therefore intrinsically more inclusive. When it comes to gender identity, objective classifications (to use your term) are not always clear-cut or possible. I don’t expect to persuade you on this usage, but I think it’s important to note.
And thank you for the lively and civil debate.
You imply that at least some prescriptivism is a form of snobbery. To that extent it is certainly to be railed at. In my book snobbery is closely related to racism and is equally deplorable.
There can be no ‘rules’ in English as there is no recognised institution to formulate them. What we have is a vast tangle of regularities and patterns. Most of these (98%?) are universal to all registers. And must be, otherwise we couldn’t understand each other. We pick and choose among the 2% depending on many factors, including education, fashion and group identification.
It’s very like the way we dress. There are universal patterns: keeping warm/cool/dry, concealing certain parts of the body. But within these constraints there is enormous choice.
Observe, don’t judge. That should be our watchword.
Exactly. I think rules can often be usefully replaced by conventions in discussions of English grammar and usage. Or, when the ‘rules’ are dubious, we can put them in scare quotes (as you’ve done) or prefix them with bogus, pseudo-, non-, or some such modifier, lest they be taken as holy writ.
> adhering to it will usually result in the most aesthetically pleasing form of the sentence.
This myth wasn’t created due to æsthetics. It was created in order to imitate Latin. Imitating Latin ≠ being æsthetic.
The idea that not splitting infinitives results in prettier sentences is ridiculous and easily falsifiable by even a cursory glance. Far from a good style, adhering to this pointless Latinism will just generate clunky, unwieldy, ugly English sentences.
Avoiding the split infinitive most often produces a style that is more pleasing and elegant. “She began to walk gracefully down the path” is superior to “She began to gracefully walk down the path”. This is the typical example.
Of course, once in a while, avoiding the split infinitive produces clunky writing — in which case the writer should definitely go ahead and split it. The famous example of “to boldly go where no man has gone before” is the obvious one; this version is much better than a version that goes “to go boldly…”.
It is not surprising that someone can cite examples in which not splitting the infinitive gives us awkward phrasing. But this proves nothing, as, for every such example, there are myriad cases in which avoiding the split infinitive is the more aesthetically pleasing choice.
Students should be taught a strong preference for not splitting the infinitive, but should at the same time be able to recognise the infrequent occasion when doing so yields the prettier phrase.
‘More pleasing and elegant’, ‘aesthetically pleasing’, ‘prettier phrase’ – this is all a matter of taste. Ours differ. You have, luckily, no right to impose yours on other people. They may decide for themselves what sounds better to their ears.
Nor do I see what makes your ‘typical example’ typical. I too prefer she began to walk gracefully over she began to gracefully walk, but you’ve neglected to mention she began gracefully to walk, an ambiguous style to which many split-infinitive-avoiders default, and which regularly gets them in trouble.
It may be helpful to compare versions of a more typical phrase, such as continued to strongly influence vs. continued to influence strongly and continued strongly to influence. All three are common in Google Books, but the split form is far more common than the other two put together, which is telling given how many copy-editors have internalised the zombie rule.
‘Students should be taught a strong preference for not splitting the infinitive…’
They absolutely should not be taught this ridiculous superstition.
I can’t help but feel that learning to say “I personally object to what that person is doing, but I accept there’s nothing objectively wrong with it, and no, nobody Ought To Make A Law Against It” would for many people be a useful and necessary lesson in tolerance outside of the matter of linguistics. Ditto your comments about linguistic privilege. Attitudes to language are so often a microcosm of attitudes to society.
Agreed on all counts, John. When people gripe about some aspect of language use, they’re often complaining – under the guise of language – about a person or group of people they dislike in some way, or airing concerns about broader cultural developments like the pace of social change (‘kids these days’). Learning to contextualise and rein in or redirect these impulses would be good for everyone.
In the world of computer protocols, we’d refer to this as Poster’s Law: https://en.m.wikipedia.org/wiki/Robustness_principle
Damnit, Postel’s Law.
Logically, aren’t all plurals mass nouns? That would make less appropriate (and more sense).
My only objection to your list is that it reinforces the peeve that data is a plural noun. Only very pedantic statisticians insist on that; the rest of us treat it as singular. (Do you say “I went to the opera last night and they were very good”?)
It’s not just pedantic statisticians: plural data is still fairly common in the academic and scientific texts I edit. So is the singular, which I tend to prefer but didn’t in this instance. I wrote a brief overview of the usage in the early days of this blog.
I would, but only in connection with the performers, not with the work. I wondered why the plural form opera was applied to something that is logically singular, and it turns out to be a Latin irregular noun (third declension) whose plural form was reinterpreted during the change from Latin to Italian as a feminine singular. The musical use of opus is borrowed directly from Latin, and so maintains the Latin singular form.
There are choices and there are choices. ‘Irregardless’ is not accepted by the vast majority of people. ‘Decimate = widespread destruction’ is not accepted by the vast minority of people.
You’re probably right about the general unacceptability of irregardless, though I’ve been surprised sometimes to hear from people I ask that they’re fine with it or even use it. Maybe the silent tolerant proportion is underestimated. Among language professionals its reputation remains low: 90% of the American Heritage Dictionary‘s Usage Panel rejected it in 2012. With decimate it’s more complicated (but did you mean vast minority or vast majority?).
You could equally say that amn’t is not accepted by most people – but it is accepted by just about everyone in Ireland. Correctness depends on context, and widespread disapproval doesn’t imply categorical unacceptability. Many usages are valid in their own niches, and irregardless has a legitimate place in some dialects. It’s not for the majority to tell them otherwise.
I really meant ‘vast minority’ (ie a very minority). I was being flippant. ‘Decimate = widespread destruction’ *is* accepted by a vast majority, therefore is *not* accepted by a vast minority.
*(ie a very small minority)
I can accept ‘amn’t’ as ‘well-formed’ – it has a form of verb BE and a contraction of the word ‘not’ and makes a small group with ‘aren’t’, ‘isn’t’, ‘weren’t’ and ‘wasn’t’. It just happens that the vast minority of English speakers outside Ireland and ?% of people in Ireland don’t say it. ‘Irregardless’, on the other hand, is badly formed – there is (afaik) no other English word which has a negative prefix plus ‘-less’. It happens to be completely unacceptable to some English users, disfavoured by the vast majority and used by a tiny minority.
I think it’s fair to say that I am one step more cautious than you are. If I were writing this list, I would say ‘There is no context in which “irregardless” is a better choice than “regardless”.’
One linguist/language writer/blogger wrote about a similar list and mentioned that s/he’d circulated a preliminary list among family, friends and colleagues. All of them replied ‘What you say is fair, except for one item, which is completely unacceptable’, but then each nominated a different word or usage.
How well-formed a word is doesn’t dictate usage (though it often influences it). That irregardless is a morphological anomaly doesn’t outlaw it from English. Lots of English words and constructions are redundant, even if not in precisely the way irregardless is.
In some dialects irregardless has been used by educated speakers and writers since the 19th century as an intensive form of regardless, not a straight synonym of it. (This nuance, not widely known, is explored by Kory Stamper in her recent book Word By Word.) Caution doesn’t even begin to approach my aversion to telling these people they’re using their own dialect wrong.
‘Regardless’ is universally accepted. ‘Irregardless’ isn’t. Stating that fact does not equate to ‘telling these people they’re using their own dialect wrong’.
Maybe not telling them directly, but it does seem to follow from: “There is no context in which ‘irregardless’ is a better choice than ‘regardless’.”
I think there is much more hostility to irregardless in AmE than in other varieties.
Partly, I suppose, because it’s more likely to be used there than in other parts of the world, and its connotations of rurality or ill-education or whatever (however undeserved) are more to the fore.
Thanks for quoting Pinker’s arguments in defense of gradable unique. My unease with the arguments against it stands out in my mind as one of the first major waypoints in my journey away from peeverdom, way back in my teens and early twenties. However, I was never able to phrase a counterargument effectively the few times I attempted to defend it.
You’re welcome. His argument is more thorough than the excerpt may suggest – he devotes slightly over two pages to it – but that’s the essence.
Late teens and early twenties seems to be a common enough period for escape from peeverdom, if anecdotal evidence is anything to go by.
One thing that’s not exactly related but still eats at me: the use of the word American. We commonly use it as a demonym for the United States. As it turns out, the word originally referred to the Americas in general, and it was through a misinterpretation that it came to refer to the US. In Spanish, americano still refers to the Americas, while estadounidense is the demonym for the US. For this reason, is it more appropriate to use US as an attributive noun when looking for a word to use as the demonym for the United States?
Do you mean how ‘American’ is used as a demonym for people from the United States? Demonym refers to residents or natives of a given place, not (in general or standard usage) to the place itself. There’s good discussion of this at Lynne Murphy’s blog.
You don’t advocate ‘for’ something … do you?
You can. The intransitive use (advocate for X) is over 400 years old, and was used by Daniel Defoe, among others (‘I have thus far advocated for the Enemies’). See the usage note at the AHD.
Indeed, the OED says advocate and advocate for are about equally old, having both landed around 1600.
I thought the peever objection re: “quote” was that it’s supposed to refer only to the punctuation mark, not that it’s not supposed to be a noun at all? Anyway, ¯\_(ツ)_/¯ seems about right.
They complain about both, and there would be some overlap. Critics seem to have started denigrating quote as a noun in earnest around 50 years ago: MWDEU lists Bernstein (1965), Follett (1966), Shaw (1975), Trimmer & McCrimmon (1988), and the AHD usage panel (1969, 1982) among its many detractors.
You appeal to antiquity to rightly justify many of these, yet the generic “man” is sexist and reactionary despite its own great antiquity? Hilarious.
Sometimes other factors, such as sexism, override antiquity. I’d have thought that was obvious, but I guess not.
I tend to agree with all you say, despite the occasional bit where I’m inclined to disagree. The first thing I don’t like is your “S”, where I think that your “leads to mangled” is a gross understatement – it should be “imposes appallingly mangled”. The second is in your U bit on “unique”, where I agree with what you said but am shocked that you don’t realise that the second paragraph you quote from Pinker is so far below his normal capability that anyone who has learnt any serious logic (or indeed any serious physics or mathematics) will, on reading it, immediately realise that he is contradicting his own first paragraph. And the third and last is on X, because X for Christ is a good deal more than a millennium old even in Old English or whatever immediately preceded it – the British Celts were better educated than the Germanic invaders, and saw your X for Christ as X for Χριστός more than half a millennium earlier than that; that’s where the Germanic invaders took it from as soon as they started to take Christianity seriously.
So I have minor disconnects with your story in two places (U and X), and an utterly trivial disconnect in one (S).
“Theodore Bernstein said that this term, being from aviation, should only mean ‘upward mobility’.”
I guess Theodore Bernstein doesn’t realise that aeroplanes can travel really fast (Lockheed Martin/Boeing F-22 Raptor, anybody?), and that’s why the word ‘zoom’ can mean that too. (-_Q)
They travelled really fast even in Bernstein’s day; he was just being Canutish. Nice emoticon, btw.