Category Archives: editing

Currying favour with your readers

Originally published on The Editors’ Weekly

Editing and writing have a lot in common with cooking. For one thing, people come to a text, as to a restaurant, with certain expectations and ideals, and you should satisfy them. You don’t have to give them something completely predictable – especially if you’re in a line more artistic than industrial – but you do want to curry their favour.

That puts me in mind of a recipe in the Larousse Gastronomique, 1977 English edition: “Chicken curry (Plumerey’s recipe).” The listed ingredients are two chickens (cut up), butter, 500 grams of diced uncooked ham, a tablespoon of flour, light veal stock to moisten, a bouquet garni (a standard French seasoning made of a bundle of herbs), and two teaspoons of curry powder.

I don’t think you’ll be served that recipe at any restaurant today. It would seem weirdly out of place (and just weird) in a French restaurant, and it would get the chef in an Indian restaurant fired. But there was a time when French cuisine was considered by many to be the apex of the culinary world, and anything you might eat could be “improved” by a French touch. Even curry.

Likewise, there was a time when a single standard prevailed throughout most of literature. Even if a given work didn’t meet that standard, it was understood that that was what it was aiming for. Certain things were simply infra dig, my dears. Other standards were sub-standard. It was important to show you had the right sort of education.

That time is past. Just as we no longer consider French ingredients and techniques the basis of all the best food, we – or many of us, anyway – are now wise enough not to think that starchy formal English is necessary or even appropriate everywhere. There are, alas, still some people who believe that an overarching consistent adherence to a single standard is the goal of writing and editing. If a writer aiming a rambunctious piece at an informal audience puts “There’s a couple things you should know,” such an editor will tut-tut and change it to “There are a couple of things you ought to know” – or “a few things” if there are more than two. Never mind that that changes the flavour completely; somehow, a palate that can’t taste the difference is supposed to be better.

And perhaps such an editor would be pleased to be served a curry cooked to the standards of Carême. For everyone else, let’s use appropriate ingredients and techniques. English – like any living language – has a multitude of styles suited for different contexts and people. When we recognize that and work with it, we aren’t letting go of rules, we’re choosing which rules to use to suit the occasion. When people come to a French restaurant, give them the best French cuisine, sure. When they come to a chain restaurant, give them a consistent demotic product. And if they’re after good barbecue, or tortellini, or nuer pad prik, or vindaloo, leave Larousse on the shelf.

frontline

Who is at the frontline of language change?

Sorry, should that be front line? I think it should. If you are at the foremost front, you are at the forefront, not the fore front, but the line that forms the front of a battle – or, more figuratively, any other advance (especially in conflict situations) – is the front line, according to, well, every dictionary you look in.

But a lot of those dictionaries have frontline too. Not as a noun, though: as an adjective. Staff who are dealing directly with customers are front-line staff or frontline staff. If they were written as front line staff, there could be confusion over whether they were line staff at the front rather than staff at the front line. So we hyphenate. And, over time, as with this adjective, we may merge.

Or we may not. Mergers happen sometimes and not other times. You can be a healthcare professional working in health care – or working in healthcare, because that noun has a closed-up version now too. And you’re reading this on a website – or, to be old-fashioned, a web site. You can have an ice cream float or an ice-cream float, but if you have an icecream float, you risk having some pedant with a marker draw a couple of lines to indicate that there should be a space there, implying that you need grammatical trainingwheels.

The front lines, in language change as in war, are very uneven, meandering up and down and in and out, and the main thing that keeps them from moving is just that they get really entrenched (yes, when you think about it, front line and entrenched both call to mind the ghastly battles of World War I – both predate it by centuries, but both have military origins).

So… could frontline become the noun form too? Some people want it to be – a colleague mentioned to me that one of the people he works with is pushing for that change in their published text. Mind you, his coworker isn’t saying “I know that front line is standard, but I think we are making a good move forward to close up this compound. We may be in the, erm, vanguard, but we can take the fire.” No, his coworker is saying “I looked in the dictionary and it has frontline as a form so I’m going to use it everywhere.” His coworker is heedless of the noun-adjective distinction.

Which is how language change so often happens: reanalysis, or what members of preceding generations tend to call a mistake. The English language isn’t really an ongoing battle – if there is an enemy, they are us. It’s more like a complex game that gets passed on from one family to another; it doesn’t have a rulebook, and each new group of players picks up a few things from the previous players but mostly figures things out for itself, resulting in some shift of the rules over time. We hear our parents talk, and we work things out for ourselves, and they don’t correct all of our reconstruals.

So, yeah, you could say that the front line in language change is the battle between the older generation, wanting to preserve what it knows, and the younger generation, wanting to do what suits them best. But from another perspective, the battle is as much like explorers having to put up with previous people – who didn’t get as far – shouting at them “No, you fools! You’ll fall off the edge of the planet!”

Fine, fine. The question remains: is frontline taking over from front line as a noun? Is it heading the way of healthcare and forefront? Will we soon see not only the frontline but the frontlines just as we see the headlines? Or is it like icecream and trainingwheels? Let’s have a look at a Google Ngram:

[Ngram chart: frontline_NOUN is way below front line_NOUN and both adjective forms, and not gaining very much.]

Hmm. Nope. Anyone who uses frontline as a noun is going to be awfully far in front of everyone else, exposed and prone to being shot at… from behind. And the general usage may not ever come close to catching up. It looks pretty well entrenched.

Addendum: I neglected to consider one important vector for change in this. Google ngrams are case-sensitive, and I only surveyed lower-case. But take a look at this:

[Ngram chart: Frontline vs. frontline]

So Frontline is increasing in usage much more than frontline. Why is that? I’ll tell you one reason. Since 1983, PBS has had a documentary series called Frontline. TV shows are important vectors for language change.

But that doesn’t mean the branding of the show is spreading the one-word noun throughout the language rapidly. A brand is a brand and may stay as such. Let’s put this in perspective:

[Ngram chart: Frontline in perspective]

After all, it’s on PBS, not NBC, ABC, or CBS. Public broadcasting is at the front line of knowledge, but most people don’t actually like to get too close to the front line. At least not intentionally.

Whoever is the subject?

Who will inherit the investigation?

Oh, whoever will inherit the investigation?

Whoever will inherit the investigation, he will be someone Mr. Trump nominates.

Whoever will inherit the investigation, Mr. Trump nominates him.

Whoever Mr. Trump nominates will inherit the investigation.

Wait, says the writer. Mr. Trump nominates him. So it must be whom. Whomever. And so, in The New York Times, appears this:

Whomever Mr. Trump nominates will inherit the investigation.

Because formally correct. So whom. Yeah?

Nah. Hyperformalism.

Of course cases like this bedevil writers. The construction is complex and whom is not part of standard daily English; in effect, it is a foreign word for most of us. Wherever we think it might be appropriate for formally correct speech, we are tempted to slip it in, sort of like how some people stick -eth on every conjugation when they want to sound old-fashioned. But sometimes we go overboard and use it where it doesn’t belong.

When people write sentences like the one in question, the rule they’re turning to is that the object must be whom, not who.

The rule that they’re forgetting is that every verb must have a subject.

What’s the subject of will inherit?

It has to be whoever, because whoever else would it be?

One loophole that writers miss – one that would resolve some grammatical dilemmas – is that a whole clause can be an object, as in “Mr. Trump will nominate {whoever gives him the most money}.” Another loophole they miss is that the subject or object of an embedded clause can be made to disappear by what linguists call moving and merging, leaving just an embedded trace (one we know exists thanks to psycholinguistic experiments). That’s what goes on here. The him in Mr. Trump nominates him gets tossed like a baseball in a double play back to the Who, and the catcher’s mitt on the Who is ever. (It can also be an emphatic, as in “Oh, whoever will help us?” – but it’s not one here.)

Look at “Who(m)ever Mr. Trump nominates, he will inherit the investigation.” (I put the m in parentheses because if you use whom as the object you would use whomever here, but in normal non-prickly English we use whoever as the object too.) Notice that you (almost certainly) wouldn’t write “Who(m) Mr. Trump nominates, he will inherit the investigation.” The ever sets up a second reference, the he. It can also set up an object (him): “Whoever gives the most money, Mr. Trump will nominate him.” (All of this works with she and her too, but we can see that Mr. Trump does not work with very many shes and hers.) So the ever can refer to an object while attached to a who that’s a subject, or the converse.

Our sentence du jour, however, is not derived from “Who(m)ever Mr. Trump nominates, he will inherit the investigation.” Not quite. In “Whoever Mr. Trump nominates will inherit the investigation,” the main verb of the sentence is clearly will inherit (will is the auxiliary that takes the actual inflection, and inherit is the infinitive that conveys the sense); the subject of will inherit is Whoever, as already pointed out. Mr. Trump nominates is an insertion – a subordinate clause modifying Whoever. By itself it would be Mr. Trump nominates him, but, as I said, the him is tossed back and caught by the ever.

Let’s diagram that like a good linguist, shall we? This is the fun part! Syntax trees have details that non-linguists will be unfamiliar with, so let me set down a couple of basic facts:

  1. A sentence is a TP, which means tense phrase – because it conveys tense (when the thing happens), not because it’s too wound up. The heart of it is thus the part that conveys when it happens: the conjugation on the verb. The verb phrase (VP) is subordinate to that, but it merges with it unless there’s an auxiliary verb taking the tense.
  2. A subordinate clause is also a TP, because it has a conjugated verb, but it’s inside a CP, which means complement phrase, because it’s a complement to something else in the sentence. Often there’s a complementizer, such as that or which, but not always.

So.

The subject is Whoever. Because in English conjugated verbs (except for imperatives) have to have explicit subjects and they have to be in the subject (nominative) case, this can’t be Whom or Whomever. The tense goes on will. The verb is inherit. The object of that (its complement) is the noun phrase (NP) the freakin’ mess – sorry, the investigation. (I haven’t broken that down further, but actually it’s a determiner – the – and a noun.) The complement of Whoever, by which I mean the subordinate clause that describes who the Whoever is, is Mr. Trump nominates [him]. The him is tossed back to the ever.
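Here it is as a linguist might type it, in the bracket notation of the LaTeX qtree package – a minimal sketch of my own (full X-bar detail omitted; t marks the trace of the tossed-back him):

% Requires \usepackage{qtree}.
% Simplified tree for “Whoever Mr. Trump nominates will inherit the investigation.”
% The subordinate CP hangs off Whoever; t is the trace of “him.”
\Tree [.TP
        [.NP Whoever
          [.CP [.TP Mr.~Trump [.VP nominates t ] ] ] ]
        [.T' will
          [.VP inherit [.NP the investigation ] ] ] ]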

Whoever Mr. Trump nominates will inherit the investigation.

Whoever will inherit the investigation?

Who will inherit the investigation?

He will inherit the investigation.

(Mr. Trump nominates him.)

So why doesn’t the NYT version instantly sound bad, as “Whom will inherit it?” would? It’s a more complex and unfamiliar construction, and what we tend to do in such cases is go with the salient rules we can remember and basically make up rules to make the rest work. For people who don’t balk at “Whomever Mr. Trump nominates will inherit the investigation,” I believe what’s probably going on is that it’s an underlying “Whomever Mr. Trump nominates, he will inherit the investigation,” and the he is getting tossed back to the ever. So you have a trace of the subject rather than the object. Now, you can have a trace of a subject when you have more than one verb conjugated to the same subject – “Whoever gets the nomination inherits the investigation” – but it’s not normal formal standard English for a subject to be deleted and merge with an object that is not deleted. We need the subject!

But then, really, whoever speaks formal standard English all the time? Well, not whoever wrote that sentence, anyway, or it wouldn’t have been written, because it would have sounded wrong.

What’s logical about English?

A common complaint about English – by those who are inclined to complain about English – is that it’s not logical enough. Whatever that means. Words aren’t premises and sentences aren’t syllogisms, after all.

If you inspect the targets of their opprobrium, you find soon enough that what they mean is that English isn’t tidy enough for them. It’s inconsistent. Lacking in symmetry. Their experience has led them to believe that for every up there should be a down, for every in an out; when they see an over, they think “therefore under,” and if there is no under, they are… underwhelmed.

They’ve condemned themselves to a lifetime of disappointment. English does not satisfy their need for an overarching tidiness. It is not a Zen garden; it is a forested mountain, every tree grown unplanned in its place and conditions, every rock where the ineluctable complexities of physics left it. It is not an edifice of modernist design with proportions based on the golden mean; it is a Winchester House of a language, a veritable Heathrow Airport of accretions (for those who have not been to Heathrow, let me just say I suspect that J.K. Rowling based Hogwarts on it). Like any natural language, English has been built up by habit, need, association, and analogy. It does have structure – in fact, it has some inflexible syntactic requirements. We have slots to fill, and fill them we do. We just sometimes grab whatever’s ready to hand to fill them.

Let’s consider a few examples. One case where a desire for logic has actually prevailed is “double negatives.” Anyone who has studied logic will tell you that in “not not” the second not undoes the first one. “There will not be cake” is disappointing; “There will not not be cake” is affirming. Thus, the reasoning goes, “I do have nothing” and “I don’t have nothing” are opposite. But anyone who has learned a Romance language ought to know ça ne vaut rien, no vale nada – that ain’t worth nothing.
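For those who like their logic in symbols, the principle being invoked is double negation elimination – one line of standard propositional notation:

\lnot\lnot p \equiv p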

Nothing, you see, is not not. It’s a noun, not an operator. And one thing languages like is agreement. Concord. Adjectives tend to take the same gender as the nouns they modify, for instance. In English, we use concord with tenses in some contexts: “Should we expect them tomorrow?” “They said they weren’t coming.” Notice how we use weren’t even though we’re talking about the future? We even let negative concord pass unremarked in some contexts: “They won’t be coming, I don’t think.” This doesn’t mean I don’t think they won’t be coming; it just retains the negative aspect.

But since it’s possible, with shifting emphasis, to make “There ain’t no one here” mean opposite things depending on which words you stress, an argument can be made for disallowing negative concord for the sake of unambiguity. So the proscription stuck, defended by pleas for logic – although “if negative noun, then negative verb” is perfectly reasonable if that’s the rule in the language.

Syntax has its requirements – as linguists would say, there are principles and parameters that specify how it functions in a given language. Negative concord is one parameter we have managed to turn off. Others are not so easily disabled. It’s necessary to have an explicit subject (except in imperatives), for instance; I can’t write “Is necessary to have an explicit subject,” so I stuff in an it that has no meaning. It may not seem logical to have a pronoun with no referent, but consider that, from the view of our syntax, “if it’s a sentence, then it has a subject” is solid. Sometimes we grab and stuff on the fly – we may jam a word in the place where a word like it normally goes, even if in this case it’s a whole nother thing and what even were we thinking? This, too, comes from a simple if-then – just a little simpler than it might have been.

Another plea for logic comes when a word is pressed into service in a way that seems untidy. One I saw recently was an objection to using disconnect as a noun, as in “There is a real disconnect between the labourers and the management.” We don’t say “There is a connect between them,” we say connection, so it’s illogical not to say disconnection. Indeed, this is untidy, in the same way as it’s untidy that when my wife is at home I heat two servings of food and pour two glasses of wine, but when she’s not at home I heat one serving and open a beer (or go out for sushi). But our little untidinesses have reasons: my wife doesn’t drink much beer and doesn’t like sushi. And disconnect is an allusive use borrowed from electronics and telephony.

A line of communication is expected to remain connected, so there is no instance where we would say that it has experienced a connect. We grabbed a bit and stuck it where it fit, and in so doing made a metaphorical connection. There’s no need to construct a symmetrical positive use any more than there is a need for a 33-storey building to have 33 levels of basement. And there’s no need to disallow allusions just for the sake of tidiness – we don’t forbid lights on Christmas trees just because there are none on the house plants. If you want to make a connection, you make it; if you don’t, you don’t. That’s logical, no?

Some people also like to laugh at how “illogical” English words are. “Why do our noses run and our feet smell? Why do we park in a driveway and drive in a parkway? Why do we say a bandage was wound around a wound? How come you can object to an object?” OK, now tell me why these are illogical.

Every one of them comes from a well-motivated historical development founded on consistent principles: metaphor, ergativity, historical sense developments and standard compounding rules, phonological shifts, stress-based differentiation of nouns from verbs. In every case there was an if-then judgement based on analogy. It just happened not to be exactly analogous to some other if-then judgements, and it produced results that seem inconsistent when juxtaposed. I think that’s fine – why not have funny things? But more than that, it’s not even illogical. In every case, we got to it from “if A → A′, then B → B′.” They just happened to be local judgements made in the context of a big, multifarious, inconsistent world.

But it would be illogical to treat a multifarious, inconsistent world as though it were elegant and pervasively consistent, wouldn’t it? It certainly wouldn’t be well adapted. It would be like laying down a strict grid street plan for a very hilly city (and San Francisco knows how well that worked out). It wouldn’t be as much fun, either. And it might do real harm.

croon

The lexis of our language is like a coral reef, full of wonders rich and strange. And, as with coral reefs, one of the threats to its diversity is bleaching.

Coral bleaching is a result of coral shedding algae due to rising sea temperatures. Semantic bleaching is a result of words shedding distinctions of meaning due to overuse and over-broad use – what one might call thesaurusitis: treating all words in a section of Roget’s as fungible. The words are still in use, but they lose much of their distinction of sense, thereby reducing the expressive power of the language. And modern electronic media can amplify this effect.

Expressive power and electronic amplification have much to do with the word of which I sing today, croon. It’s not a new word; it dates back half a millennium in Scotland and northern England, where it has for that long had the sense of a low sound, either (particularly in Scotland) a low, deep, loud, steady sound, or (more broadly) a low murmuring sound or soft quiet singing. It sounds like it should mean what it means. In singing, it’s the opposite of belting. Belting is the kind of singing you do when you have a noisy room and poor (or no) amplification. Once you have a good microphone and good speakers, you can draw the audience in with quiet, smooth singing. You can croon.

Which is what happened in jazz in the late 1920s onward. The first truly famous crooner was Rudy Vallée. His soft voice crept like a lover into millions of living rooms through the radio. Many others followed; the one probably most often thought of as a crooner was Bing Crosby. They all had a gentle, quiet singing style that worked closely with the microphone.

But as one technology giveth, another helpeth take away. Newspapers are written by people who are trained to be allergic to repetition. They seek different words for the same thing to make their prose seem more varied and expressive. We can never forget that a pumpkin is a gourd thanks to them; we are given the idea that every promise, solemn and formal or not, is a vow; food writers talk of people munching even the softest, smoothest, quietest foods (ice cream? oatmeal?). And every act of singing may be called crooning.

It’s not that the word is always used over-broadly; it has not been utterly bleached. But a quick look through recent New York Times articles finds gospel choirs “crooning” more than once and a jazz singer who “crooned over the trio, belting the 1941 Duke Ellington classic” – yes, crooning and belting as the same act. Even non-singers, we are told, have sometimes “crooned”: soccer fans, en masse in a stadium, “crooned” “We’re not going home”; Donald Trump, at a rally, “crooned” “I love you! I love you!” to his supporters. Every one of these uses is amplified to an unlimited number of eyeballs through the wonder of the world wide web.

As a linguist, I can look at this and just write it down as instances of semantic broadening due to an evident desire for more expressive-sounding vocabulary (with the likely long-term effect of reducing the expressive value it draws on). As an editor and user of the language, however, I would rather resist it, because it ultimately reduces the expressive power. And there can be quite a lot of expressive power in the soft, quiet, focused, and amplified sound of crooning.

populism, populist

Language change is, generally, organic. It usually doesn’t happen by fiat (especially in English); it also doesn’t happen by vote. There may be some influence from “above” by people such as English teachers, but that mostly affects what rules people think they’re breaking when they’re speaking the way they want to speak anyway. You could say that language change – grammar, the meanings and pronunciations of words, and so on – happens by mass popular movement.

Which is not to say that it’s populist. Populism is a political stance that advocates for the people – the general populace, hoi polloi – in opposition to a ruling élite. Language change is not the product of a program leading a movement of the populace in opposition to English teachers, editors, and others. There could be such a movement, of course, but de facto language change happens by popular will anyway. We’re all part of it, not just the people at the top. (The situation can at times be different for languages with official deciding bodies, such as in France.)

So, you could reply, if we all decided that the meaning of populism should shift to ‘following the will of the people as with a general tide, without a specific political program’, then that would be the meaning. Well, yes, if that sense shift happened and ultimately overcame opposition. The change would probably take quite a while and not be without some controversy. But it could happen.

But there are shifts that we have every right and reason to resist, no matter how many people use the new sense. We are all streamkeepers of this flowing language. When a word is being used as a euphemism to let something slide that should not, or when its use carries implications that have negative consequences, we should not let these pass unnoticed. If we speak up and point out the problems, we may help these shifts to become unpopular.

So, for instance, for a long time there was a default assumption that people in certain roles were masculine, and so he and him were used. Around the time that such assumptions started to be a bit less tenable, a common line from prescriptivists was that he and him were the natural universal gender-neutral pronouns. (Poor men, having to sacrifice the uniqueness of their pronoun! Ah, such sacrifices must be made.) At long last enough people pointed out that this did, in fact, convey the default assumption of masculinity – words carry resonances and implications whether you say they should or not – and so use of masculine pronouns as a universal has lost general acceptance. (Read more about this in my article on they.)

Which, I suppose, you could say was a populist movement – the neglected masses against the prescriptive authorities – though it was in particular a movement by and on behalf of the more neglected moiety of the populace, to influence the more dominant segment and thereby produce a fairer outcome for all.

But now let’s say that people start using populism to refer to a movement focusing on the desires not of the whole population but of a minority of it who consider the remainder to be of lesser status. Say, for instance, that there is a group we’ll call X in the population, and they feel that the government has been giving too many rights to that larger part of the population that is not X. This group has traditionally been the group that, for all its internal differences, has been ceteris paribus the more-advantaged group, and they’re seeing non-Xes get similar rights. This doesn’t involve the loss of any rights from X – unless you consider it a right to have things that other people you consider inferior can’t have. If some political leader or party rallies members of X against the government just so they can protect their perceived right to have more rights, would you call that populism? Would you accept seeing it called populism? When the movement is for the rights of not all the populace but just a subset of it, and strongly against the rights of others?

This is not a hypothetical question. I’m seeing populist used quite a lot by news media and the commentariat for racist, nativist, frankly sexist and reactionary movements. In countries across Europe and at least one in North America, leaders who advocate or enfranchise not just xenophobia but racism and sexism are being called populists, and the reactionary groups that support them are being described as having populist sentiment.

Which implies that women and non-white people are not part of the populace, or anyway are not relevant parts of it. In spite of being, in sum, the majority. And, for that matter, it also implies that white men are, en masse, in favour of such movements. Which is also not true.

I think we owe ourselves and everyone else a duty to make this use of populist and populism unpopular.

Calling them what they want

This article was originally published on The Editors’ Weekly, the official blog of Editors Canada.

We’re all professionally attentive to detail, so I’m sure we all appreciate that, having earned a PhD, I am technically Dr. Harbeck, and it could be rude to call me Mr. Harbeck. My wife, having a master’s, is Ms. Arro — not Miss Arro, because she’s married, and not Mrs. Arro, let alone Mrs. Harbeck. Letters addressed to us as “Mr. and Mrs. Harbeck” will be received as uninformed or rude, depending on who they come from.

Now, if I were a judge, you would call me “Your Honour”; if I were a lord, I would be “sir” or “my lord”; if I were a king, I might be “Your Majesty.” When we refer to politicians, nobility and high-ranking ecclesiastics, we have to make sure we include, as appropriate, “the Right Honourable” or “His Eminence” or whatever. We’re in the business of calling people the right thing: the title to which they are entitled.

Or calling them what they want to be called. Even non-editors know it’s rude to call someone something they don’t want to be called. We don’t call Sir Edward “Eddy baby” unless he asks us to. We also don’t call people who have changed their names by their old names, especially if their identity has changed. We don’t call Chelsea Manning “Bradley Manning” or Caitlyn Jenner “Bruce Jenner” (although we may use that name historically, for instance in stories on the Olympic Games).

We don’t always call people by names and titles, though. Sometimes we just use pronouns. There are languages (such as Turkish and Finnish) in which the sex of a person makes no difference in the pronoun, but English is not yet one such. Since the binary distinction is an unnecessarily restrictive imposition, the singular they is gaining currency (since number sometimes is relevant, however, expect to see they-all becoming popular in its wake). But some people do want to use pronouns for gender presentation. There are a few different pronouns in use, not just he, she, and they, but also others such as zey. But not nearly as many as there are honorifics, let alone names.

And yet, some people — even ones apparently capable of attaining and requiring “Doctor” before their names — find it beyond endurance to have to keep track of these pronouns. They deride it as silly faddism or political correctness — terms of abuse for people who refuse to stay in the boxes you have made for them. They can manage to remember who is Mr., who Dr., who Your Excellency; they can get a grip on who is Alex, who Sandy and who Alexandra; but keeping track of pronouns is just too much for them.

Of course it’s not really. They just don’t want the dominance of their paradigm challenged.

As editors, we like to ensure adherence to chosen sets of arbitrary standards. But we also like to check our facts and get the myriad nice details right — such as what pronoun a person has asked to be called by. It’s not all that difficult, and it’s good manners, too.

In the original article, I didn’t include some further remarks on “freedom of thought,” which was a line taken by a professor who is a prominent opponent of following people’s choice of pronoun. But I would like to add them briefly here:

In the case of the professor in question, it’s obvious to onlookers that he’s incensed at having to defer rather than always be deferred to; it threatens his freedom of thought only inasmuch as it makes it difficult for him to maintain his hyperinflated self-estimation. (He has been heard to lecture women on the purity of his feminist bona fides. Not really the cuttiest butter knife in the drawer, this guy.)

But just to address the broader question: If you are of the opinion that strict nativist two-valued gender normativity is the only truth, I assure you that using requested pronouns will not force you to think otherwise. You are still able to think such things. If you are concerned about your reputation, lest you be mistaken for someone who respects others’ choices of gender identities, you are still free to make it clear that you are actually quite rigid in that regard, and are conforming to university policy out of respect for civility. You are even free to think that civility is stupid; your freedom to be a jerk in your mind is not impaired by a requirement to act nice. Most of us are jerks in our minds more often than we are in our words and deeds.

For a parallel: We can’t force people not to think racist thoughts (though we can do what we can to encourage them to revise their views), but we sure as hell can require them not to say racist things. Especially within the ambit of an educational institution, for instance. Part of existing in a civil society is agreeing that, however little you may like or agree with some people, you must at least recognize that they have certain rights, which must necessarily be extended to all for the functioning of society. One of which is to be treated like a human being, and not as something less due to some intrinsic part of their person.

See? You can think whatever you want. But you act in a way that shows the required acknowledgement of others’ humanity. This may threaten your freedom of thought if it interferes with your holding the view that you are already being more than accommodating enough for these people, or forces you to confront the possibility that, in spite of what you tell yourself, you do not view everyone equally. But I do not think freedom from having your thinking challenged is a freedom worth fighting for.