I’ve written a couple of times before about why I support renaming Columbus Day. Yes, I’m a pasty-white-skinned, blue-eyed guy whose ancestors came from places like Ireland, England, and France, but I recognize that I only got to be born here because a lot of horrible things were done to the native peoples, including driving them off the land.
And don’t get me started on the claim that the European invaders just had better technology and the land was underused. Get yourself some history about the pre-colonial Piedmont Prairie and Forests, which were maintained by multiple native tribes who did controlled burns and crop rotation in some portions while carefully leaving other portions alone, so that a huge number of species of plants and animals (including a species of woodland bison) could thrive there. The European colonists made land-sharing deals with native tribes… and then decided to ignore their own deals and, through encroachment, clear-cutting, dam-building, and the occasional outright slaughter, drove the indigenous people away. And also drove a bunch of species into extinction.
And if you’re the sort of person who uses “illegals” as a noun and yells at anyone with dark skin, or a non-European name, or who just disagrees with you politically to “go back where you came from!”, I have to say, “You first.” Until then, shut up.
Other people have written a bit more about the historical reasons we rename the day and why Columbus isn’t a hero. And since some of them are Native, you should read what they have to say on the topic.
Lots of people repeat a very bad translation of an ancient proverb, thinking that it is rude or crass to say anything in the slightest bit negative about someone who has died. But that isn’t what the proverb actually meant in the original language. It didn’t say never to speak ill of the dead; what it actually said was, “Of the dead, speak nothing but truth.” Don’t tell lies about the dead, but there is nothing wrong with saying truthful things that are less than flattering.
So, I am not here to say false things about John McCain, I am here to speak truth, a truth that absolutely contradicts most of the stuff people are trying to say about him.
First of all, he was not a maverick. He was not a loose cannon who stood up to President Trump. He said some things that condemned some of Trump’s worst lies and distortions and most hateful statements, and then he turned around and, in every case except two, voted in favor of the evil, hateful laws that Trump wanted and the corrupt, unqualified people Trump nominated.
And this is something that McCain did for his entire political career. At certain strategic moments he would verbally disagree with some of the most extreme statements of his fellow Republicans, but then nine times out of ten he voted in favor of the very policies that people think he opposed.
As an example of this theme, let’s look at the Affordable Care Act, often called “ObamaCare” (though a more accurate name would be RomneyCare, because it was virtually identical to the health care system that Mitt Romney signed into law when he was governor of Massachusetts). Many people like to focus on McCain’s dramatic vote against the attempted repeal after Trump took office. But that ignores the six years prior to Cadet Bonespur occupying the White House. McCain opposed the law when it was initially proposed. He voted to repeal it more than 50 times. He bragged about voting to repeal it. He mentioned his opposition to it in numerous re-election campaign ads. He fundraised for both his re-election campaign and multiple Political Action Committees on his pledge to repeal it. For a bit more than six years, McCain was opposed to Obamacare.
There are people who try to spin his decision to switch sides and stop the repeal of Obamacare as the result of newfound compassion due to his own health care crisis. For one thing, his statement when he cast the vote doesn’t support that interpretation. He said he was opposed to repealing it without going through proper hearings about the impact of the repeal.
And think about what was going on. His own constituents (and thousands of other people outside Arizona) were calling his office and begging him to spare their lives. Voters were begging for the health care coverage of their loved ones. They were begging. And they had been for some time. Every time the Republicans had brought up repeal before, the devastating cost had been explained, including the tens of thousands of people who would die needlessly in the first few years after repeal. They had the facts and figures. They knew what it meant.
And John drew it out dramatically until the last moment, swooping in with maximum press attention to save the day.
It was the moral equivalent of holding a gun to the head of someone’s sick grandmother or child, saying, “It would be a shame if something happened to them,” then pulling the gun away at the last moment and saying, “All right, I won’t kill you today.”
And he expected to be treated like a hero for not pulling the trigger.
The primary way that a senator influences policy is with their vote. And if you look at John McCain’s voting record, it does not paint a picture of a hero. He opposed gay rights at every opportunity. He voted against adding sexual orientation to the list of protected classes for anti-discrimination laws and hate crimes. He voted against the federal government recognizing civil unions or marriages of queer people if states enacted them. He voted against allowing queer people to serve openly in the military. None of those votes makes him a maverick among Republicans. And they show a clear bias against my rights under the law.
Another way a senator can influence policy is by deciding which party to caucus with. No matter what party the senator belongs to or was elected under, they can choose to caucus with either party. Doing so changes who makes decisions about what is voted on and when. If John McCain had truly been opposed to Trump’s policies, he could have caucused with the Democrats. It would only have taken three Republican senators doing that to stop most of Trump’s agenda in its tracks. That would have been the action of an independent-minded senator putting loyalty to country ahead of party.
He didn’t do that, despite the fact that many constituents were writing and calling his office, begging him to do so.
John McCain served his country for most of his adult life. He served in the Navy as a pilot during the Vietnam War until he was shot down and captured. He spent a long time in a prisoner of war camp and was tortured. I don’t dispute his service or the patriotism he displayed at that time. I’m not one of the crackpots who try to claim he was a war criminal or traitor because of some of his actions while being tortured.
While he was a prisoner of war, his wife was in a horrific car accident. She required 26 surgeries over a six-month period to recover. Once she was able to leave the hospital, she needed assistance to walk, but she resumed caring for their three children. Six years after returning from Vietnam, McCain started an affair with a much younger (and wealthy) woman. He divorced his wife, moved to Arizona, married the younger woman, and then started campaigning for Congress. It has always amazed me how the party that embraced the Moral Majority and calls itself the Family Values party embraces men who cheat on their wives, leave those wives for younger women, and then insists that those men are honorable and upstanding.
Yeah, life is complicated and people are imperfect. I’m not saying that he was a monster. But he wasn’t a hero in his political life. He voted for and enabled racist, sexist, homophobic, and transphobic policies. He enabled a corrupt and probably treasonous administration to push this country a long way toward being a fascist autocracy. And he wasn’t a hero in his personal life. He was a man. Not a great man, merely a man.
“Woe unto you, scribes and Pharisees, hypocrites! for you are like unto whitewashed sepulchers, which indeed appear beautiful outwardly but are within full of dead men’s bones, and of all uncleanness.”
I need to do a bit of a follow-up to my previous post about the issues at Worldcon. I didn’t touch on everything that happened, and since the issue blew up, Mary Robinette Kowal, whose tweet from years ago on a related subject I quoted in that post, has agreed to help redo the programming. Kowal has been running the programming tracks at the annual Nebula conferences for a while, and she has posted a nice summary of their process for trying to put together a program that appeals to many parts of the community. So many of us are provisionally hopeful that the situation will be a bit better at the actual convention than it appeared just days ago.
I have also been reminded that sometimes it is difficult to tell the difference between ignorance and actual malice. Now, I was thinking that most of the bigotry that seemed to be motivating the issues was likely unconscious—all of us are often unaware of just how many prejudices we have absorbed from society. Alis Franklin, in particular, has pointed out another explanation for much of the problem:
“This all feels very much like people used to running a small-town parochial con with an established member-base suddenly getting in a twist because they have to accommodate (gasp) outsiders.”
And she’s likely on to something. A lot of this does sound like the people in programming are speaking from their past experience running their local convention, where they believe they know their audience and what those attendees expect. But even if that is the case, I still suspect that their local crowd includes a lot more queers, people of color, and other folks who are interested in topics that their local con doesn’t recognize in programming—because as I said, we’re everywhere, and we’re all used to being excluded and dismissed; so much so that when we raise an issue and are shut down, we often just hold our tongues thereafter.
On the issue of the one pro whose submitted bio was edited to change all of eir pronouns to “he” and “him”, and the insistence for a few days that this was a bio taken from the web (when no one can find such a bio and the con can’t provide a link), that gets into the question of conscious versus unconscious bias. Either the person who copied the bio was simply too ill-informed about non-binary people and nontraditional pronouns, and assumed it was some kind of extremely consistent typo (which I think is a stretch), or they’re one of those people who balk at pronouns to the point of refusing to use any they don’t agree with, and decided to change the bio and then claim it was a mistake if they were called on it.
I don’t know if the same staffer is the one who decided not to use another pro’s usual publication bio and photograph, and instead wrote a different bio using information that usually is not released publicly and used a photo taken from the pro’s private Facebook. In any case, it is difficult to construct an “honest mistake” excuse for that one. And if it is the same staffer, I think that is more than adequate proof that the changing of the pronouns in the other bio was an intentional aggression.
In several of the discussions online I’ve seen a lot of people not understanding what the problem was with requesting semi-formal wear for the Hugo ceremony. Foz Meadows summed it up better than I did:
”…the fashion at the Hugo Awards ceremonies tends to be a welcoming, eclectic mixture of the sublime, the weird and the comfortable. Some people wear ballgowns and tuxedos; some wear cosplay; others wear jeans and t-shirts. George R. R. Martin famously tends to show up in a trademark peaked cap and suspenders. Those who do dress up for the Hugos do so out of a love of fashion and pageantry, but while their efforts are always admired and appreciated, sharing that enthusiasm has never been a requisite of attending. At an event whose aesthetics are fundamentally opposed to the phrase ‘business casual’ and whose members are often uncomfortable in formalwear for reasons such as expense, gender-nonconformity, sizeism in the fashion industry and just plain old physical comfort, this change to tradition was not only seen as unexpected and unwelcome, but actively hostile.”
I also note that a few days ago Mike Glyer posted a link to a letter from decades back from E.E. “Doc” Smith (the author of the Lensman books, among others), written when the 1962 WorldCon asked all the ladies attending the award ceremony to wear long formal gowns. Smith commented that his wife had not owned formal wear since entering retirement, and he thought it was unreasonable to expect people to go to such an expense.
Which is a nice segue to this: until the 34th WorldCon (MidAmeriCon I, 1976 in Kansas City, Missouri) the Hugo Awards were given out at the end of the convention banquet. The banquet consisted of eating (obviously) while the guests of honor gave speeches. Fans who couldn’t afford the extra expense of the banquet were allowed in (usually in a separate area such as a balcony) for the awards portion. The awards ceremony was separated from the banquet in 1976 for a couple of reasons, but one was to make it easier for everyone who wanted to attend to do so. The conventions had gotten so large that the fraction of members who wanted to see the award ceremony was too large for the banquet halls of typical convention hotels to accommodate, and there had always been the problem of people who couldn’t afford the banquet ticket. I wanted to close with that because I have seen a number of people arguing that the people who are feeling unwelcome because of this con’s actions are making unreasonable demands to change the traditions of the conventions.
The traditions change over time for many reasons. It isn’t about change for the sake of change; it is change for the sake of practicality and realism. People have, in the past, believed that science fiction and fantasy were only created by straight white guys, and only loved by other straight white guys. That has never been true, but the illusion was maintained through a variety of societal forces and some willful ignorance. It has become increasingly difficult to maintain that willful ignorance, and besides, ignorance is never a good look on anyone. It’s not about whether fandom is diverse, it is about the lengths to which some people are willing to go to ignore, silence, or push out that diversity.
As for those other folks, the ones who whine and rage about the new movies: I just assumed they were closer to the median age of the typical internet user. Their first exposure to Star Wars had been to see it on a TV at home, possibly when they were too young to remember it now. Whereas I saw it as a great movie that changed the way the genre was perceived and created a seismic shift in all of pop culture, to them it had always just been there. And they had been too young to understand that the word “empire” was inherently political, just as the phrase “rebel spy and a traitor” was also inherently political.
Oh, how naive I was just a few years ago. I hadn’t realized that the problem was much deeper than that.
Before I go on: a few other people have examined a couple of the issues at hand in depth, and rather than try to reconstruct the same analysis, I’ll just say you should go check these out:
The latter post, by Aaron Pound, is extremely helpful in this discussion, if for no other reason than the two tables showing how the box office of all the movies in the Star Wars franchise has done, comparing them to other franchises (expressed in millions of dollars):
Please note: when adjusted for inflation, the original Star Wars made three-and-a-quarter billion dollars at the box office—that’s $3,252,000,000! Notice, also, the big drop-off that The Empire Strikes Back suffered, and then how the number went down a bit more for the third movie, Return of the Jedi.
Now let’s look at the other chart (also in millions of dollars):
Aaron assembled this second chart to show how a single-character movie in a large franchise fares in comparison to the main courses, if you will. The Avengers and its sequels have made a whole lot more money than each single-character movie in the Marvel universe, so we shouldn’t be surprised that Solo made a lot less money than The Force Awakens. Unfortunately, at least some execs at Disney didn’t understand this; otherwise they wouldn’t have authorized re-shooting almost the entirety of the film, bringing the cost of making Solo up to approximately $250 million (and then spending about $150 million promoting it).
For the record, I liked Solo a lot. But I went into it knowing that, because it’s a prequel, it wasn’t going to cover much new ground. They had to show us how Han and Chewie met, they had to show us how Han won the Falcon from Lando in a card game, they had to show us the Kessel Run. Those beats had to be hit. And because we’ve seen Han’s story play out in the original trilogy and The Force Awakens, we already know who the love of his life will be, and that he won’t meet her in this movie. Right? And when we meet Han in the original movie, he’s an established smuggler and scoundrel who owes money to at least one dangerous crime lord, so we could expect that this prequel would be some sort of criminal action-adventure movie. So it was nearly impossible to make this a movie that would blow anyone’s mind.
They delivered a solid heist movie that did show us parts of the universe that the other films have mostly glossed over. It isn’t a bad movie, it’s just the sort of movie more likely to make $400 million than $1 billion, which can’t justify the amount they spent making it.
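(A quick back-of-the-envelope on that, using the common rule of thumb that theaters and distributors keep roughly half the gross, so only about half comes back to the studio:

$400 million gross × ~50% ≈ $200 million returned, versus roughly $400 million spent

whereas a billion-dollar gross at the same ratio would have covered the spend with room to spare. The exact split varies by market and by week of release, so treat this as illustration, not accounting.)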
The angry guys who insist that this is somehow more proof that a franchise whose main movies are each earning more than a billion dollars is betraying its true fans, and so forth, don’t understand how the blockbuster movie industry works compared to, say, the book publishing industry or the gaming industry. A cadre of true fans can make a book profitable, but any group of “true fans” in any genre is simply too small to generate a billion dollars in revenue for a single movie.
Because the “true fans,” the kind of fans who argue about the economics of the cloud cities or who are dying to see the back story of characters in the original films are going to number in the thousands, at most. Whereas to make the sort of money that The Force Awakens made, you don’t just need millions of people buying tickets, you need at least 100 million.
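(Rough arithmetic, mine rather than Aaron’s: The Force Awakens grossed a bit over $2 billion worldwide, and at an average ticket price somewhere around $9, that works out to

$2,000,000,000 ÷ $9 ≈ 220 million tickets

so if anything, “at least 100 million” is an understatement.)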
And when you consider that the so-called “true fans” who are making this argument are the same guys who are angry that one of the leads of the new movies is a black man, and are furious that the primary protagonist is a woman, and are absolutely livid that another lead character is a Chinese woman—well, that just means this is an even smaller fraction of the audience than simply people who are nostalgic for the original trilogy.
And with that belief system, well, it’s clear that they aren’t aligned with the light side of the Force, either. That ain’t the Force you’re feeling, guys.
Several years ago my employer did a weird rearrangement of the holiday calendar that resulted in the office being closed for almost a full week at Christmas, but we no longer observe MLK Jr. Day or Washington’s Birthday, nor do we get a floating holiday. So I had literally forgotten today was even a holiday until after getting on my bus, which was far emptier than usual and never filled up, riding on roads that were very empty, and finally walking from the bus to my office through a downtown that was nearly deserted.
If I had remembered, I might have scheduled the post that published this morning for later in the week and written a new post about Washington’s Birthday and the myth of President’s Day. Instead, I’ll repost something along this line that I originally wrote three years ago. Enjoy:
That’s not the name of the holiday

I’ve written before about the fact that President’s Day is a myth; the official name of the holiday is Washington’s Birthday Observance. Click the link to read about the history of the holiday, the few states that do observe a holiday called President’s Day (though some observe it in completely different months), and so on. Today, I want to talk a little bit about why there has never been a Federal holiday honoring Lincoln’s birthday, and how that contributes to people thinking that today’s holiday is about anyone other than Washington… Read More…
Now the sad part is that we were doing this specifically because we’re both working on hall costumes for NorWesCon (at the end of March). My husband actually found things for one of his costumes, but what did I find? Well, I found a copy of the 1951 edition of the World Publishing Company’s New Twentieth Century Webster’s Dictionary of the English Language Unabridged. Yes, that whole thing was the official title. This was one of the dictionaries produced after the legal ruling that found the Merriam-Webster Company could not prevent other companies from using Noah Webster’s name on their dictionaries even though they weren’t actually using Webster’s original dictionaries nor operating under the auspices of the agreement made between Mr. Webster’s estate and George and Charles Merriam back in 1843.
The World Publishing Company only produced this edition, a two-volume version, and a slightly revised 1953 edition before selling out to Macmillan Publishing USA. This dictionary, while labeled “unabridged” and spanning approximately 2,300 pages, isn’t exactly one of the most highly regarded, given that a third of that page count is actually a desk encyclopedia, and the editorial staff hadn’t been working on it for nearly as long as the staffs of the more storied dictionaries had. Which isn’t to say that it’s a poorly made dictionary.
But its primary claim to fame is that the editorial staff for this edition was headed up by Professor Harold Whitehall of Indiana University. Whitehall was an interesting choice to edit an American dictionary because he was British. He was born in 1905 in Ramsbottom, Lancashire, England, got his first degree at Nottingham University, and studied for a while afterward at London University before coming to the U.S., where he obtained his Ph.D. from the University of Iowa. He taught at the University of Iowa, the University of Wisconsin, the University of Michigan, and Queens College in New York, before settling at Indiana University, where he spent the rest of his academic career. While he was at Michigan, he served as assistant editor of a dictionary of Middle English (the English spoken during the 12th, 13th, and 14th centuries), which was probably why he was recruited by the World Publishing Company.
And the reason it’s important that Whitehall worked on this dictionary is that, while the number of words and the depth of the definitions weren’t on a par with other unabridged dictionaries of the time, the New Twentieth Century Webster’s Dictionary had the most thorough etymologies of any American-published dictionary up to that date. Because linguistics—specifically the history and derivation of our language—was Whitehall’s passion.
When Macmillan acquired most of the World Publishing Company, they already had a staff of dictionary editors, but they asked Whitehall to stay on, creating the post of Linguistics Editor for him, and they released several more editions of this dictionary over subsequent years, before the company was acquired by another publisher in 1998, which sold off the reference division to yet another company in 1999, and so on. Whitehall stopped working for them some time before 1960, though he continued to teach English and linguistics at Indiana University until his death in 1986.
In honor of my finally acquiring my own copy of this dictionary, famous for bringing a new level of etymological rigor to American dictionaries, this is a perfect time to talk about why it matters when your dictionary was created and how it is being maintained. Don’t assume, just because there are lots of free dictionaries available on the internet, that any of them started with a high-quality source or that experts are keeping them up to date. And this matters because language is a living thing that changes over time.
For instance, terrific used to mean terrifying (terrific is to terror as horrific is to horror, as a friend so eloquently put it). As the 1951 edition puts it (shown in the picture above, which I took the other night):
“ter-rif’ic, a [L. terrificus, from terrere, to frighten and facere, to make.] Dreadful; causing terror; adapted to excite great fear or dread; as a terrific form; a terrific sight.”
How did the word come to mean the opposite? Simple: the sarcastic or ironic use became far more common than the original meaning. People used it sarcastically to refer to something that wasn’t horrifying at all—quite the opposite—and people hearing that usage while not being familiar with the word themselves inferred its meaning from context. And soon everyone was using terrific as a synonym for “wonderful” instead of “horrible.”
Notice from the image above that there is no other definition given. If we jump ahead to one of my 1987 dictionaries, for instance, we find the primary definition is “causing great fear or terror”, the second is “remarkable or severe”, and only the third definition, marked informal, is “very good or wonderful.” Whereas my 2001 Oxford New American Dictionary lists the “causing terror” definition as archaic; but even then, the primary definition is “of great size, amount or intensity,” and the sense of “extremely good or excellent” is still listed as informal. Although that may be because the editorial board of the Oxfords includes a lot of British people. Most of my American-published dictionaries from the late ’90s on list something along the lines of “extraordinarily good” as the primary definition.
But this is part of the reason I am obsessed with dictionaries and how they are made. I have watched the meanings of some words change in my lifetime. It’s important to know this happens, particularly if you ever read books or stories written many years ago.
Some words don’t mean what they used to. That’s not a bad thing, but it can cause some confusion and consternation from time to time. Did I mention that, while consternation now means “feelings of anxiety or dismay,” it once meant “terrified”?
Really good article: Most Everything You Learned About Thanksgiving Is Wrong
America was inhabited already when Columbus blundered his way into the West Indies. They are called the West Indies, in case you didn’t know, because he thought he had sailed all the way around the world to Japan, China, and India. Seriously. He was convinced that San Salvador was Japan, and Cuba was China.
Columbus wasn’t a great thinker. Contrary to what schoolteachers were still telling us when I was in grade school, Europeans had known for centuries that the world was round. And Pythagoras and Aristotle had both deduced that the Earth was a sphere because of the shape of the Earth’s shadow on the moon during lunar eclipses. Eratosthenes calculated the size of the Earth pretty accurately, based on shadows at different latitudes, more than 200 years before the time of Christ (he also correctly deduced the tilt of the Earth’s axis a bit later).
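(If you’ve never seen the calculation, it’s beautifully simple. As the classical account has it: at noon on the summer solstice the sun was directly overhead at Syene, but at Alexandria, roughly 5,000 stadia to the north, it cast shadows at an angle of about 7.2°. Since 7.2° is one-fiftieth of a full circle:

circumference ≈ 50 × 5,000 stadia = 250,000 stadia

which, depending on exactly how long his stadion was, comes out somewhere in the neighborhood of the true figure of about 40,000 km.)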
Columbus thought that Eratosthenes was wrong, that the Earth was much smaller, and that it would take only a short time sailing west to reach Asia. He was very wrong. And not just because there were two continents Europe didn’t know about.
And then there was the abominable way that Columbus and the Europeans who followed treated the people who lived here. It was not, as some of my other teachers used to say, merely that the Europeans had more advanced technology. The Europeans were fond of making written agreements with the people who already lived here, and then, when it suited them, ignoring the agreements and taking, killing, or pillaging whatever they wanted.
So, yeah, even though I am a pasty-skinned, blue-eyed white guy with ancestors from places like Ireland, England, and France, count me as one of the people who celebrates Indigenous Peoples Day.
The movement to replace Columbus Day with a holiday honoring Native Americans has been around for a long time. In 1989 the state of South Dakota abolished the state observance of Columbus Day and enacted a Native American Day to be observed on the same day as the Federal observance of Columbus Day.
Several other states (California, Nevada, and Tennessee) observe a Native American Day in September (the California holiday was first called for by then-Governor Ronald Reagan in 1968, though not enacted into law until 1998).
Governors in Alaska and Vermont (and probably others, but I haven’t found them yet) have issued proclamations declaring an Indigenous Peoples Day, but neither state’s legislature has enacted it into law, and such proclamations tend to be ceremonial, usually assumed to apply only to the year issued.
On the other hand, a rather huge number of cities and towns all over the country have adopted ordinances replacing Columbus Day with Indigenous Peoples Day. Maybe as more of them follow, more states will join South Dakota.
There were many reasons why I didn’t behave like a “normal” boy. And usually when I have written about this topic before I have focused on how as a queer kid I was gender non-conforming. But that wasn’t the only problem. There are queer kids who did a better job than I ever did of blending in. And there are lots of not-queer kids who were bullied for being different in other ways. I had other strikes against me.
One of my relatives, for instance, described me as “a lost adult trapped in a child’s body” when referring to my childhood. One reason several people perceived me that way as a child is that my intelligence was several standard deviations above average. That had two very distinct effects on my behavior. One was that I often understood and knew things people didn’t expect a child to know; the other was that there were very few kids my age that I got along with, so I kept forming close relationships with adults. And that increased the gap between me and most of the kids my age.
Now, the word “normal” derives from the Latin normalis, which means made according to a right angle or square. But ask most people what normal means and you’ll probably get something close to what Oxford calls sense 3: “Constituting or conforming to a type or standard; regular, usual, typical; ordinary, conventional. Also, physically or mentally sound, healthy.” Interestingly, that usage of the word in English only came about in the early 1800s. When it first came into the language, in the late 1400s, it referred exclusively to a regular verb. Then in the mid-1600s its meaning expanded to “right-angled, standing at right angles; perpendicular,” which is how it entered the lexicon of mathematics.
I was interested in science for as long as I can remember. We can blame my mom the science fiction fan for that. When I was a baby, she literally read aloud whichever Robert Heinlein or Ray Bradbury or similar book she had checked out from the library. And mathematics is something I fell in love with early in school. We moved around a lot because of my dad’s job in the petroleum industry, but as luck would have it, the school district where I attended first grade and a portion of second was one that won awards for excellence year after year. They gave me a great start.
For instance, the explanation my second-grade teacher in Fort Collins had given me of the Distributive Property was how I got labeled a freak on the first day (three schools later) that I attended school in Cheyenne Wells. It was late spring of third grade when we moved to Cheyenne Wells, and they were just getting to things like the Distributive Property of Multiplication. The teacher tried to explain it to the class, but her explanation wasn’t very good. And during the period when we were supposed to be going through a worksheet and helping each other with the problems, the teacher overheard me explaining to the kid next to me how it works, so she brought me to the front of the room and made me explain it to the whole class. And then they all knew I was a Math Freak, a Brain, and the Teacher’s new Pet.
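(In case your third-grade teacher’s explanation wasn’t very good either: the Distributive Property just says that multiplying a sum is the same as multiplying each part and adding the results. For example:

3 × (10 + 4) = (3 × 10) + (3 × 4) = 30 + 12 = 42

which, among other things, is the trick that lets you do multi-digit multiplication one digit at a time.)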
It wasn’t just the first school, of course; it was also the fact that I loved to read so much that whenever I was given a new set of books at school, I would read them all the way to the end on my own as soon as I could. And half the time that I spent in the library I was tracking down non-fiction books about topics that came up in the science fiction, mystery, and adventure books that I loved. And most of the time throughout grade school and middle school, I would rather sit in a corner and read than run around the playground or do whatever else the rest of the kids were doing whenever we were turned loose.
That always failed to endear me to the other kids.
Despite the fact that at heart I was an introvert, I also loved explaining things to people, which often came across as me being a show-off or a know-it-all.
As an adult, I work in a technology field, writing and designing documentation and help systems that explain how systems work. So all of those characteristics became useful, eventually.
But there was no amount of counseling from that therapist—or mentoring from my middle school wrestling coach (and pre-algebra teacher!), or the other attempts by specific teachers who tried to take me under their wing to steer me through the shoals of bullying—that would make a smart, queer, introverted, book- and science-loving, know-it-all pass for normal in a typical primary or secondary school.
Which isn’t a slam on the other kids, but rather on the way we herd children together by age and leave them to their own devices to work out social dynamics. The theory is that we learn to get along with diverse people that way, but the system creates an artificial social environment that encourages some of our worst behaviors.
I survived. I not only came out of the system free of bitterness and resentment, I often find myself in the position of defending public schools from the distorted statistics some people wave around trying to prove other options are better (spoiler alert: the statistics are in traditional public schools’ favor). And when it comes to bullying, private schools and charter schools don’t handle those situations one iota better. In fact, for marginalized kids, they are much, much worse, statistically.
But I digress.
Learning to get along is a worthwhile goal. Conformity and trying to pretend you’re something you’re not are toxic and destructive. I wish we were better at teaching the former, rather than enforcing the latter.
Each of those statements was a lie.
I was a teenager in the ’70s when the Southern Baptist Convention finally endorsed desegregation of its churches. And it was as a teen that I learned that most of what I’d been taught about the history of our denomination and the Civil War was untrue.
Historically, every state that seceded to form the Confederacy (not just Mississippi, a portion of whose declaration is pictured above) explicitly listed slavery or the superiority of the white race (and some mentioned both) as its reasons for seceding. The infamous cornerstone speech delivered by Confederate Vice President Alexander Stephens explained that the foundation of the new Confederate government was “the great truth, that the negro is not equal to the white man; that slavery — subordination to the superior race — is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”
It can’t be any clearer than that: the primary mission of the Confederacy was the perpetuation of the enslavement of black people and the entrenchment (nay, glorification) of white supremacy. And Confederate soldiers did not volunteer, fight, and die by the thousands because of some need to preserve the mythical idyllic pastoral culture of the Southern plantation—most of them were too poor to own plantations, for one thing! No, the typical Confederate grunt believed that if slaves were freed, working-class whites would surely lose their livelihoods. The collective self-esteem of the white working class was shored up by the explicit assurance that at least they weren’t slaves; so while they might have worked hard in exchange for less than their fair share of societal prosperity, at least they were better off than those black folks! The abolition of slavery was thus perceived as an existential threat to the white working class. Of course they were willing to take up arms to protect slavery!
In the immediate aftermath of the war, symbols of the Confederacy weren’t displayed publicly. There were memorials erected in a few places to those who died in one battle or another, and certainly individual tombstones were occasionally emblazoned with Confederate symbols, but there wasn’t a stampede to erect statues to the leaders of the Confederacy afterward. For one thing, there wasn’t a lot of pride in having been on the losing side.
The first big rush of Confederate monuments came years after the war ended, as Reconstruction officially ended and Federal troops were withdrawn in 1877. Across the former Confederacy, state legislatures started enacting Jim Crow laws, designed to make it difficult or nearly impossible for black people to exercise their right to vote and to enforce segregation of the races. And statues and monuments went up all over the South. The plaques usually talked about the bravery of the person depicted, but there was also language about the nobility of the cause for which they fought. Blacks living in those states, most of whom were former slaves, knew exactly what that cause had been, and the message of the statues and monuments was clear: “white people are in charge again, and don’t you forget it!”

Most of the Confederate monuments were put up in the 1910s and 1920s, coinciding with an increase in activity by the KKK and similar organizations terrorizing blacks. And the next big surge came in the ’50s and ’60s, when civil rights organizations began having successes against some of the Jim Crow laws. The purpose of those monuments was not to honor the culture of the South; the message was still “stay in your place, black people, or else!”

A great example of this resides not many miles from my home. Washington Territory was never part of the Confederacy, and the few inhabitants of the state who served in the war did so as part of the Union Army and Navy. A local family, some years after the war, donated land in what would one day become the Capitol Hill neighborhood to the Grand Army of the Republic (an organization made up mostly of Union Civil War veterans) for a cemetery for Union soldiers. And that’s who was buried there. But decades later, during one of those surges of monument building, the Daughters of the Confederacy paid to have a monument to soldiers of the Confederacy erected in the cemetery. There are no Confederate soldiers buried there. Not one. And there are no soldiers’ names engraved on the massive monument. But there it is, erected in a cemetery full of Union soldiers, a monument to the so-called noble cause of the Confederacy.
Now that some communities are rethinking these monuments—many of them extremely cheap bronze statues erected during times of civil rights tensions—other people are claiming taking them down is erasing history. No, taking down these post-dated monuments in public parks and so forth isn’t erasing history, it’s erasing anti-historical propaganda. The other argument that is put forward in defense of the monuments is that “both sides deserve to be heard.” That’s BS in this case, because there aren’t two sides to racism. There aren’t two sides to bigotry. There aren’t two sides to genocide. White supremacy is not a legitimate side to any argument.
When we defeated Hitler’s armies, we didn’t turn around and erect monuments to the government that murdered millions of people in concentration camps. We destroyed their symbols. When we liberated Iraq, we tore down the statues of Saddam Hussein, we didn’t enshrine his image in an attempt to give both sides equal time. Those few Confederate monuments that list off names of people who died are fine (even if a lot of them have cringeworthy language about the cause they were fighting for). Cemeteries where actual Confederate veterans are buried of course can have symbols of the Confederacy on the tombstones and the like. But the other monuments, the ones erected years later, they don’t belong in the public square.
They belong in the dustbin of history.