Gilbert Baker was born in Kansas in 1951. From an early age he was fascinated with fabrics and color. He attributed this early interest to the women’s clothing store owned by his grandmother. Even with that family connection, though, in small-town Kansas in the 1950s no one thought a boy should learn to sew. In 1970 the 19-year-old Gilbert was drafted into the army, where he was trained as a medic and stationed in San Francisco, treating soldiers who had been wounded in Vietnam.
“In 1978, when I thought of creating a flag for the gay movement, there was no other international symbol for us than the pink triangle, which the Nazis had used to identify homosexuals in concentration camps. Even though the pink triangle was and still is a powerful symbol, it was very much forced upon us.
“I almost instantly thought of using the rainbow. To me, it was the only thing that could really express our diversity, beauty, and our joy. I was astounded nobody had thought of making a rainbow flag before because it seemed like such an obvious symbol for us.”
—Gilbert Baker, 1951-2017
In 1970 there was a thriving queer community in San Francisco. Gilbert found other people like himself, and managed to serve out his tour as a medic without getting caught (being gay was a court-martial offense), so he was honorably discharged. But having found a community, he chose to stay. He bought a sewing machine and taught himself to sew. He hung out with a lot of other artists. He designed fabulous drag costumes. And he also began designing pro-gay and anti-war protest banners for a variety of marches and rallies. Soon he was known as “the banner guy.”
When Harvey Milk was elected a city supervisor, becoming the first openly gay man elected to public office in the U.S., he had already worked with Gilbert a few times in connection with those rallies and protests. And so when Milk thought that the community needed a new symbol to unite around, he asked Gilbert to create it.
Note that Milk asked him to create a symbol, not necessarily a flag. But Gilbert said he settled on a flag very quickly, because a flag represents sovereignty. “A flag,” he said, “proclaims that gays are a people, a family, a tribe.” He chose the rainbow as the basis of the flag because it represented diversity—of race, gender, age. “Plus, it’s a natural flag — it’s from the sky!”

The Gay Freedom Day Committee provided money, and the Gay Community Center provided working space. Gilbert Baker and approximately 30 friends gathered together with over a thousand yards of cotton fabric and a lot of bottles of dye, and carefully created fabric in eight colors: hot pink, red, orange, yellow, green, turquoise, indigo, and violet. Gilbert also worked with Fairy Argyle, who was known as the Queen of Tie-Dye, to create a square of blue fabric with tie-dyed stars on it, to evoke the field of stars on the U.S. flag. Gilbert sewed two different flag designs in 1978: the first was the 8-stripe rainbow; the second looked like the American flag, but with the tie-dyed stars and rainbow stripes.
The two flags were first hoisted into the sky above San Francisco’s U.N. Plaza as part of the Gay Freedom Day Parade on June 25th, 1978. Gilbert’s longtime friend, Cleve Jones, described the day as having the perfect amount of wind to make the flags ripple, but not be unpleasant on the ground: “It was just stunning.”

Five months later, Harvey Milk was assassinated, and the community was thrown into mourning. Thousands gathered that night in the Castro and marched to City Hall, where they held a candlelight vigil. In the following days, people began asking for rainbow flags. To meet the sudden demand, Gilbert worked with the Paramount Flag Company to mass-produce flags. They used a stock rainbow fabric then available with only seven stripes: red, orange, yellow, green, turquoise, blue, and violet. The Freedom Day Committee wanted larger flags for the next Pride Parade, and Gilbert went to work, dropping the hot pink stripe from his larger hand-sewn flags as well, in part because the dye was difficult to obtain and no one was manufacturing stock hot pink fabric.
And the next year he dropped another stripe. Some say that the turquoise was dropped because, when the flags were hung vertically from city light poles, the middle stripe wasn’t visible. Gilbert said that turquoise and indigo fabric was difficult to obtain, so he switched to a navy blue stripe.
I’ve written before that the rainbow flag was not immediately embraced by everyone in the LGBT+ community. In fact, it was considered more a regional thing until a court case in 1989, when a West Hollywood man had to sue his landlord for the right to fly the rainbow flag from his apartment balcony.
In 1994 Gilbert supervised the creation of the first mile-long rainbow flag to commemorate the 25th anniversary of the Stonewall riots. The flag was cut up afterward to make smaller flags. Some sections were sold as a fundraiser, others were distributed to Pride Parade committees in other cities. In 2003, the 25th anniversary of the creation of the rainbow flag, Gilbert was commissioned to create another giant flag. This one was one and a quarter miles long and was carried in the Key West, Florida Pride event. It was eventually cut into 100 slightly less giant flags and again distributed to various cities around the world.
Gilbert often described himself as the Queer Betsy Ross and was sometimes asked to give his blessing to variants designed by others (such as the Victory Over AIDS Flag, which used a lighter violet and added a black stripe to symbolize our mourning for those who have died of complications of AIDS). It is worth noting that except when he was directly commissioned, Gilbert didn’t make money from his creation. In his later years he struggled financially. But in the one interview I saw where someone asked him about it, he said it would have been wrong to try to trademark the design. How could it be a symbol of our tribe if it legally belonged to one person?
After 2003, Gilbert started lobbying for a return to the original 8-stripe version, so far to little avail. When Barack Obama was elected President, Gilbert hand-sewed an 8-stripe version as a gift to Obama, and during the Obama administration that flag was displayed in the White House.

Gilbert redesigned the flag one more time before he died. The election of Trump prompted him to add a ninth stripe, lavender, for diversity. He sewed 39 flags by hand before his death, and they were used in the following San Francisco Pride Parade.
When I was first coming out of the closet in the late 80s, pink triangles were the symbol I saw around the Seattle queer community. You could find pink triangle buttons and key chains and bumper stickers and so forth in every store in the gayborhood. There were rainbows, as well, but the pink triangle outnumbered them. Then in the 90s, when suddenly there were rainbows everywhere, especially at pride, there was a bit of a backlash. I heard more than one person grumble about rainbows everywhere.
But I think Gilbert was on to something. The pink triangle was forced on us by oppressors; it was also most often used to identify gay men in the concentration camps—therefore many lesbians felt the reclaimed symbol didn’t include them. There is something joyful about the bright colors of the rainbow flag. The different colors side-by-side can signify that diversity Gilbert talked about: different races, different genders, different generations of queer people.
And I confess that as long as anti-gay religious wingnuts have conniption fits about us supposedly stealing the symbol from god, I’m going to take a bit of delight in raising my own rainbow flag. And it isn’t just about sticking it to the haters. Rainbows appear in the sky after a storm. They are beautiful and ephemeral and otherworldly. It’s difficult to look up at one in the sky after storm clouds have cleared and not feel at least a bit of wonder.
As queers we encounter a lot of storms in life. We may be bullied as kids. We may face discrimination and even physical assault as adults. We achieve a small victory, and then face a conservative backlash. In my lifetime there have been campaigns to pass laws to bar us from certain professions, even as courts and civil rights laws open some doors for us. The AIDS crisis killed tens of thousands, and it wasn’t just Republican politicians who laughed at our suffering during the 1980s. But every tempest and onslaught that we weather makes us a stronger. We have setbacks, but we fight on, moving ever forward.
Like the rainbow, we shine on after each storm.
I was going to write a post about Daylight Saving Time, specifically the many myths that get thrown around by people trying to explain it. I think the fact that almost no one understands why we do it is one of the best arguments for why we shouldn’t do it at all. That’s to say nothing of the problems the switch causes: heart problems, road accidents, and mood changes are all associated with the DST time change. But while I was searching for a good image to attach to such a post, I found this Buzzfeed article, which includes a section that hits all the notes I wanted to:
In 1905, a British architect named William Willett invented daylight saving time. Willett was out for his regular early-morning horse ride when he noticed that 1) it was rather light outside, and 2) he was the only one up. Like Franklin, he thought this was a waste of perfectly good sunlight. And it ~dawned~ on him that instead of getting everyone up earlier by blasting cannons, they could simply shift their clocks forward to take better advantage of that sweet daylight. So, in 1907 Willett published a pamphlet outlining his formal proposal. He suggested that people turn their clocks forward 20 minutes every Sunday in April at 2 a.m. (And then they would set the clocks back by 20 minutes every Sunday in September.) He argued that this would get people outside and exercising, and that it would save on electricity, gas, candles, etc. (He also estimated it would save $200 million in today’s dollars. This was…again, a wild exaggeration.) A member of parliament, Richard Pearce, heard about Willett’s idea and was into it; he introduced the Daylight Saving Bill to the House of Commons in February of 1908. The idea of changing the clocks four times in a month didn’t go over well, and the bill was eventually revised so that the clocks would be set forward one hour at 2 a.m. on the third Sunday in April (and then set back in September).
The bill was endorsed by merchants, banks, railroad companies, and the guy who created Sherlock Holmes, but was opposed by most astronomers and scientists. And one newspaper wrote “that if a man were going to a 7:00 dinner, under the new arrangement of daylight he would appear on the streets of London in evening dress at 5:40, which would shake the British Empire to its foundations.”
You know who else opposed the bill? FARMERS. They argued from the start that they couldn’t perform their operations at a different time — for example, they couldn’t harvest grass for hay while it was still wet with dew, and the dew wasn’t going to disappear earlier just because the clock had changed. And there were other activities that they couldn’t do until temperatures dropped after the sun went down. Basically, they hated DST from its inception.
Despite the association with farmers, daylight saving time actually came to the United States thanks to business owners (and war)
If you feel like garbage this week, you can direct your curses toward Marcus A. Marks, a clothing manufacturer; A. Lincoln Filene, a department store owner; and Robert Garland, a Pittsburgh industrialist. These three were very pro-DST, and were able to get labor organizations on board, along with the US Chamber of Commerce, the president of the National League of Baseball Clubs, and other prominent business owners. Even President Woodrow Wilson wrote a letter expressing his support for their efforts.
Less than two weeks after the US entered WWI, a daylight saving bill was introduced in Congress. It was heavily opposed by farmers, and also railroad companies, who were concerned about anything that could mess with the standard time zones (which had only recently become A Thing — a story for another day), and who said that 1,698,818 (!!) clocks and watches along their routes would have to be changed if DST were implemented. Because the fewest trains were running at 2 a.m., that became the proposed hour for the change-over. And because the most coal was consumed in March and October in the States, the bill was expanded to include those two months. On March 19, 1918, daylight saving time was signed into law in the United States, and took effect on March 31 of that year.
—“9 Things You Probably Don’t Know About Daylight Saving Time” by Rachel Wilkerson Miller, for Buzzfeed
The energy-savings argument was difficult to back up with numbers even in 1918, though it at least had some slight possibility of being correct then, when the vast majority of energy use was in factories, retail businesses, and the like. Residential energy use was limited to cooking, heating, and providing light, usually with oil- or gas-burning lamps.
But in 2018 the argument doesn’t hold up. For instance, residential energy use, thanks to all our computers, TVs, sound systems, game systems, refrigerators, microwaves, et cetera, is a much larger fraction of the total national energy consumption, and the amount of home energy consumed for lighting is much smaller than all those other uses. Also, a much larger proportion of businesses run 24 hours a day than did back then, and setting clocks forward or back has a negligible impact on how much energy a 24-hour business uses per day.
What I’m saying is, there isn’t much to justify the effort, the impacts on people’s health, and the other costs of this twice-annual fiddling with the clock.
Besides, I’ve always agreed with the one reaction, usually attributed to an elderly man on a Native American Reservation after first getting an explanation of Daylight Saving Time: “Only a fool would think you could cut a foot off the top of a blanket, then sew it to the bottom to get a longer blanket.”
Several years ago my employer did a weird re-arrangement of the holiday calendar that resulted in the office being closed for almost a full week at Christmas, but we no longer observe MLK, Jr. Day or Washington’s Birthday, or get a floating holiday. So I had literally forgotten today was even a holiday until I got on my bus, which was far emptier than usual and never filled up, rode on roads that were very empty, and finally walked from the bus to my office through a downtown that was nearly deserted.
If I had remembered, I might have scheduled the post that published this morning for later in the week and written a new post about Washington’s Birthday and the myth of President’s Day. Instead, I’ll repost something I originally wrote on this blog three years ago. Enjoy:
That’s not the name of the holiday

I’ve written before about the fact that President’s Day is a myth; the official name of the holiday is Washington’s Birthday Observance. Click the link to read about the history of the holiday, the few states that do observe a holiday called President’s Day (though some observe it in completely different months), and so on. Today, I want to talk a little bit about why there has never been a Federal holiday honoring Lincoln’s birthday, and how that contributes to people thinking that today’s holiday is about anyone other than Washington… Read More…
Now the sad part is that we were doing this specifically because we’re both working on hall costumes for NorWesCon (at the end of March). My husband actually found things for one of his costumes, but what did I find? Well, I found a copy of the 1951 edition of the World Publishing Company’s New Twentieth Century Webster’s Dictionary of the English Language Unabridged. Yes, that whole thing was the official title. This was one of the dictionaries produced after the legal ruling that found the Merriam-Webster Company could not prevent other companies from using Noah Webster’s name on their dictionaries even though they weren’t actually using Webster’s original dictionaries nor operating under the auspices of the agreement made between Mr. Webster’s estate and George and Charles Merriam back in 1843.
The World Publishing Company only produced this edition, a two-volume version, and a slightly revised 1953 edition before selling out to Macmillan Publishing USA. This dictionary, while being labeled “unabridged” and spanning approximately 2,300 pages, isn’t exactly one of the most highly regarded, given that a third of that page count is actually a desk encyclopedia and the editorial staff hadn’t been working on it for as long as the staffs of some of the more storied dictionaries. Which isn’t to say that it’s a poorly made dictionary.
But its primary claim to fame is that the editorial staff for this edition was headed up by Professor Harold Whitehall of Indiana University. Whitehall was an interesting choice to edit an American dictionary because he was British. He was born in 1905 in Ramsbottom, Lancashire, England. He got his first degree at Nottingham University and studied for a while after at London University, before coming to the U.S., where he obtained his Ph.D. from the University of Iowa. He taught at the University of Iowa, the University of Wisconsin, and Queens College in New York, before settling at Indiana University, where he spent the rest of his academic career. While he was at Michigan, he served as assistant editor of a Dictionary of Middle English (the English spoken during the 12th, 13th, and 14th Centuries), which was probably why he was recruited by the World Publishing Company.
Why it matters that Whitehall worked on this dictionary is that, while the number of words and the depth of the definitions weren’t on a par with other unabridged dictionaries of the time, the New Twentieth Century Webster’s Dictionary had the most thorough etymologies of any American-published dictionary up to that date. Because linguistics—specifically the history and derivation of our language—was Whitehall’s passion.
When Macmillan acquired most of the World Publishing Company, they already had a staff of dictionary editors, but they asked Whitehall to stay on, created the post of Linguistics Editor for him, and released several more editions of this dictionary in subsequent years, before the company was acquired by another publisher in 1998, which sold off the reference division to yet another company in 1999, and so on. Whitehall stopped working for them some time before 1960, though he continued to teach English and Linguistics at Indiana University until his death in 1986.
In honor of my finally acquiring my own copy of this dictionary famous for bringing a new level of etymological rigor to American dictionaries, this is a perfect time to talk about why it is important to understand when your dictionary was created and how it is being maintained. Don’t assume, just because there are lots of free dictionaries available on the internet, that any of them started with a high-quality source or that experts are keeping them up to date. And this matters because the language is a living thing that changes over time.
For instance, terrific used to mean terrifying (terrific is to terror as horrific is to horror, as a friend so eloquently put it). As the 1951 edition puts it (shown in the picture above, which I took the other night):
“ter-rif’ic, a [L. terrificus, from terrere, to frighten and facere, to make.] Dreadful; causing terror; adapted to excite great fear or dread; as a terrific form; a terrific sight.”
How did the word come to mean the opposite? Simple, the sarcastic or ironic use became far more common than the original meaning. People used it sarcastically to refer to something that wasn’t horrifying at all—quite the opposite—and people hearing that usage while not being familiar with the word themselves inferred its meaning from context. And soon everyone was using terrific as a synonym for “wonderful” instead of “horrible.”
Notice from the image above, there is no other definition given. If we jump ahead to one of my 1987 dictionaries, for instance, we find the primary definition being “causing great fear or terror,” the second “remarkable or severe,” and only the third definition, marked informal, is “very good or wonderful.” Whereas my 2001 New Oxford American Dictionary lists the “causing terror” definition as archaic, but even then, the primary definition is “of great size, amount or intensity,” and the sense of “extremely good or excellent” is still listed as informal. Although that may be because the editorial board of the Oxfords includes a lot of British people. Most of my American-published dictionaries from the late 90s on list something along the lines of “extraordinarily good” as the primary definition.
But this is part of the reason I am obsessed with dictionaries and how they are made. I have watched the meanings of some words change in my lifetime. It’s important to know this happens, particularly if you ever read books or stories written many years ago.
Some words don’t mean what they used to. That’s not a bad thing, but it can cause some confusion and consternation from time to time. Did I mention that while consternation now means “feelings of anxiety or dismay,” it once used to mean “terrified”?
Really good article: Most Everything You Learned About Thanksgiving Is Wrong
America was inhabited already when Columbus blundered his way into the West Indies. They are called the West Indies, in case you didn’t know, because he thought he had sailed all the way around the world to Japan, China, and India. Seriously. He was convinced that San Salvador was Japan, and Cuba was China.
Columbus wasn’t a great thinker. Contrary to what school teachers were still telling us when I was in grade school, Europeans had known for centuries that the world was round. Pythagoras and Aristotle had both deduced that the Earth was a sphere because of the shape of the Earth’s shadow on the moon during lunar eclipses. Eratosthenes calculated the size of the Earth pretty accurately, based on shadows at different latitudes, more than 200 years before the time of Christ (he also correctly deduced the tilt of the Earth’s axis a bit later).
Columbus thought that Eratosthenes was wrong, that the Earth was much smaller, and that it would take only a short time sailing west to reach Asia. He was very wrong. And not just because there were two continents Europe didn’t know about.
And then there was the abominable way Columbus and the Europeans who followed treated the people who lived here. It was not, as some of my other teachers used to say, merely that the Europeans had more advanced technology. The Europeans were fond of making written agreements with the people who already lived here, and then, when it suited them, ignoring the agreements and taking, killing, or pillaging whatever they wanted.
So, yeah, even though I am a pasty-skinned, blue-eyed white guy with ancestors from places like Ireland, England, and France, count me as one of the people who celebrates Indigenous Peoples Day.
The movement to replace Columbus Day with a holiday honoring Native Americans has been around for a long time. In 1989 the state of South Dakota abolished the state observance of Columbus Day and enacted a Native American Day to be observed on the same day as the Federal observance of Columbus Day.
Several other states (California, Nevada, and Tennessee) observe a Native American Day in September (the California holiday was first called for by then-Governor Ronald Reagan in 1968, though not enacted into law until 1998).
Governors in Alaska and Vermont (and probably others, but I haven’t found them yet) have issued proclamations declaring an Indigenous Peoples Day, but neither state’s legislature has enacted it into law, and such proclamations tend to be ceremonial, usually assumed to apply only to the year issued.
On the other hand, a rather huge number of cities and towns all over the country have adopted ordinances replacing Columbus Day with Indigenous Peoples Day. Maybe as more follow, more states will join South Dakota.
There were many reasons why I didn’t behave like a “normal” boy. And usually when I have written about this topic before I have focused on how as a queer kid I was gender non-conforming. But that wasn’t the only problem. There are queer kids who did a better job than I ever did of blending in. And there are lots of not-queer kids who were bullied for being different in other ways. I had other strikes against me.
One of my relatives, for instance, described my childhood self as “a lost adult trapped in a child’s body.” One reason several people perceived me that way as a child is that my intelligence was several standard deviations above average. That had two very distinct effects on my behavior. One was that I often understood and knew things people didn’t expect a child to know; the other was that there were very few kids my age I got along with, so I kept forming close relationships with adults. And that increased the gap between myself and most of the kids my age.
Now, the word “normal” derives from the Latin normalis, which means made according to a right angle or square. But ask most people what normal means and you’ll probably get something close to what Oxford calls sense 3: “Constituting or conforming to a type or standard; regular, usual, typical; ordinary, conventional. Also, physically or mentally sound, healthy.” Interestingly, that usage of the word in English only came about in the early 1800s. When it first came into the language, in the late 1400s, it referred exclusively to a regular verb. Then in the mid 1600s its meaning expanded to “right-angled, standing at right angles; perpendicular,” which is how it entered the lexicon of mathematics.
I was interested in science for as long as I can remember. We can blame my mom the science fiction fan for that. When I was a baby, she literally read aloud whichever Robert Heinlein or Ray Bradbury or similar book she had checked out from the library. And mathematics is something I fell in love with early in school. We moved around a lot because of my dad’s job in the petroleum industry, but as luck would have it, the school district where I attended first grade and a portion of second was one that won awards for excellence year after year. They gave me a great start.
For instance, the explanation my second grade teacher in Fort Collins had given me of the Distributive Property was how I got labeled a freak on my first day (three schools later) attending school in Cheyenne Wells. It was late spring of third grade when we moved to Cheyenne Wells, and they were just getting to things like the Distributive Property of Multiplication. The teacher tried to explain it to the class, but her explanation wasn’t very good. And during the period when we were supposed to be going through a worksheet and helping each other with the problems, the teacher overheard me explaining to the kid next to me how it works, so she brought me to the front of the room and made me explain it to the whole class. And then they all knew I was a Math Freak, a Brain, and the Teacher’s new Pet.
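For anyone who wants the refresher, the property in question is simply that multiplication distributes over addition, which is also the trick behind most mental arithmetic. A quick worked example (my own illustration, not the worksheet from class):

```latex
a \times (b + c) = a \times b + a \times c
% e.g., multiplying 7 by 13 by splitting 13 into 10 + 3:
7 \times 13 = 7 \times (10 + 3) = 7 \times 10 + 7 \times 3 = 70 + 21 = 91
```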
It wasn’t just the first school, of course, it was also the fact that I loved to read so much, that whenever I was given a new set of books at school, I would read them all the way to the end on my own as soon as I could. And half the time that I spent in the library I was tracking down non-fiction books about topics that came up in the science fiction, mystery, and adventure books that I loved. And most of the time throughout grade school and middle school, I would rather sit in a corner and read than run around the playground or do other things the rest of the kids were doing any time we were turned loose.
That always failed to endear me to the other kids.
Despite the fact that at heart I was an introvert, I also loved explaining things to people, which often came across as me being a show-off or a know-it-all.
As an adult, I work in a technology field, writing and designing documentation and help systems that explain how systems work. So all of those characteristics eventually became useful.
But there was no amount of counseling from that therapist—or mentoring from my middle school wrestling coach (and pre-algebra teacher!), or the other attempts by specific teachers who tried to take me under their wing to steer me through the shoals of bullying—that would make a smart, queer, introverted, book- and science-loving, know-it-all pass for normal in a typical primary or secondary school.
Which isn’t a slam on the other kids, but rather the way we herd children together by age and leave them to their own devices to work out social dynamics. The theory is that we learn to get along with diverse people that way, but the system creates an artificial social environment that encourages some of our worst behaviors.
I survived. I not only came out of the system free of bitterness and resentment, I often find myself in the position of defending public schools from the distorted statistics some people wave around trying to prove other options are better (spoiler alert: the statistics are in traditional public schools’ favor). And when it comes to bullying, private schools and charter schools don’t handle those situations one iota better. In fact, for marginalized kids, they are much, much worse, statistically.
But I digress.
Learning to get along is a worthwhile goal. Conformity and trying to pretend you’re something you’re not, are toxic and destructive. I wish we were better at teaching the former, rather than enforcing the latter.
Each of those statements was a lie.
I was a teen-ager in the 70s when the Southern Baptist Convention finally endorsed desegregation of its churches. And it was as a teen that I learned most of what I’d been taught about the history of our denomination and the Civil War was untrue.
Historically, every state that seceded to form the Confederacy (not just Mississippi, a part of whose declaration is pictured above) explicitly listed either slavery or the superiority of the white race (and some mentioned both) as their reasons for seceding. The infamous cornerstone speech delivered by Confederate Vice President Alexander Stephens explained that the foundation of the new Confederate government was “the great truth, that the negro is not equal to the white man; that slavery — subordination to the superior race — is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”
It can’t be any clearer than that: the primary mission of the Confederacy was the perpetuation of slavery of black people and the entrenchment (nay, glorification) of white supremacy. And Confederate soldiers did not volunteer, fight, and die by the thousands because of some need to preserve the mythical idyllic pastoral culture of the Southern plantation—most of them were too poor to own plantations, for one thing! No, the typical Confederate grunt believed that if slaves were freed, working-class whites would surely lose their livelihoods. The collective self-esteem of the white working class was shored up by the explicit statement that at least they weren’t slaves; so while they might have worked hard in exchange for less than their fair share of societal prosperity, at least they were better off than those black folks! The abolition of slavery was thus perceived as an existential threat to the white working class. Of course they were willing to take up arms to protect slavery!
In the immediate aftermath of the war, symbols of the Confederacy weren’t displayed publicly. There were memorials erected in a few places to those who died in one battle or another, and certainly individual tombstones were occasionally emblazoned with Confederate symbols, but there wasn’t a stampede to erect statues to the leaders of the Confederacy afterward. For one thing, there wasn’t a lot of pride in having been on the losing side.
The first big rush of Confederate monuments came years after the war, as Reconstruction officially ended and Federal troops were withdrawn in 1877. Across the former Confederacy, state legislatures started enacting Jim Crow laws, designed to make it difficult or nearly impossible for black people to exercise their right to vote and to enforce segregation of the races. And statues and monuments went up all over the South. The plaques usually talked about the bravery of the person depicted, but there was also language about the nobility of the cause for which they fought. Blacks living in those states, most of whom were former slaves, knew exactly what that cause had been, and the message of the statues and monuments was clear: “white people are in charge again, and don’t you forget it!”

Most of the Confederate monuments were put up in the 1910s and 1920s, coinciding with an increase in activity by the KKK and similar organizations terrorizing blacks. And the next big surge came in the 50s and 60s, when civil rights organizations began having successes against some of the Jim Crow laws. The purpose of those monuments was not to honor the culture of the South; the message was still “stay in your place, black people, or else!”

A great example of this resides not many miles from my home. Washington territory was never a part of the Confederacy, and the few inhabitants of the state who served in the war did so as part of the Union Army and Navy. A local family, some years after the war, donated land in what would one day become the Capitol Hill neighborhood to the Grand Army of the Republic (an organization made up mostly of Union Civil War veterans) for a cemetery for Union soldiers. And that’s who was buried there. But decades later, during one of those surges of monument building, the Daughters of the Confederacy paid to have a monument to soldiers of the Confederacy erected in the cemetery. There are no Confederate soldiers buried there. Not one.
And there are no soldiers’ names engraved on the massive monument. But there it is, erected in a cemetery full of Union soldiers, a monument to the so-called noble cause of the Confederacy.
Now that some communities are rethinking these monuments—many of them extremely cheap bronze statues erected during times of civil rights tensions—other people are claiming taking them down is erasing history. No, taking down these post-dated monuments in public parks and so forth isn’t erasing history, it’s erasing anti-historical propaganda. The other argument that is put forward in defense of the monuments is that “both sides deserve to be heard.” That’s BS in this case, because there aren’t two sides to racism. There aren’t two sides to bigotry. There aren’t two sides to genocide. White supremacy is not a legitimate side to any argument.
When we defeated Hitler’s armies, we didn’t turn around and erect monuments to the government that murdered millions of people in concentration camps. We destroyed their symbols. When we liberated Iraq, we tore down the statues of Saddam Hussein, we didn’t enshrine his image in an attempt to give both sides equal time. Those few Confederate monuments that list off names of people who died are fine (even if a lot of them have cringeworthy language about the cause they were fighting for). Cemeteries where actual Confederate veterans are buried of course can have symbols of the Confederacy on the tombstones and the like. But the other monuments, the ones erected years later, they don’t belong in the public square.
They belong in the dustbin of history.
After serving one term, Kozachenko stepped out of the public eye, though not out of the activist life entirely. After meeting her life partner, Mary Ann Geiger, and having a son, Kozachenko retreated more fully into private life and her place in queer history went virtually ignored for decades.
In “The First Openly Gay Person to Win an Election in America Was Not Harvey Milk,” a 2015 piece for Bloomberg Politics, Steve Friess explored the factors that contributed to Kozachenko’s diminished place in the history of gay liberation: geography, misogyny, timing, messaging. When asked why the groundbreaking gay journalist Randy Shilts referred to Harvey Milk as “the first openly gay elected official in the nation,” for example, Kozachenko “figures there was little fuss at the time because it was just liberal, small-city Ann Arbor.”
“I don’t think I was brave,” Kozachenko told Friess, “because I was in a college town where it was cool to be who I was. On the other hand, I stepped up and did what I felt needed to be done at the time. Maybe that’s the whole story, that ordinary people can do something that then other people later can look back on and feel really good that they did this.” #HavePrideInHistory #KathyKozachenko (at Ann Arbor, Michigan)
(Reposted from LGBT HISTORY ARCHIVES IG: @lgbt_history.)
Is it weird for me to think this is a cool coincidence one day after I write about a much more recent openly gay person at the University of Michigan?
Mr. Wright’s post was blunt, and not at all a feel-good statement. But it also contained a lot of truth:
“You’re expecting some kind of obligatory 9-11 post, aren’t you?
Here it is, but you’re not gonna like it.
15 years ago today 19 shitheads attacked America.
They killed 3000 of us.
And then … America got its revenge for 9-11.
Yes we did. Many times over. We killed them. We killed them all. We killed their families. We killed their wives and their kids and all their neighbors. We killed whole nations that weren’t even involved just to make goddamned sure. We bombed their cities into rubble. We burned down their countries.
They killed 3000 of us, we killed 300,000 of them or more.
8000 of us came home in body bags, but we got our revenge. Yes we did.
We’re still here. They aren’t.
We win. USA! USA! USA!
You goddamned right. We. Win.
Every year on this day we bathe in the blood of that day yet again. We watch the towers fall over and over. It’s been 15 goddamned years, but we just can’t get enough. We’ve just got to watch it again and again.
It’s funny how we never show those videos of the bombs falling on Baghdad today. Or the dead in the streets of Afghanistan. We got our revenge, but we never talk about that today. No, we just sit and watch the towers fall yet again.
Somewhere out there on the bottom of the sea are the rotting remains of the evil son of a bitch who masterminded the attack. It took a decade, but we hunted him down and put a bullet in his brain. Sure. We got him. Right? That’s what we wanted. That’s what our leaders promised us, 15 years ago today.
And today those howling the loudest for revenge shrug and say, well, yeah, that. That doesn’t matter, because, um, yeah, the guy in the White House, um, see, well, he’s not an American, he’s the enemy see? He’s not doing enough. So, whatever. What about that over there? And that? And…
15 years ago our leaders, left and right, stood on the steps of the Capitol and gave us their solemn promise to work together, to stand as one, for all Americans.
How’d that promise work out?
How much are their words worth? Today, 15 years later?
It’s 15 years later and we’re STILL afraid. We’re still terrorized. Still wallowing in conspiracy theories and peering suspiciously out of our bunkers at our neighbors. Sure we won. Sure we did. We became a nation that tortures our enemies — and our own citizens for that matter. We’re a nation of warrantless wiretaps and rendition and we’ve gotten used to being strip searched in our own airports. And how is the world a better place for it all?
And now we’re talking about more war, more blood.
But, yeah, we won. Sure. You bet.
Frankly, I have had enough of 9-11. Fuck 9-11. I’m not going to watch the shows. I’m not going to any of the memorials. I’m not going to the 9-11 sales at Wal-Mart. I don’t want to hear about 9-11. I for damned sure am not interested in watching politicians of either party try to out 9-11 each other. I’m tired of this national 9-11 PTSD. I did my bit for revenge, I went to war, I’ll remember the dead in my own time in my own way.
I’m not going to shed a damned tear today.
We got our revenge. Many times over, for whatever good it did us.
I’m going to go to a picnic and enjoy my day. Enjoy this victory we’ve won.
I suggest you do the same.”
—Jim Wright, Stonekettle Station
I almost never write about 9/11. On the first anniversary, I made a post on my old blog called “Living for 9/12.” And I reposted it on this blog around the eleventh anniversary. I didn’t express the same sentiment as Wright either of those times, but I’m getting to a similar emotional space.
It’s not that I think we should forget the deaths that happened that day. But could we try using that grief to accomplish some good in the world? I mean, my goodness, it took us 14 years to pass a bill to help the firemen and paramedics and police who responded that day, survived, but have suffered longer term health issues. And yes, we killed the mastermind of that plot, but along the way we’ve bombed countries that weren’t involved, and have used the original tragedy to justify all sorts of violations of our own civil liberties, assassinating at least one of our own citizens without due process, not to mention developing a disturbing habit of killing civilians with drones!
Every year about 11,000 U.S. citizens are murdered with firearms, sometimes in mass shootings like Orlando or Sandy Hook, most in incidents that barely make it to the local news. That’s nearly four 9/11s every single year. Maybe we should actually do something to prevent some of those? Or at least let the National Institutes of Health research whether we could do anything to reduce that number?
Why are we unable to work up any determination over any of the tens of thousands of deaths that have happened since that day?
I need to stop ranting. There was one other 9/11 post I saw on the day that I think is worth looking at. It isn’t like Wright’s at all, but it also doesn’t wrap itself in the flag to push an agenda. Tricia Romano is currently the Editor in Chief of a Seattle weekly newspaper, The Stranger. But in 2001 she worked in New York City, writing for another weekly newspaper, The Village Voice: I Was In New York City During 9/11. I’ll Never Forget.