This isn’t going to be my typical Saturday post where I talk about news stories that either I missed for this week’s round up of links or new developments. I’ve already made a couple of pretty personal posts this week, between my birthday and remembering my late husband on his birthday just a few days later.
And tomorrow would be my dad’s birthday, if he were still alive. Which doesn’t make me sad, by the way. It fills me with a bit of dread, because I suspect there will be communications from some of my relatives that I’d rather not get. I can’t use the phrase that one friend made me practice saying right after Dad died, so that people who were just offering condolences but didn’t know our history wouldn’t feel bad: “We weren’t close. We’d hardly talked in forty years.” Depending on which family member is reaching out, that comment is likely to get an angry, “Well, whose fault is that?” And I’m dreading it because I got such comments (and confrontations) on Father’s Day and on his previous birthday. Maybe I need to memorize this Stefan Molyneux quote and say it back to any of them who trot out the admonishments that it isn’t healthy for me not to grieve or not to forgive or whatever. The former is the most darkly funny, because I did grieve the total lack of a loving, functional father decades before my actual dysfunctional dad died. I took myself to therapy because I realized that many of his abusive behaviors and attitudes were manifesting in my own relationships. I didn’t want to turn into him, so I got therapy and dealt with it, and yes, part of my healing process was letting myself grieve for the relationship that could have been. To grieve for the kind of childhood I didn’t have.
I know most of them are doing it because they worry about me. Unfortunately, some are doing it because they need validation for their own feelings, or validation of the rationalizations that let them look the other way while those of us living with him were subjected to the abuse. Anyway, being angry at them doesn’t solve anything. I will probably do what I did with most of the messages that came on Father’s Day: ignore them.
But, completely unrelated: I was pointed to some cartoons by an artist I had not previously been aware of, and while checking out his web site, I found this interesting thing he created last March: My Mother Was Murdered When I Was a Baby. I Just Found a Photo of Her Funeral for Sale Online. It reminded me that there are many other ways that one’s childhood can be dysfunctional. But also, it reminded me of a bit of advice I received from one of my lesbian aunties (not an actual aunt) back around the same time I was seeing the therapist. My childhood was bad, yes, but I survived it. Not everyone who suffers domestic violence does. So, while I’m grieving what I didn’t have, I should remember to be thankful that I lived to make a better adulthood for myself.
Please note that I said this stereotype is only somewhat less toxic than many others about queer men.
So a few years ago when I mentioned in a blog post that it was my birthday and my age (it was 53 or 54, but I don’t feel like going on an obsessive search to try to find the specific post), some random person I didn’t know commented about how broken-hearted I must be, since everyone knows that fags are all obsessed with being young. I typed a reply to the effect that no, I actually considered myself quite lucky. But then I decided that rather than argue with a troll the better thing to do was to simply delete the troll’s comment and move on.
But I keep running into people making this specific observation, or variants of it. A gay activist who is a frequent guest on news programs passes the age of 50 and all the anti-gay hatemongers start referring to him as an “aging activist.” This is pretty rich coming from a completely white-haired anti-gay pastor who is pushing 70, let me tell you. If a 50-year-old is “aging,” what do we call a 68-year-old, hmmmmm?
So, I’m still a couple of years shy of 60, and I know that I frequently make references to my age, mostly because 1) I am older than the average person active on the internet, 2) I’m older than the average age of people active in the various fandoms I participate in, and 3) I frequently find myself being a little boggled at people who otherwise seem really well informed being completely unaware of (or deeply misinformed about) fairly major things that happened in the world when I was, say, in my 20s.
I was still very closeted in my early 20s when the AIDS crisis began. This mysterious illness was striking gay men down, and not only did the White House Press Secretary laugh and make a fag joke when a reporter asked about the first Centers for Disease Control alert about the illness, but all of the rest of the reporters in the room joined in on the laughter. One night at a church service I was sitting with my head bowed when a pastor went on a long digression in his prayer thanking god for sending the scourge of AIDS to punish the wickedness of gay people and wipe them from the face of the Earth. Ten years later, as an out gay man, I found myself going to memorial services of men sometimes younger than I. One particularly bad winter, 16 different people we knew died in a single three-month period. It really did seem that every gay person was doomed. And it didn’t seem to matter that we all now knew to practice safe sex—because condoms can break, and so on.
As much of an optimist as I’ve always been, in the face of all the overwhelming chilling life experience, I seriously doubted that I would live to see my 50s.
So, I am not in the slightest bit sad or embarrassed to have reached the “ripe” age of 57. I’m not sad that my beard is mostly white, because I’ve earned every one of these grey hairs! I’m not ecstatic that some of the medical issues I’ve always had are getting worse as I get older. I’m not joyful when I read about the death of someone (famous or not) that I’ve known and admired for years. I know that that is going to happen more often, that’s just the natural consequence of the passing of time.
Getting older has its drawbacks, yes. But the alternative is worse, right? So I say, “Bring it on!”
Among my role models growing up were a very cantankerous paternal great-grandmother (who taught me how to listen in on the neighbors’ calls on the party line phone, among other fun things) and an even more ornery maternal great-grandfather (whose jobs when he was younger had included driving souped-up cars, sometimes outrunning the police, to deliver illegal alcohol during Prohibition). Both of them said and did things around us kids back then that embarrassed their own children (my grandparents and great-aunts and great-uncles), and I fully intend, if I’m lucky enough to live as long as they did, to similarly embarrass some of my younger relatives and acquaintances.
On occasions such as birthdays, one is often asked to share some words of wisdom. I’m going to give you two pieces of advice, one from each of the aforementioned great-grandparents:
“Life is too short to carry grudges or worry about what other people think of you.”
“Never let the revenuers piss on your parade.”
There are a few things to note about this particular transition of the seasons, at least where I live. First, we can officially enter summer 2017 into the weather record books for a couple of different things. It was officially the driest summer on record (going by solar summer: June 21–Sept 21). Seattle summers are usually relatively dry, particularly compared to our Novembers, but this year was exceptional. Only 0.52″ of rain total, and it is worth noting that 0.50″ of that came in the last six days! Which certainly contributed to the many days the city was blanketed in smoke from various wildfires in British Columbia, Eastern Washington, and Central Oregon.
Summer 2017 also tied with 1967 for the hottest summer ever recorded. It is worth noting that 2014 and 2013 are tied for second hottest, only one-tenth of a degree cooler (and 2015 was two-tenths of a degree cooler, so we definitely have a trend going).
But that nightmare is over, at least until next year. The jet stream has shifted. We got light rain last weekend, the daytime highs have been in the high 50s to mid 60s all week. We may break 70 again late in the week, but that’s a considerable improvement over the temps just two weeks ago.
So, autumn is here! Time to start thinking about Halloween and Thanksgiving decorations. Time to break out the pumpkin spice (I actually started experimenting with pumpkin spice cocktail recipes the day we got the first rain last week).
Welcome to fall!
Content Warning: the following essay (which will also touch on dangerous misperceptions and myths about sexual orientation) includes some specifics about physical abuse of children and worse. Only click when you’re ready … Read More…
Software programs don’t usually work that way, but the non-rational part of my brain doesn’t quite get that. So seeing a review of a word processor that extolls features that appeal to me has the same effect on that impulsive part of the brain that makes me pick up a new pencil or pen or pocket notepad when it catches my eye in the store.
Many apps offer free trial versions, so it is literally a matter of just clicking or tapping a few times on my phone or laptop, and the next thing you know there’s a new word processor installed on my iPhone or iPad or MacBook Pro. And I will play with it for a bit, maybe find some things I like about it. If it works well and is cheap, well, I might buy it. If the free version has no time limit, I may just leave the free version on indefinitely.
All of that sounds mostly harmless, and it usually is. But… Read More…
Then I read the story aloud to my monthly writers’ group.
I honestly don’t remember much of the critique I got from the group that night. And truth be told, I didn’t read everything I’d written. I only read the opening scene, and by the time I reached the end of the scene, I already knew that the story was a disaster. Part of it was the nonverbal reaction of the group, yes, but that wasn’t what killed the story for me. No, just hearing it aloud in my own voice revealed that it was an awful opening to an unpleasant story.
The character was in a very unpleasant situation, but that’s not what I mean when I say it was an unpleasant story. I mean that it was unpleasant to read the scene that I’d written, and I knew the rest of the story suffered the same problem. I had picked the wrong place to start the story, and I was fairly certain that while my new character was interesting, she shouldn’t be the viewpoint character for this particular story. She might still be the protagonist, but she wasn’t the person who should narrate this particular tale.
And I learned all of that before any of the other writers in the group said a word. Just from the act of reading it aloud.
It’s advice I have received for as long as I can remember. Back when I was a grade-school student haunting the library’s magazine collection reading back issues of The Writer and Writer’s Digest I saw the advice again and again: read the story aloud to yourself before you show it to other people. It’s advice I’ve given many times. But I don’t always follow it. That particular story I really should have.
Reading it aloud, either to yourself or an audience, will expose awkward sentences at a minimum. There are all sorts of sentences you can write that make perfect sense, follow the rules of grammar and so forth, but when you try to say them out loud, your tongue trips on them. That’s why I always have a pencil or other writing implement in my hand when I read aloud, so I can circle the places I stumble over awkward phrasing.
But that isn’t the only thing you learn from reading it aloud. There are numerous studies showing, for instance, that the act of simply speaking about a problem you’ve been worrying about makes you think of it in a new light. Neurologically, they say, that’s because speaking engages the brain differently. It’s not just the act of putting a problem into words; it appears that as you listen to yourself speak, different areas of the brain are engaged than when you contemplate the problem in silence.
That process doesn’t just apply to solving real-world problems, obviously. Hearing your story read aloud makes you process it differently than reading it silently does.
Reading it aloud to someone else brings in a different level of information, much of it non-verbal as I alluded to above. Your listeners may fidget, or become distracted, for instance. You’re not holding their attention. You’ll get other cues, as well.
That particular tale was re-written substantially several times, though I didn’t bring each draft back to the group. I tried telling the story from the points of view of three different supporting characters before I found the right viewpoint character and the right starting point. The fourth version, when it was read, got very positive responses. And eventually was published, and I got a few compliments from readers of the ‘zine.
The key to realizing my approach was wrong was simply to read the opening scene aloud, advice I have tried to follow much more faithfully ever since.
Both were digital alarm clocks with that formerly ubiquitous red LED display, though Ray’s was a large print display, because without his glasses, even if he picked up a regular alarm clock and held it so close that his nose was almost touching the display, he still couldn’t read the numbers. My alarm clock was a clock radio, and I always set it to start playing NPR about a half hour before I needed to wake up, then the alarm when I had to get out of bed. Because I was less likely to be a Grouch Monster™ when the alarm went off if I’d been eased into waking up by the radio. After Ray died, I kept both alarm clocks. For one thing, while my eyesight had never been quite as bad as Ray’s, I liked the fact that I could read the large print clock from the far side of the bedroom when I didn’t have my glasses on.
When Michael moved in with me the year after Ray died, he already owned an alarm clock. And since he also had a job where he needed to get up at different times each day for work, it made sense to have a separate clock. But we didn’t get rid of my second clock. Instead we moved the clock radio to the far side of the bedroom, which I found made it less likely that I would hit the snooze alarm a bunch of times and oversleep. Over the years, the clock radio had to be replaced a couple of times. And Michael’s clock’s display went wonky and had to be replaced, but the large print clock which had been Ray’s just kept chugging along.
Or at least, that’s what I told myself.
I don’t know how old the clock was, because Ray already owned it when we started dating in 1990. But that means it was at a minimum 27 years old this spring when Michael and I were packing. Not surprisingly, after 27+ years of use, some things didn’t work as well any longer.
- One of the features the large print clock had which was innovative and unusual in 1990 was a battery compartment in the bottom of the clock, so that if you kept fresh batteries in there, the clock wouldn’t lose time during a power outage. The clock wouldn’t actually stay lit up or sound its alarm while on battery backup, but you didn’t have to reset it once the power came back on. Now it is pretty standard for electronics to have a built-in rechargeable battery for this purpose, but back then it was unusual. The battery backup stopped working years ago. You don’t want to know how many times I changed the batteries and cleaned the contacts in the battery compartment, or shone a flashlight into it while I peered through a magnifying glass trying to fix it, before I admitted to myself that the memory chip or whatever it was that the batteries powered must have failed.
- A couple years after the battery backup stopped working, the alarm became inconsistent. You could set the alarm, and when it came time for the alarm to go off, the clock would try to sound it. But sometimes all you got was a click and a single weird little chirping noise. Other times the buzzer would sound, but it wasn’t very loud. Other times it chirped and chirped and chirped until you turned the alarm off. Very rarely did the buzzer just buzz loudly. But since by this time I had a clock radio with two alarms in addition to the radio, I didn’t really need the alarm on this clock any longer. The large print display I still had a use for, though.
- More recently, the power cord had gotten twitchy. By which I mean, if you bumped the power cord, the clock would temporarily lose power. And because the battery backup wasn’t working any longer, that meant that basically if you sneezed in the vicinity of the clock, the display would go dark until you jiggled the cord again, and then you had this enormous blinking 12:00 on the screen. Now, I’m not saying the cord was frayed or otherwise showed any sign of the sort of wear that would make it a fire hazard; I think the iffy connection was actually inside the body of the clock, on one side or the other of the rectifier (the part inside most electronic devices that converts the household 110-volt alternating current into the much lower voltage direct current that circuit boards and chips and such use). So this didn’t represent a fire hazard, just an annoyance.
- Cosmetically, the faux-gold coating on some parts of the plastic bezel around the display had been wearing off. The labels on some of the switches and buttons necessary to setting the time had faded to the point of being difficult to read, and there was a half-inch-long crack in one corner of the display.
When I actually type these things up, it seems really ludicrous that I hung onto the clock as long as I did, right? And it is ridiculous. But it’s not that unusual for people to let small annoyances like this build up to a ridiculous point and try to keep muddling along. How many times have you known someone in a relationship which had obviously soured or become awful over time who didn’t notice the thousands of little ways they were walking on eggshells to keep the peace?
Yeah, part of the reason I was more willing than was reasonable to overlook the growing list of problems with this clock is because it had belonged to Ray. And I am a sentimental fool, so of course I don’t want to get rid of something that has any fond memories attached. And yes, the alarm clock did have fond memories associated with it. Not to get too graphic, but it was the only light on in the room the first time we made love, after all. But the other part was the human tendency to make do with something because it seems easier to keep the thing we’re familiar with than to replace it.
As it was, the clock radio, though many years newer than the large print clock, was also beginning to develop some issues, and the alarm clock on Michael’s side of the bed had a crack in the display that made it difficult to read from some angles. And so Michael bought a brand new bedroom clock for the new house within a day or two of the move. And he found a single clock that replaced the functions we had actually been using on the three old ones. The main display shows time, day, date, and the temperature in the room. It has a radio, multiple alarms, alarms you can specify for different days of the week, and it has an adjustable, focusable laser display that projects the time on the ceiling or a wall in very large print so I can read it in the dark (and it doesn’t have to be that dark, just dim in the room) from across the room without my glasses.
It’s a very big improvement, it wasn’t expensive, and one little clock takes up a lot less space than the three old things we had before.
Change doesn’t have to be bad!
In the early 70s U.S. pop culture became obsessed with martial arts. One of the best examples of this was the television series Kung Fu, which ran from 1972–1975. The show, which was wildly popular with both audiences and critics, told the story of Kwai Chang Caine, a half-Chinese, half-white man raised in a Shaolin monastery who winds up in the American Wild West, wandering the countryside seeking his father while evading agents of a Chinese nobleman who wants him dead. The show cast white actor David Carradine in the role (after rejecting Bruce Lee). And it really was wildly popular. In the redneck rural communities I was living in at the time, every one of my classmates would quote favorite lines from the show and make allusions to it in various ways. While the show cast a white actor in the role of the supposedly biracial lead, since every episode relied heavily on flashbacks to incidents in Caine’s childhood, teen, and young adult years back in China, it also provided a lot of acting roles for Asian American actors in recurring and supporting roles. Probably more so than all of American TV before then. Which doesn’t make up for the whitewashing, but was at least a teeny step forward.
That TV show wasn’t the only bit of pop culture affected. Action movies and television series of all kinds started introducing martial arts experts to their story lines, and soon audiences were expecting amazing martial arts fights in all of their entertainment. Even the BBC’s Doctor Who had to bow to the expectation, with the velvet-jacketed Third Doctor suddenly becoming an expert in “Venusian Karate,” though embarrassingly what that meant was the actor occasionally exclaiming a clichéd “Hai-ya!” as he felled opponents with an unconvincing chopping motion of his hand.
And comic books were hardly immune. Suddenly every comic company was adding martial arts experts (some of Asian descent, some not) to their superhero lines. Comic titles such as Master of Kung Fu, Karate Kid (no relation to the 80s movies), Kung Fu Fighter, and Dragon Fists were suddenly popping up in department store comics racks. Alongside characters such as Shang Chi, Richard Dragon, Lady Shiva, and Karate Kid (no “the”) there was Danny Rand, aka Iron Fist: the Living Weapon.
Danny was a classic mighty whitey: a white orphan taken in by mysterious monks in a secret temple in the Himalayas, who masters their semi-mystical martial arts to a degree that far exceeds any of the natives and becomes their greatest warrior. This being an American comic, of course Danny comes to America, specifically New York City, where he tries to reclaim his family fortune (along the way discovering that his parents’ deaths on the journey may not have been an accident). His costume was a bit unusual for male superheroes of the time—ridiculously plunging necklines were usually reserved for women. The excuse for exposing all that skin was the black dragon mark on Danny’s chest. It’s not a tattoo, but rather a symbol that was burned into his flesh during a fight with a dragon, which is an important part of the ritual of becoming the Iron Fist.

When Marvel debuted the character in 1974 I was 14 years old. I didn’t read the very first couple of issues. Back then my source of comic books was the rack at the only drugstore in the small town where we lived, and which comics they got was hit and miss from month to month. But I remember seeing this cover in that rack one day and being instantly fascinated. I bought the comic, and as I frequently did in those days, read it, re-read it, and re-read it again and again. The story was a middle episode of a story arc, so I was a bit confused about some things, but I was still immediately enamored with the character. I kept my eyes peeled for the character from then on, and managed to pick up a few more issues as they came out, but not all of them. It was a constant frustration at the time: not being able to count on the next issue making it to my town.
Because of that inconsistency—where I would pick up, say, issue #85 of Spider-Man, then not find another issue until #89 came out—I spent a lot of time looking for clues in the stories as to what I had missed in the intervening issues, and I would write up my own versions of the adventures my favorite heroes had experienced in between. Very occasionally I tried to draw my own comics, but mostly I wrote them out more as prose stories. This skill of figuring out all the ways a character might go from point A to point Z has been useful in my own writing since.
Eventually, after my parents’ divorce, Mom, my sister, and I moved to a town large enough to have multiple book stores and an actual comic shop, where eventually I managed to purchase at relatively cheap prices many of the back issues I had missed of Iron Fist and several other titles. I was a little disappointed that some of my attempts to fill in the gaps between issues were way off, but I still loved the character. I know now (but didn’t realize back then) that one of the things that appealed to me about the character originally was that chest-baring costume. But Danny Rand’s story also appealed to me because he was an outsider, never quite fitting in anywhere. That was something I really empathized with.
Another thing that appealed to me about Iron Fist the comic (and some of the other kung-fu-ploitation properties) was the inclusion of (often mangled, I know) Zen, Buddhist, and Taoist philosophy. Seeing other traditions underpinning moral and ethical principles, seeing good, brave, and noble characters behaving morally and ethically outside of the fundamentalist Christian framework, helped me reconcile my growing discomfort with the evangelical beliefs I’d been raised with. Yes, it was cultural appropriation, and it was a stripped-down and distorted representation of those other religions, but it wasn’t being done to deride those beliefs. The distortion was because of ignorance and the expediency of meeting writing deadlines, not out of hostility to the cultures themselves. While it was problematic, it still helped me find a way to escape the clutches of a homophobic denomination. And that’s a good thing.
As I said at the beginning of this post, I had had high hopes for the Netflix Iron Fist series. I’d read enough reviews when it first came out to know that the consensus of critics and a lot of fans was that the show was nowhere near as good as some of the other Marvel-Netflix shows. But I still hoped. I still think that the show would have been improved immensely if they had cast an Asian American actor as Danny. It would have been really easy, and I think it would have made the way they chose to tell his story work a bit better. The external conflict of the series is mostly about control of the corporation originally founded by Danny’s father and the father’s best friend. The internal conflict is about Danny trying to figure out his place in the world. If they had made Danny biracial, showing his father in the flashbacks as white and his mother as, let’s say, Chinese American, then that internal conflict would have had more layers. And this story desperately needed something less shallow than a badly thought-out boardroom drama.
It also doesn’t help that the actor they cast as Danny seems about as talented as a block of wood. Seriously, the Adam’s apple of the actor who was cast to play Danny’s childhood friend from the mystical city displays more acting talent and skill in a single scene than the actor playing Danny does in the entire series. Another big problem is pacing. The series spent about 9 episodes setting things up that could have easily been handled in one. The first episode was a pretty okay beginning of the tale, but it wasn’t until about episode 11 that things seemed to pick up. I also can’t figure out why they showed virtually no scenes of the mystical city where Danny gets his training. Let alone never showing us the dragon. I mean, what is the point of telling Iron Fist’s story without showing us all that?
Maybe they’ll do better in season two.
In case you don’t know where the title of this blog post originated, here’s a music video that might explain things:
Carl Douglas – Kung Fu Fighting:
(If embedding doesn’t work, click here.)
Once when I vented about this misconception online, someone replied that his mother kept thinking that his wife’s need to eat gluten-free meant always having a vegetarian option.
My mom isn’t the only person who buys into this myth that diabetics can never ever eat this, that, or the other. I’ve met plenty of diabetics (and a doctor or two) who also buy into it. I’ve seen plenty of people who take it the other way: since they’ve been told they can “never” eat anything they think is good, they just say “screw it” and eat themselves to death.2
So, for instance, most mornings I have a cookie or nice piece of chocolate. We’re supposed to eat a small snack each time we take our insulin, and I have found my blood sugar is most stable if I eat something with less than 10 grams of carbs for that first snack of the day. And there are lots of snack foods which are NOT sugar-free where a single serving falls in that range.
Everyone’s body varies, but I’ve found that as long as my total carbs for the day stay under 150 grams and no single meal has more than about 40 (and I take my meds on time), my blood sugar readings stay in the desired range. That’s true whether those carbs come from things like lentil soup, tangerines, and Icelandic yogurt or chocolate, Dry soda, and beef stroganoff with noodles.
Keeping track of the carbs takes work, I get that. People keep asking me if I have an app that tells me how many carbs are in things and I reply, “Safari!” Yes, the default web browser on my iPhone. I can just type “how many carbs in french fries” and get a useful answer: a serving of this size contains this many carbohydrates, that much fiber, et cetera. I’ve been doing it long enough that I don’t have to look up a bunch of foods, because the numbers and serving sizes have started to stick in my memory. I can eyeball a lot of servings and come up with a good guess.4
I’m just enough of a creature of habit, with a bit of obsessive-compulsive leanings, that once the behavior was established, making a note of how many carbs I’m eating happens almost without thinking. But there is time and effort involved. And let’s be honest, eating healthy isn’t cheap. Our society has gotten really good at serving massive amounts of calories cheaply, in forms that are almost tailor-made to make you fat. Finding healthy alternatives that are easy to keep with you, easy to store, won’t spoil before you can eat them all, and so forth is more expensive than just grabbing the reasonably priced pre-packaged foods, or a cheap (and delicious) meal from a food truck or whatever.
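For anyone curious what that bookkeeping amounts to, the rules I described boil down to a couple of comparisons. Here's a minimal sketch in Python; the limits are my personal numbers from above (roughly 150 grams a day, about 40 per meal), not medical guidance, and the food names and function are just illustrative:

```python
# A rough sketch (not medical advice) of the carb budgeting described
# above. The limits are the author's personal targets: roughly 150 g of
# carbohydrate per day and about 40 g in any single meal.

DAILY_LIMIT = 150  # grams of carbohydrate per day
MEAL_LIMIT = 40    # grams per single meal

def check_day(meals):
    """Return warnings for a day's meals, given (name, carb_grams) pairs."""
    warnings = []
    for name, grams in meals:
        if grams > MEAL_LIMIT:
            warnings.append(f"{name}: {grams} g is over the ~{MEAL_LIMIT} g per-meal target")
    total = sum(grams for _, grams in meals)
    if total > DAILY_LIMIT:
        warnings.append(f"daily total of {total} g is over the {DAILY_LIMIT} g target")
    return warnings

# A hypothetical day: one meal runs over the per-meal target.
day = [("morning snack", 9), ("lunch", 38), ("dinner", 45), ("evening snack", 12)]
for w in check_day(day):
    print(w)
```

Of course, the hard part isn't the arithmetic, it's knowing (or looking up) the carb count for each serving in the first place.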
Another myth I hear a lot is, “You can eat all the fruit you want!” in various forms. I can’t tell you how many times I’ve had a conversation that includes the claim, “But fruit doesn’t have sugar!” and, when it’s pointed out that fruit contains a lot of sugar, “Well, but it’s natural sugar, so it’s good for you!” I’ve had engineers who I know had to pass basic chemistry to get their degrees insist that the naturally occurring sugar in fruit doesn’t elevate your blood sugar reading, because it “isn’t the bad stuff.”
News flash: all sugar is natural. Really. That beautiful white crystalline powder you buy at the store and put in sugar bowls is natural sugar. It’s simply squeezed out of plants, such as sugar cane or beets. The science is very clear that nothing about the process of extracting and purifying it makes it any more or less dangerous than the sugars you ingest if you take a bite of an apple or a banana. At all.5 This is really just the flip side of the myth I opened with: you can never have certain foods, and others are always good for you, no matter what.
So, no, just because it’s “healthy” doesn’t mean I can eat as much as I want.
The biggest adult-onset diabetes myth I keep running into is the notion that being obese causes diabetes. For decades people (and doctors) said that because there was a strong correlation. But try as they might, no medical study could ever establish the causal link. Not only that, as options other than just injecting insulin became available, a lot of people noticed that diabetic patients who had been struggling to lose weight (and failing) started losing weight easily once they were on the correct medicine for them. It turned out there was a reason for both of those facts: for several types of diabetes, it isn’t the obesity that causes the diabetes; it’s the underlying genetic issue that will eventually turn into diabetes that causes the obesity.
There is a relationship. Being obese makes many of the other symptoms of diabetes worse. But a huge number of studies have shown that the old way of treating the disease (basically fat-shaming the pre-diabetics and refusing to start them on medication until their blood sugar levels were so out of control that permanent damage had accrued in the liver, kidneys, and other internal organs) wasn’t helping anyone. Whereas starting patients on medication early, and focusing on diet, exercise, and the results of blood tests rather than worrying about weight, often leads to the patient losing weight, and sometimes brings function back to the pancreas.
While we’re on that subject, there’s another related myth. For a long time, when treating patients who were developing adult-onset diabetes, doctors put off starting the patient on insulin for as long as they could. The reasoning was a combination of the obesity-causation myth and the anecdotal experience of watching adult-onset diabetics’ health decline sharply after starting the insulin. The problem was the waiting, not the insulin.
There’s a number generated from the routine blood tests: your A1C. You don’t need to know exactly what it measures to understand what I’m about to explain. The average healthy person’s A1C will be about 4. If it’s over about 5.7, you’re considered pre-diabetic. If it’s over 6.5, you’re considered diabetic. In the old days, doctors would wait until an adult patient’s A1C was over 12 before starting them on insulin. The problem is that once the A1C is greater than 7, internal organ damage starts happening. So waiting until a patient is consistently at an A1C of 12 means the body is so damaged that the patient is already dying. That created a mass of anecdotal evidence that people were associating with the insulin.
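If you like seeing the thresholds laid out, here’s a tiny sketch in Python using the cutoffs described above. The function name and the exact boundary handling (whether 5.7 on the nose counts as pre-diabetic, for instance) are my own illustrative choices, not medical advice; the numbers are simply the ones in the paragraph.

```python
# Illustrative only: the thresholds below are the rough cutoffs
# described in the text, not a clinical decision tool.
def classify_a1c(a1c: float) -> str:
    """Bucket an A1C reading using the ranges mentioned above."""
    if a1c >= 6.5:
        return "diabetic"
    if a1c >= 5.7:
        return "pre-diabetic"
    return "normal"

# The old practice of waiting until A1C passed 12 before starting
# insulin meant years spent above 7, where organ damage begins.
print(classify_a1c(4.0))   # → normal
print(classify_a1c(6.0))   # → pre-diabetic
print(classify_a1c(12.0))  # → diabetic
```

The point the sketch makes visible: "diabetic" starts at 6.5, but the old treatment trigger of 12 is nearly double that, far past the 7 where damage accrues.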
That’s why the guidance now is to start medications early. Try the various meds that can be taken as pills while a patient is still in the upper end of the pre-diabetic range, but don’t wait so long. Again, lots of studies are available on this.
Anyway, besides just trying to reduce the number of people who argue with us about what we’re eating, I hope this will encourage you to think about how the body works, and about how many of the medical and biological facts you’ve absorbed over your life are actually just widely believed myths. Everyone should have a basic understanding of how human bodies work, in my opinion.
1. Well, except almonds, but that’s an actual allergy I’ve had forever.
2. One reason I asked my doctor for a nutritionist referral when I was diagnosed pre-diabetic 17 years ago was watching my Dad,3 some uncles, and cousins who didn’t take care of their illness lose their eyesight or toes or entire lower legs along with their swiftly declining health. Contrasted with a great-uncle who had watched his diet and took his meds faithfully since his diagnosis at age forty who lived to the age of 99, spry enough to play nine holes of golf with some younger buddies just a few days before he dropped dead.
3. I should mention that Dad wasn’t diagnosed until 13 years after the divorce, so my Mom has never lived with someone with this particular disorder.
4. Usually. Sometimes I realize partway through eating something that it’s sweeter than I expected, and I can stop at half a portion. Other times I don’t realize there were hidden extra carbs in something until later, when I start feeling that high-sugar buzz, then check my blood sugar and confirm that it’s shot up a lot higher than it should have if what I’d eaten a bit ago had been what I thought it was.
5. And all sorts of natural substances are poison, while lots of manufactured substances are life-saving medicines. Stop assuming that natural somehow means magically healing or whatever.