Monday, September 23, 2019

RIP Winibelle: A Tribute to Rabbits

Here are two fun facts about rabbits.

They actually really do love carrots, and carrots are actually not very good for them.

Carrots have a lot of carbs and about half of those carbs are sugar.  In fact, one big carrot has three grams of sugar, which is a lot, compared to grass, which is what most rabbits eat.

My rabbit, Winibelle, loved carrots.  She wasn't supposed to eat them because she, an organic-fed vegan, was morbidly obese.  This wasn't really Winibelle's fault.  Her mother was a Flemish Giant and her father was a Rhinelander, so she was "destined for chonk," as my husband liked to say.

He often lovingly compared her to Big Chungus.

Winibelle was born September 12th, 2012 and she passed September 21st, 2019, making her seven years old.  This may not sound like a lot (dwarf rabbits can live 10 - 12 years) but for a Flemish/Rhinelander rabbit (life expectancy: 5 - 8 years) this was entirely normal.  We saw it coming; Winibelle had begun to lose some of her ample weight.

 Day 1: Winibelle is on the far right.  Her original name was "Goliath" due to her size. 

 Winibelle years later, as a chonky bunny.

(A curious side note about Rhinelanders: a breed developed in the early 1900s, they quickly became ragingly popular but then fell out of vogue post-WWII, nearly disappearing between the '30s and '70s.  Though the breed re-emerged, it is now considered relatively rare.  Yet another terrible, terrible consequence of war.)

It was, in fact, Winibelle's eating habits that tipped us off to her passing.  I asked Andrew to give her some pellets; he responded that she already had pellets from the morning in her hopper.  "Is she dead?" I asked, only partially joking.  But deep down, we knew.  Winibelle had passed as most "small" animals do: quickly, quietly, and relatively peacefully, curled up in a bed of hay in the bottom of her hutch.
Winibelle on a walk, circa 2013. 

 Winibelle on a carry, circa 2016.

I will not be getting another rabbit.  To be clear, I liked Winibelle.  She was cool, for a rabbit.  But I don't really see the appeal of rabbits as a whole.  They do not emote, and they're not enormous fans of cuddling or being held.  They poop about 300 times a day.  They require a lot of space (a hutch and, if they are as big as Winibelle, room to run around, or, in Winibelle's case, ooze into a motionless puddle).  The reward/cost ratio for a rabbit is not at all comparable to that of a dog or a cat, unless you plan to eat them or make gloves out of their very soft fur.  A rabbit is, in my opinion, not a very practical pet.

Yet rabbits have been domesticated for thousands, or perhaps even tens of thousands, of years.  Genetic data shows that domestic rabbit lines diverged from wild rabbit lines around the same time as those of sheep, cattle, pigs, and cats, making them one of the oldest domestic animals.  Of course, apocryphal tales state that rabbits were domesticated around 600 AD by French monks, but this is likely bullshit.  Stories of rabbits, especially religious ones, abound like so many wild hares.

The rabbit has a surprisingly strong foothold in the mythology of countless cultures, and today I'd like to explore a couple in Winibelle's honor.  The rabbit is easily dismissed as a dumb prey animal, a "common" and somewhat boring mammal best known for ruining vegetable gardens, but rabbits have a surprising depth of character that is echoed throughout human history.

"Hey, pal, my ears are up here." - sexy rabbit statue by Beth Cavener

If there are two themes that seem pervasive, it's these: rabbits are fertile, and rabbits are tricksters.

As far as fertility goes, it's sort of a gimme.  Rabbits are notorious for breeding.  As prey animals, they operate by quantity, not quality; it's no surprise that we have phrases in various languages equating sexual promiscuity or virility with rabbits.  (In English, we have "fuck like rabbits," and in Spanish, "coneja" is impolite slang for a woman with many children.)  In Pagan mythology, Eostre, a fertility goddess, was associated with rabbits.  In Greece, Aphrodite, goddess of love, beauty, and fertility, was associated with rabbits.  In Rome, rabbits were also associated with fertility, and gifting a woman a rabbit or rabbit meat was a way to wish fertility upon her.  Pliny the Elder recommended rabbit meat as a cure for sterility as well as a virility enhancement.  Across the sea, the Aztecs had a pantheon of 400 rabbit gods collectively called the Centzon Tōtōchtin, who reigned over fertility and threw crazy, drunken rabbit parties.  The Norse goddess Freya, associated with femininity and fertility, was served by rabbit attendants.

"Sorry 'bout your infertility.  Have a rabbit." - people in Athens, 480 BC

Curiously, many people also equate rabbits with the moon.  This isn't too surprising if you think about it.  Rabbits are more easily seen at night by the light of the moon, and the moon has traditionally been associated with femininity and fertility.  (This is most likely due to menstrual cycles lining up roughly with lunar cycles, both of which are about a month long.)  In Japan, they see "the rabbit in the moon," aka the Jade Rabbit.  (In America, we see the craters on the full moon as a face, aka "The Man in the Moon.")  Rabbits were tied to the moon in Egyptian myth; Eostre, the aforementioned Anglo-Saxon Pagan goddess, was a moon goddess.  In China, Japan, and Korea, rabbits were associated with the moon and with lunar new years.  Kaltes, a moon goddess of western Siberia, could transform into a rabbit, and she wasn't the only one.  In many myths, people transform by the light of the moon into rabbits or back.  (In Egypt it was thought rabbits switched sex on the full moon, at the risk of getting kicked out of the American military.)

Because so many cultures associated moonlight with feminine mystery, perhaps this is why rabbits were also seen as "tricksters."  This makes less sense to me than the fertility thing.  But to give credit where it's due, rabbits, sporting virtually no real defense against their bountiful predators, must be tricky, in the sense that they're really good at getting away.  (Trickiness is commonly associated with hunted animals: foxes and raccoons, for example.)  Examples of the rabbit as a trickster character include Nanabozho the Great Hare in Algonquin lore, the clever hare of the Panchatantra, a collection of Indian fables, and West African tales of sneaky hares, which would later give rise to the African-American folktales of "Br'er Rabbit," not to be confused with his Cajun cousin, Compair Lapin.  And don't get me started on Bugs Bunny!

 Not to be confused with his chubster cousin, Big Chungus. 

Perhaps because of the whole fertility thing, rabbit heraldry wormed its way into Christianity, too.  The "three hares" symbol has been equated with the Trinity, with rabbits representing Jesus's supposed virgin birth.  (Like the ancient Egyptians, early Christian scholars thought the rabbit was hermaphroditic and could conceive without loss of its virginity.)  Then again, there's evidence that the three hares symbol found carved into churches and drawn into illuminated manuscripts was of Buddhist or Chinese origin, as monuments bearing it along the Silk Road would indicate.  No one really knows.  It's simply too ancient a symbol.  Curiously, the rabbit is not considered a kosher animal to eat, and its use in Celtic Pagan divination made it, early on, something of a boogeyman for Christians.  That was before they discovered the three hares symbol, though, which I imagine was like a "cool S" back in the 1300s.

Considering its lack of emoting and the whole eating-its-own-poop thing, the rabbit as an animal has truly been elevated in nearly every human culture, and it's no wonder that, today, people continue to keep them as pets.

Winibelle was a decent enough pet.  She was not a dog, or a cat.  But she had her charms.  She favored attention over treats, and would usually emerge from the bottom part of her hutch when people passed by to stick her head out and get her face rubbed.  If you ran a finger down her cheekbone, the corresponding ear would tilt hilariously forward.  If you grabbed both ears, you could pretend to be operating a forklift, and she would patiently tolerate this, which is really the most you can ask of any pet who you are pretending is a forklift.

She will be missed, but I won't deny the benefits.  One less pet to take care of before the baby comes is probably a good thing, in the long run, and Winibelle was a ferocious shedder who kicked up a lot of dust and dander in the house.  She passed at a very convenient time, having lived a full life and then graciously making room when it was most needed.

Winibelle playing with a toy, circa 2015.

And here is, perhaps, the craziest rabbit anecdote of them all.  Back in 1931, it was discovered that you could accurately predict whether or not a woman was pregnant by injecting a rabbit with her urine.  A rabbit whose ovaries swelled indicated a positive pregnancy test with about 98% accuracy.  Of course, this required you to kill and dissect the rabbit.  The "rabbit test" was one of countless examples of the noble rabbit quietly and demurely being used in laboratory testing to further human knowledge of mammalian biology, and it was used for decades.  The confused public thought that the rabbit dying from the injection meant a positive result, giving rise to a common euphemism for positive pregnancy tests: "The rabbit died."

RIP, Winibelle.

The ironic timing of your death with my pregnancy was truly your greatest trick.

Monday, September 16, 2019

I Like Clowns

Clowns.  A surprisingly polarizing subject.

If you ask someone about clowns, they tend to have one of two opinions: a neutral one or a very strong anti-clown stance.

Rarely is anyone pro-clown (unless you count the niche of macabre people who specifically like creepy clowns).  But I'm a trend-setter, so I'll go ahead and say it.  I like clowns.  I am decidedly pro-clown.

100% my aesthetic.

I don't know how long I've been pro-clown, though I can definitely say it's been at least ten years, because I remember a project back in high school.  I don't recall the specifics, but it was some senior thing meant to be "meaningful" and help pad the yearbook, and it involved a self-portrait.  I can't draw for shit, so I decided to do a collage of myself instead.  I just printed out a picture of myself and then matched up colors.  Super easy.

I suspect the teachers were as burnt out as the students,
since they were having us do artwork that involved glue sticks.

So, the thing is, I felt like a collage needed a unified theme.  I had collected pictures of clowns on my computer, and they were really colorful with a lot of variety, so: perfect for a collage!

The whole portrait ended up being made of tiny printed clown pictures. I titled it "I'm Not Laughing" because I was trying to be ironic or something.

Turns out the teacher had a rather deep fear of clowns.  Calls to the school counselor were made. The hardest part was trying to explain why I collected pictures of clowns in the first place.  There was no reason. I just like clowns.

 Stories like this bring a tear to my eye.

The fear of clowns is so pervasive that it has a name (coulrophobia) and at least one dedicated website.  I can name a friend with a fear of clowns (one bad enough that she unfollows everyone during the month of October to avoid seeing "scary" clowns), celebrities with a fear of clowns (one of my morning shock jock hosts, and P. Diddy), and fictional characters with a fear of clowns (Chuckie from Rugrats).

So what gives?  Why do people dislike clowns so much?

"Why do you hate fun?" - Weary Willie, probably

For the most part, I think we can blame the creepy clown trope on a handful of bad clowns.  There's the Joker, who hails from the 1940s but has recently gained cultural importance among psychopaths who think he's woke and edgy.  There's the famous "killer clown" John Wayne Gacy, a serial killer who also happened to be a clown.  And there's Stephen King's It clown, Pennywise, a homicidal clown who was probably inspired in no small part by Gacy.

But psychologists say there's also an inherent dislike of clowns: people don't like them because the face paint and exaggerated features make their expressions difficult to interpret, and the always-smiling, happy, cheerful attitude seems disingenuous and therefore untrustworthy.  Another psychologist weighed in and suggested that people find clowns "otherworldly," pointing out that many of their physics-defying feats (getting packed into tight cars, producing seemingly impossibly long scarves) make people uneasy.

According to a poll of 2,000 people conducted by Vox, one-third of Americans are made scared or uneasy by clowns, making clowns more of a sore subject than climate change or terrorism.

This poll might be a little biased, though; it was conducted during the height of the 2016 "clown sighting" phenomenon, in which people were reporting seeing "creepy" clowns wandering about.  The 2016 clown panic was nothing new; in 1981 there was a similar surge of sightings, most of them what you might call hoaxes, though there's nothing really clever about them.  It's not that shocking or weird if you think about it, though.  The 1981 clown panic came on the heels of a major news story involving a scary clown; John Wayne Gacy had just been convicted after a long and gruesome trial.  And in 2016, the movie It was in development and being actively advertised.  Who's to say the clown sightings didn't begin with a push from advertisers?

The sad thing is that clowns don't deserve the guff they get, nor do most professional clowns appreciate the "evil clown" stereotype.  People actually study the "bad clown" phenomenon and have come to the conclusion that "bad clowns" are a small, niche group among clowns.  The detractors of clowns say that, while they may or may not be scary, they're definitely not funny.  But that's not the fault of clowns.  Clowning lost its way sometime in the 1950s, with TV clowns like Bozo and Clarabell, whose main audience was children.  The thing is, kids don't like clowns.  Back when clowning was serious business, clowns were for adults, and their antics were performed in theaters, not at birthday parties.  If you think that's weird, don't.  Clowns are a natural evolution of the court jester, a professional buffoon who dressed silly and entertained adult royalty.  Jesters as we think of them were popular in medieval times, and later in the Renaissance, but actually date back to both ancient Egypt and ancient Rome, where they were called balatrones (singular: balatro).

The modern scary clown might also get some of its bad rap from the historical stereotype of clowns as vagabonds or addicts.  Two of history's most well-known clowns, Grimaldi and Pierrot, were arguably both sad clowns.  Both suffered from alcoholism and terribly abusive childhoods.  Grimaldi died alone and penniless, and his memoirs were later edited by Charles Dickens.  Grimaldi was the first of the colorful, loud clowns that we know today.  But let's not forget Pierrot, the more traditional white-faced, red-lipped, Pennywise-esque French clown, whose most famous performer killed a child who mocked him by smacking him on the head with a walking stick.  Then you've got Charlie Chaplin's clownish "Tramp" figure, and boom.  Suddenly you have a whole generation of people with ill-fitted clothes, red noses, and bindles, and of course no one likes clowns, because suddenly they're sad and potentially dangerous drifters.

 Who do cocaine.

Ironically, professional clowns are actually super wholesome.  Like, Mr.-Rogers-level wholesome.  They have a code of conduct called the "Eight Clown Commandments," and the third one is not to drink, smoke, or do drugs while in costume. 

The worst part is that, as a result of all the recent bad press clowns have been getting, clown numbers are declining.  Yes, that's right; clowns are a dying breed.  According to the World Clown Association, clown membership is down by a third in the last decade, and the average member is over 40, well past a performer's prime.  Part of this can be attributed to the closing of one of the world's biggest clown colleges; Ringling Bros. and Barnum & Bailey Clown College shuttered its doors in 1997.

But perhaps there's more to it than that.  While discussing clowns with my colleagues (I'm very popular at work), the subject of what constitutes a clown came up.  One of my coworkers stated that he doesn't like "traditional" clowns but is fine with Cirque du Soleil clowns.  I responded that those aren't clowns.  (I am a clown traditionalist, which is one of many reasons I'm so popular at work.)  But later, when talking with Andy about clowns, he too brought up Cirque du Soleil.  Apparently they do, in fact, have clowns, and those clowns aren't just tumblers or aerialists or acrobats, as I was led to believe.  They include the traditional white-faced pantomime clowns and the colorful, bumbling slapstick clowns.  (Also, they're paid, on average, a little over $50,000, with full benefits.  Not a bad gig!)  This got me thinking: what even constitutes a clown?

Clowns of yore were pranksters, storytellers, tumblers, magicians, actors, musicians... a performer-of-all-trades, if you will.  Nowadays, when we say "clown," we're usually talking about someone in a red nose.  But this eliminates people who probably fall into the class of clowns, such as comedy magicians.  And without traditional clown signaling (like a red nose or a floppy bow tie), people rarely hold opinions as strong as they claim to regarding their discomfort with clowns.  The same public that claims to loathe clowns so much, for example, loved Tape Face on America's Got Talent, even though he occupies the intersection of two universally hated professions: clown and mime.

So perhaps there are enough clowns, but they're being called by other names: acrobat, magician, comedian, or even simply "performer."  And perhaps this odd rebranding was exactly what they needed all along, to answer for the accidental rebranding that happened over the last century thanks to a handful of bad clowns.

So, to reiterate my stance on clowns: I like them and think they are a delight.  And I think that the issue was never the clowns themselves, but the narrow definition of the word "clown."  A clown can encompass many things, from Auguste to Rodeo to Harlequin, but isn't confined to merely buffoonery or pantomime, and it's doing a disservice to the long-standing history of clowning to suggest clowns only fit into a handful of categories.  Considering how many of them can fit into a single car, it would behoove us to consider more performers as clowns and to work to break the stigma that follows traditional clowns around.  Clowns are fun; they're designed to make us laugh.  The only thing bigger than their shoes is their hearts.  What's not to love?  I am proudly pro-clown, and you should be, too.

Monday, September 9, 2019

The Power of "Huh?"

There are an estimated 7,000+ living languages in the world today.  Of course, half of those are "endangered," meaning that there are fewer than 1,000 people alive who speak them.  Nonetheless, that leaves us with some 3,500 languages, many of which have words that are tricky to translate, creating language barriers between cultures.

But there is one word that universally translates.  And that word is "Huh."

Funnily enough, there are hundreds of words that don't translate into other languages.  Hell, even in English, we have words that don't quite translate well.  For example, some have argued that "fairness," as a concept, does not accurately translate to other languages, and that the word "fair" is distinctively Anglo in origin, rendering it an abstraction in languages of non-Anglo origin.

Another tricky one is "you."  While many people nowadays are arguing about pronouns and whether or not "they" is singular (it is, according to AP Style, Merriam-Webster, and others; historically it has been used by Chaucer, Shakespeare, and Jane Austen, to name a few), no one has stopped to question the pronoun "you," which in English covers both singular and plural with a single word.  In other languages (French, for example), "you" has two forms, singular (or informal) and plural (or formal).  Some Asian languages have six or more forms of "you," depending on the age, sex, and relationship of the person the "you" refers to.

Malcolm Gladwell suggested that this complicated little word, "you," informs cultural interactions and may have, in turn, led to dozens of plane crashes.  Mr. Gladwell might just be on to something.  Other authors have made the suggestion that language influences thought patterns and sociological development; Lera Boroditsky's article in The Wall Street Journal points out how the English language tends to assign blame.  (We say, "Jane broke the lamp," not, "The lamp broke (itself).")  The English language also tends to use more "I" statements and emphasize individuality, whereas Japanese tends to focus more on social groups or abstractions.

To say English or Japanese is difficult to translate into the other, therefore, is not merely about the words themselves, but the use of the words, and the cultural identities and histories associated with the words.  How we use words is as important (if not more important) than the words themselves.

To make things even more complicated, there are "writer- (speaker)-responsible" languages like English, in which the default expectation is that the writer/speaker/transmitter of language is the one responsible for its effective communication, and "reader- (listener)-responsible" languages like Japanese, in which the default expectation is that the reader/listener/receiver of language is the one responsible for understanding.  This is one reason that, if you've ever been in an argument on Facebook, you've probably noticed Americans furiously demanding that you explain or "convince" them (and using their own "misunderstanding" as points against you), whereas you'll find that native Japanese speakers are typically, as a whole, more willing to try to understand what you're saying even if they disagree with it.

But regardless of what type of language you're speaking, "Huh?" always translates.  And you don't have to take my word for it.  The discovery of the universal translatability of "huh" comes from a study published in 2013 by Mark Dingemanse of the Max Planck Institute for Psycholinguistics in the Netherlands, and you can read it yourself here.  The study examined dozens of etymologically unrelated languages and reached two conclusions.  One is that "huh" is a universal word, and the other is that "huh" is, in fact, a word, not merely a noise.

It's arguably the most powerful word in any language.  Monosyllabic and easy to pronounce, it communicates the simplest and most critical failure of language: misunderstanding.  It can be expanded upon, of course.  "Huh?" can be replaced with, "Pardon?" or "Excuse me?" or "I don't understand" or "I didn't hear you" or "Could you repeat that?" or "Say it slower" or "What does that word mean?"  But "Huh?" encompasses all possible scenarios, boiling them down into one fundamental, universal signal.  "Huh?" means that you don't understand what's being communicated, for any reason, and indicates that it's necessary to back up and clarify, whether by speaking louder, using different words, or some other method.  Ultimately, all languages, universally, have the same goal: communication between people.  And every human is hard-wired to learn language to receive, transmit, and exchange information.  It's one of the defining traits of humans; we literally evolved to be capable of processing ultra-complex language (some of which can be used to explain abstract and narrative concepts).

So, naturally, "Huh?" is the world's most translatable, understandable, and perhaps even instinctual word.  And it's one we should all learn to embrace.  As a native English speaker myself, being made aware of the way my language influences my thought processes has been an eye-opening experience.  Trying to understand instead of trying to be understood, for example, makes for better discussions and helps one expand one's boundaries and test one's knowledge.  And as far as learning is concerned, there's no better tool than, "Huh?"  A brief Google search shows that "huh" is used "to express a lack of understanding or to invite agreement, confirmation or further comment."  In other words, even if you're speaking a writer-responsible language, "Huh?" can be used to ensure that the people you're speaking to are clear on what you're saying.  (Try it at home!  "Pretty crazy, huh?")

"Huh," therefore, is used both to gain understanding by expressing a lack of it, and to confirm understanding.  Asking questions is how we learn, and having a universal shorthand for "give me more information" is a remarkable, almost magical tool... not unlike language itself.

For years I've collected "untranslatable" words.  Below is a list of my favorites.  If you want a more comprehensive list, Global Lingo has a list of hundreds of words that don't translate into English.  And while some might leave you going "Huh?", plenty more might resonate with you in a way that makes you feel (dare I say it) understood.

Note: Looking for English words?  Look no further than the Dictionary of Obscure Sorrows, which offers definitions for invented words that describe universal experiences and emotions.

My favorite is sonder: the sudden realization in a crowded place that every person there has as rich and complex a life and history as you do.

Most Relatable, "Untranslatable" Words:
My Top 15 Picks  
(Illustrations by Anjana Iyer)
  1. Arigata-meiwaku (Japanese): An inconvenient favor.  Arigata-meiwaku occurs when someone offers to do something for you, and you don't want them to and actively discourage them from doing so, but they do it anyway, trying to be helpful, and it ends up creating problems or more work for you, as you knew it would, and you still have to grit your teeth and thank them because of social conventions.  The fact that Americans lack this word despite corporate culture is an utter mystery to me.

  2. Culaccino (Italian): The mark or ring left on a table by a moist glass.

  3. Dépaysement (French): The uneasy feeling of being a foreigner in a strange place.
  4. Elefantenrennen (German): Literally, "elephant racing."  It's the thing that happens when you're on the highway, and one semitruck tries to pass another even though they're moving at the same speed, and they end up side-by-side, blocking all vehicles behind them.
  5. Faamiti (Samoan): A verb describing the kissy noise you make by sucking air past the lips in order to gain the attention of a dog, cat, or very young child.
  6. Fernweh (German): Feeling homesick for a place you have never been to.  I first heard this feeling described by Torey Hayden in one of her books while she was describing the wait for a Welsh visa.  She did not use this word.  Discovering that a perfect word existed for Hayden's feelings blew my mind.
  7.  Gigil (Filipino): The urge to pinch, squeeze, or squish something that is unbearably cute.
  8. Iktsuarpok (Inuit): The feeling of anxious anticipation or excitement for a visitor that specifically results in you peeking out the window over and over to see if they have arrived yet.  In America, we simply call this "waiting for the pizza delivery guy."

  9. Ilunga (Bantu / Tshiluba):  Famously the world's hardest-to-translate word, this describes a concept or principle of forgiveness, in which a person readily and willingly forgives an offense the first time, tolerates or forgives with reservations a second time, but will not tolerate it a third time.
  10. Kummerspeck (German): Literally, "grief bacon."  This word can describe either the weight gained after an emotional devastation, or the binge eating that follows a terrible personal tragedy. 
  11. L’appel du vide (French): Literally, "the call of the void."  It's that feeling you get when you're standing on the edge of a precipice and have the sudden awareness, almost urge, to fling yourself off.  A sort of emotional vertigo not indicative of suicidal intent, just the hyper-awareness that you could easily and willingly jump from the building or cliff you're standing on.
  12. L’esprit de l’escalier (French): Literally, "stairway wit."  You know how, after an argument is over, you suddenly come up wth the perfect witty comeback?  That's l'esprit de l'escalier: a witty comeback that you think up after the moment to use it has already passed.  Curiously, this translates perfectly in Yiddish, to "trepverter."
  13. Mamihlapinatapei (Yagan): Famously listed in the Guinness Book of World Records as "the most succinct word," this is a very specific type of glance.  A wordless, meaningful look shared by two people who both want to say or do the same thing, but are both reluctant to initiate.  I experienced this on my wedding day when the minister said, "The couple may now read the vows they wrote for each other," and both my partner and I realized, simultaneously, that we had forgotten to write our vows and were trying to silently signal the other to go first.

  14. Tartle (Scottish): The brief pause or hesitation during an introduction due to not being 100% sure that you actually remember the name of the person you're introducing.
  15. Tsundoku (Japanese): You love books.  You go to a bookstore and buy a book that looks great; you put it on your shelf with your other books.  Weeks later, a friend says, "This book looks interesting.  What's it about?"  You now have to admit to them that you committed tsundoku, the act of buying a book and leaving it unread.  

And of course, an honorable mention should be given to the German word "schadenfreude."  "Schadenfreude" is enjoyment derived from the misfortune of others.  I used it last week to describe the feeling I get when I watch cars outside of the bar get towed away during Rams games.  No longer an "untranslatable," this word is now a borrowed word, having entered the English lexicon after being featured on The Simpsons and getting its own Broadway musical number, demonstrating just how quickly living languages can adopt and adapt words when they find a niche that needs filling.  Language is meant to be understood; don't be afraid to use "huh" to get there.  It's the one word you can always rely on.

Monday, September 2, 2019

Chemotherapy: A Pervasive and Misunderstood Treatment Option

One of the worst diagnoses a person can imagine getting is that dreaded "c" word: cancer.  When people imagine cancer, they imagine a very specific future.  Chemotherapy, hair loss, weight loss, a slow wasting peppered with bouts of nausea.  People's understanding of cancer, and chemotherapy, is largely informed by the cultural zeitgeist.  When we see people in movies with cancer, we see a rather specific interpretation.  Rarely, for example, do we see anyone who is chubby from taking steroids; rather, we always see someone pale and hollow-cheeked.

 There's not really anything funny about cancer so here's a meta-meme.

Though everyone today is familiar with cancer as a concept, few know much about it.  We probably think we know more than we do.  For chemotherapy, this is even more true.  People have a vague notion of what chemotherapy is: it's an IV drug that somehow kills cancer and makes you sick.

Besides that, the internet is rampant with misinformation.  If you've ever been on Facebook for more than two seconds, you've probably seen all sorts of bogus conspiracy theories that revolve madly around cancer treatment, everything from the idea that it can be cured using crystals and sound frequencies (looking at you, Amanda) to the notion that a cure exists and is being hidden from us by doctors who are profiting off of the treatment of cancer.

It was one such Facebook post that inspired me to write this post.

As usual, this post cites no source.  One thing I've always told people on Facebook and other social media is, get the source.  That includes news articles.  If no primary source or study is linked, don't trust what you're reading.  This is an easy way to bullshit-check medical claims, even if you're not a scientist.

As a skeptic, I tend to go searching for the sources of claims such as these myself.  Most of the time, of course, it's total bunk.  But once in a while I stumble onto a half-truth that startles me.  (One time, a yoga-obsessed friend posted a link to an article stating that yoga can be used to treat scoliosis.  It's legit.)

I plunged down a rabbit-hole of chemotherapy research, curious about its true origins, and found that at least one part of the little meme was correct.  Chemotherapy was, in fact, developed from mustard gas.

Today's post is a brief overview of the history of chemotherapy, a thing most people only learn about in earnest once they have to deal with it on their own.  Hopefully, reader, you and I won't have to confront that from a personal standpoint for a long while yet, if at all. But nonetheless, it's not a bad thing to have a broad overview of the facts while emotionally distanced from them.  I will not be speaking much about cancer itself, because that's too broad of a topic.

However, I do want to explain at least one exceedingly common misconception about cancer.

The windmill thing.

That misconception is that it is a single disease.  People talk about the "cure for cancer" the way that they talk about the "cure" for the common cold.  But like the common cold, cancer isn't so much a single disease as it is a family of diseases.  Cancer comes from abnormal or uncontrolled cell growth or propagation, but it has a broad range of causes and manifestations.  Conspiracy theorists who claim that scientists secretly have the cure for cancer are partially right.  There are forms of cancer we have effectively cured.  Localized skin melanoma, for example, has a 99% survival rate.  And Hodgkin's lymphoma has an 86% survival rate five years after diagnosis.

And it's with Hodgkin's lymphoma that we will begin, because this cancer, perhaps more than any other, is the one whose patients have benefited the most from chemotherapy.

First described in 1832 by its namesake, Thomas Hodgkin, Hodgkin's lymphoma is a cancer that affects the white blood cells known as lymphocytes.  It's often detected when people notice swelling in the lymph nodes, which occurs due to over-production of these cells.  Unfortunately, it can also be easily missed, because your lymph nodes also swell during most minor infections, when your body produces extra lymphocytes to ward off bacteria.

 I am so sorry.

In 1917, mustard gas saw its first effective use on the battlefield.  Mustard gas (sulfur mustard) was weaponized under chemist Fritz Haber, inventor of the Haber process and a strong proponent of chemical warfare.  Ironically carcinogenic, mustard gas in its purest form is both colorless and odorless.  But the type deployed on WWI battlefields was low-grade: yellowish-brown and reeking of peppery horseradish.  Soldiers were not inclined to worry much about its carcinogenic properties, as contact with the gas caused immediate skin blistering, blindness, and, in large amounts, suffocation.  It was a terrible way to die, and the use of mustard gas in warfare was banned by the Geneva Protocol in 1925.  Hitler was famously against the use of chemical weapons on the battlefield after being gassed himself in 1918.  (This dislike of chemical warfare did not extend to the Jews and political prisoners he gassed in the camps using Zyklon B, a pesticide developed at the institute of, you guessed it, Fritz Haber, who was himself of Jewish descent and some of whose relatives were killed by his own invention.)

Soldiers who survived exposure to mustard gas suffered ill effects, including anemia and low white blood cell counts.  Clearly, the gas suppressed the development of some of the components of the blood.

Fast-forward to 1942.  Sulfur mustard wasn't being used in WWII, but two scientists from Yale, Louis Goodman and Alfred Gilman, were nonetheless interested in its effects.  They had ample data from the WWI soldiers who had been hit, and the idea that mustard gas suppressed white blood cell propagation intrigued them.  After injecting a ton of animals bearing lymph tumors with nitrogen mustard compounds, they observed that the tumors stopped growing and even shrank.

Armed with the no-fucks-given attitude of medical testing in the '40s that gave us Captain America, they found themselves a human test subject, listed in the records as "J.D."  J.D. was suffering from massive tumors in his mouth, armpits, and groin; his prognosis was bad.  With no other option, in severe pain and desperate for relief, he agreed to be injected with "substance X."  Two months later, his tumors were receding and he reported feeling a lot better.

I wish I could give J.D. a happy ending but, alas, this earliest form of chemotherapy failed to target only the lymphocytes.  It wiped out J.D.'s bone marrow and platelets, and he ended up dying from internal bleeding as a result.  By some accounts, he "passed peacefully."  You can be the judge of whether or not you believe that to be true.

Who the fuck knows how the human body works, really.

Regardless, Goodman and Gilman published an article with their results in 1946.  On its heels followed Alexander Haddow's 1948 paper in Nature, which determined which parts of the nitrogen mustard molecule were actually needed and how they worked.  (Spoiler: it's the chlorine atoms.)  Later papers expanded on this, determining causality: the nitrogen mustard worked by binding to DNA within the cell and prompting it to "commit suicide."  Instead of multiplying endlessly, cancer cells were being triggered to self-destruct by the mutagenic properties of "mustine," the first chemical used specifically as chemotherapy to treat cancer.

The struggle of most scientists.
Haddow, happily, got to see the fruits of his labor in his own time.

Following Alexander Haddow's paper, researchers began synthesizing new molecules that were structurally similar to mustine but might better cure patients without killing them.  They came up with chlorambucil, put into use in the 1970s and used today to treat several types of white blood cell cancer, including leukemia and non-Hodgkin's lymphoma.  Since its introduction in the 1970s, survival from non-Hodgkin's lymphoma has tripled.

Haddow's paper describing the mechanism by which mustard gas actually worked meant that people were no longer being treated with mustine, but with other chemicals, ones with fewer side effects.  These chemicals were called "alkylating agents" because they worked by alkylating organic molecules such as DNA (rendering them useless and unable to replicate, which in turn triggers programmed cell death, or apoptosis).  Alkylating agents include chemicals not derived from nitrogen mustard at all, such as tetrazines.

One thing worth noting here is that chemotherapy targets living, multiplying cells.  In particular, modern chemotherapy tries to target cells that are actively or rapidly dividing.  This includes not only cancer cells, but also hair follicles, fingernails, and skin, which contain some of the fastest-dividing cells in the body.  This is why many chemo patients lose their hair and why their skin looks unhealthy.

Chemotherapy isn't like antibiotics, which can target "other" or "non-self" cells.  Cancer is, by definition, your own cells gone bonkers.  Although modern chemotherapy can target specific kinds of cells (for example, ones that are dividing rapidly), it's impossible with our current technology to separate cancerous from non-cancerous cells.  Cancer isn't a viral or bacterial infection; it's you.

Chemotherapy is often demonized as "poison."  This is correct.  It is literally an attempt to poison the bad cells in you, and regrettably, there is no way to cleanly separate "good" from "bad" cells, because all of them are yours and contain your DNA.  Unfortunately, people who are suspicious of or don't understand chemistry tend to point to the vicious side effects of chemo as "evidence" that it's bad for you.  Yes, of course it's bad for you.  It's also bad for the cancer, which is also you.

Haddow wasn't the only one who had stumbled across something remarkable.  In Boston, Sidney Farber was toying around with aminopterin, a chemical that blocked DNA replication.  This was the precursor to methotrexate, a modern chemotherapy drug.  Using aminopterin, Farber got ten children with leukemia into remission, earning him the title of "Father of Modern Chemotherapy."

The first "cures for cancer" had been found.  (The first cure for a metastatic cancer is widely said to be via the use of methotrexate, in 1956.)

In the 1950s, pharmaceutical giant Eli Lilly announced that plant alkaloids would be used as cancer drugs.  These weren't chemicals synthesized in a lab; vinca alkaloids (like vinblastine and vincristine) were extracted from Vinca rosea.  Though researchers drew on the cancer-drug research that originally came out of the mustard gas era, these new compounds were naturally occurring.

Note that nearly all chemicals occur in nature 
and that "natural" does not inherently mean "safe" nor "effective."

The 1950s marked the beginning of "combination chemotherapy," in which multiple drugs would be used to try to target cancer from multiple angles while minimizing the effects on healthy cells.  In some cases, chemo was used to get rid of remaining cancer cells after the surgical removal of a tumor.  This is called adjuvant chemo, and it dramatically increased the remission rates of localized cancers like breast cancer and testicular cancer by reducing the chance that the cancer would return after the removal of the tumor.

Prior to the 1940s, cancer wasn't a big concern in the minds of most Americans.  Half of all mortalities were from infectious diseases.

You can look up the statistics yourself but I liked this graph in particular.  
Less than 4% of people died of cancer in 1900.  Less than 4%!  
By the 1940s, that figure would leap to over 10%.

It was only with the rise of antibiotics that cancer came to the fore; people were living long enough to actually develop it, instead of succumbing to something else first.  In the 1940s and 1950s, a series of PSAs was put out declaring "war on cancer."


Many of these PSAs had to explain what the hell cancer was.  It was simply not part of the public knowledge back then.  (I have some vague memory of a cartoon by Disney or Looney Tunes explaining cancer using old timey speech, but could not find it and wonder now if I dreamed it up.)

Many of the things we take for granted now (like germ theory and antibiotics) are only recent medical discoveries.  Yet people have a tendency to pick and choose which types of medicines they deem "safe."  Chemotherapy is a pretty easy one to demonize, not only because of its history but also because of its nasty side effects.

But it works.

To those who think that chemotherapy is merely a conspiracy by evil doctors who are profit-driven, I have some bad news (that is, in fact, great news).  The passage of the National Cancer Act of 1971 earmarked 85% of money spent in the "war against cancer" toward investigator-initiated research projects.  In other words, scientists, not doctors.  A sizeable portion of the remaining money was put into adjuvant chemo programs.  "But why chemo?" you ask.  Well, for one thing, chemo works.  Is it horrible?  Oh, yes.  But it works.  Alternative therapies don't; a massive study of cancer patients who refused recommended treatment in favor of alternative therapies found that, universally, alternative therapies don't work in treating cancer.  Not in any form of cancer.  Not a one.

On the other hand, Hodgkin's lymphoma now has a remission rate of roughly two-thirds.  About 60% of those who undergo chemotherapy achieve complete remission without ever relapsing; their cancer is cured, and they get to die from something else (if they're very unlucky, another form of cancer).  If you click on no other link in this post, please do click on this one, which links to a very well-written article on the history of chemotherapy and addresses many of the fringe anti-scientific criticisms of it.

Chemotherapy has a long and storied history, but this doesn't mean it doesn't work or that it's part of some vast network of doctors conspiring to profit.  There are plenty of instances in history in which we stumbled across something wonderful while actively doing something horrible (or, more often, bone-headed).  The question isn't really how we discovered it, but how we can apply it now, and what good it can do us.  More importantly, we need to understand that the origin of a thing isn't the same as the thing itself.  Modern chemotherapy isn't the same as getting hit with mustard gas in WWI.  For many, it's their best shot at survival.  And demonizing it to push a pro-hemp agenda is staggeringly irresponsible.  Chemotherapy is saving lives.  Does a better treatment option exist?  Maybe.  But we haven't yet discovered it.  And sometimes, as chemotherapy has shown us, life-saving discoveries can come from the unlikeliest of places.

Monday, August 26, 2019

An Internship for Autumn

It seems unlikely that I'll finish my journalism program at UCLA before the baby arrives, but nonetheless, I feel it's important to get as much done as possible while I still can.  It's not unlike the time I ran track in junior high, and face-planted after catching my foot on a hurdle.  Did I get up and keep running?  Yes, of course.  Did I come in dead last?  Absolutely; there was zero chance of recovery at that point.  The more important question is what genius thought I, a five-foot-tall creature sporting the limbs of a corgi, should run hurdles in the first place.

But that story is set in the past and today we'll be looking at the future.  Specifically, the next quarter of classes at UCLA.  The journalism program is designed to take about two years, or eight quarters; I have tasked myself with attempting to finish it in half that time.  Two quarters are down, and the third begins in a month.  The ending of the third quarter unfortunately corresponds almost perfectly with the due date for the baby, jamming a large baby-shaped wrench into the otherwise well-oiled academic schedule I've set.  But sometimes, you just have to handle things as they are thrown at you, particularly if they are babies, because if you don't catch the baby, you'll end up with a lot of people angry at you for dropping it.

 I'm so ready for this baby thing and not at all terrified 
or convinced that I'm going to fuck up hahahahaha.

One of the things that appealed to me about the journalism program at UCLA is that part of it involves an internship.  In my opinion, this lends it some credibility.  It helps students get real-world, applied, hands-on experience, and helps them network and discover potential post-graduation career options while allowing them to develop relationships and accumulate references.

That was how I felt, at least, before I discovered that taking the quarter-long internship course costs $660.

The idea of paying to work was laughable to me, especially since so many internships are unpaid.  The school is not providing any real value here; it's up to the students to seek out and get an internship, and up to the company providing the internship to set expectations and provide work experience.  All the school is doing is simple administrative work, which does not cost $660.

But I had little choice in the matter, so I spent the summer making cold calls to any and every publication company I could think of in the area, trying to get my hands on an internship, aware that I was competing with dozens of other UCLA journalism students trapped in the same position as I.

Happily, I found one.  My trust in "networking" paid off.  I had maintained a friendly relationship with one of my professors from spring quarter, and he had put in a good word for me at the Santa Monica Daily Press.  I e-mailed the editor-in-chief multiple times before getting a reply, and hastened to meet with him and get ahead of any other potential applicants.

"Whose dick do I have to suck to get an internship?"  
- desperate college students everywhere

In the newspaper industry, internships have long been an entryway to the newsroom.  Many of my professors (who have previously worked at such publications as the New York Times and Hollywood Reporter) got their start as interns.  I plunged into the Santa Monica Daily Press with the starry-eyed optimism plucky young interns always have in sitcoms.  My impression was mixed.  The building was run-down, the hallways stark, and the newsroom musty with the air of a place that struggles to pay rent and keep up with repairs.  On the other hand, I espied two recycling bins and a dog bowl on the floor, indications of a progressive and chummy atmosphere.  I discovered there were only two full-time staff reporters; reading between the lines, I understood this to mean there would be no job for me at the end of the internship.  I was told I would be writing bylined stories.  On one hand, a fantastic opportunity.  On the other, a curious reveal.  Was newsprint being kept on life support by a cadre of unpaid interns?
If so, they are in truly dire straits, since interns are universally stereotyped as 
having no clue what they're doing and getting their ties stuck in the copiers.

My internship doesn't begin until next month, leaving me with nothing but the initial impression I got during the interview, which was one of a dying industry.  But it led me to a lot of questions about internships themselves.

One of the major differences between an internship and an apprenticeship is that an internship lasts a fixed amount of time, while an apprenticeship lasts until a fixed amount of skill is attained.  Apprenticeships have been around since time immemorial, with master craftsmen passing down their skill sets to apprentices, but they peaked in the Middle Ages with rules and regulations set by guilds or town councils.  Curiously, apprenticeships were not, like modern internships, typically unpaid.  Apprentices enjoyed food and lodging under their masters, and might also receive a small stipend or portion of profit from their work, if it was good enough.  Apprenticeships typically lasted seven years, almost like indentured servitude, except with a major payout at the end.  If you, like me, read Grimm's Fairy Tales, then you're aware that most apprentices didn't just graduate automatically into masters, but first became journeymen, who traveled around looking for work and bumping into witches and woodland nymphs who would try to challenge their moral righteousness for some reason.
"So long as no one challenges my scruples as I journey through these magic woods, 
Imma be a blacksmith in no time!"

Turns out, "journeymen" weren't called journeymen because of their nomadic lifestyles.  The word derives from the French journée, meaning "day."  They were day-laborers who worked for a wage; they held licenses and could sometimes be admitted to a guild, but weren't allowed to work as self-employed, established "masters."  Masters were guild members who had created a masterpiece, and achieving "master" status in a craft often required several years of working as a journeyman first.

I, for one, had no idea that the whole apprentice-journeyman-master system was so heavily supervised and had so many rules.  Especially since, until relatively recently, American apprenticeships were poorly regulated.  (One of the earliest attempts at regulation was the National Industrial Recovery Act of 1933, which was such a massive failure that it was declared unconstitutional within two years of being passed.)

The current regulation for apprenticeships is the Fitzgerald Act, passed in 1937, but it doesn't extend to internships.  "Apprenticeships" have a very narrow modern definition and, as it stands right now, there are only about 750,000 "apprentices" in the entire United States.

Meanwhile, there are over 1.5 million interns - about twice as many.  The word "intern" originated at the beginning of the 20th century.  Before World War I, the term described a doctor who had a medical degree but lacked a license.  During and after the war, doctors-in-training became known as interns, sort of the equivalent of a medieval "journeyman."

Thanks in part to "The Devil Wears Prada," interns are now known 
less for their medical acumen and more for their coffee-fetching abilities.

The modern internship was an outgrowth of co-ops in the '60s and '70s, a way for students to gain work experience and try out jobs while attending school.  But it didn't bloom, not truly, until the 1980s, when business schools and government sectors began using the internship as a recruitment tool, according to Forbes.

The unpaid internship arose thanks to a Supreme Court ruling in 1947 that created a loophole in the Fair Labor Standards Act of 1938 (which established the right to a minimum wage).  The loophole was that railway brakemen didn't have to be paid for their week-long training period.  After that, nearly any job could offer an unpaid internship, provided it met six requirements, summarized here by Time magazine:

1. The internship must be similar to training that would be given in an educational environment;
2. The internship must be for the benefit of the intern;
3. The intern does not displace regular employees;
4. The employer derives no immediate advantage from the intern;
5. The intern is not entitled to a job at the end of the internship; and
6. The intern understands that he or she is not entitled to wages. 

Of course, in modern times, dozens of lawsuits have challenged the legality of the unpaid internship, with many interns taking aim at requirement #4 in particular.  Still, about 60% of internships in the US are unpaid, and more crop up every day.  Paid or unpaid, they offer valuable experience; resume audits show that students who held internships are 14% more likely to get a job post-graduation.

For an incredible summary of statistics regarding internships, check out this infographic from

As always, click for full size, or follow the link above to see it in its original format.

My own internship is unpaid, but that doesn't mean it might not pan out into something worthwhile.  If nothing else, it fulfills a requirement of my journalism certification, and without the necessity of interacting with any witches or woodland nymphs.