Monday, August 26, 2019

An Internship for Autumn

It seems unlikely that I'll finish my journalism program at UCLA before the baby arrives, but nonetheless, I feel it's important to get as much done as possible while I still can.  It's not unlike the time I ran track in junior high, and face-planted after catching my foot on a hurdle.  Did I get up and keep running?  Yes, of course.  Did I come in dead last?  Absolutely; there was zero chance of recovery at that point.  The more important question is what genius thought I, a five-foot-tall creature sporting the limbs of a corgi, should run hurdles in the first place.

But that story is set in the past and today we'll be looking at the future.  Specifically, the next quarter of classes at UCLA.  The journalism program is designed to take about two years, or eight quarters; I have tasked myself with attempting to finish it in half that time.  Two quarters are down, and the third begins in a month.  The ending of the third quarter unfortunately corresponds almost perfectly with the due date for the baby, jamming a large baby-shaped wrench into the otherwise well-oiled academic schedule I've set.  But sometimes, you just have to handle things as they are thrown at you, particularly if they are babies, because if you don't catch the baby, you'll end up with a lot of people angry at you for dropping it.

 I'm so ready for this baby thing and not at all terrified 
or convinced that I'm going to fuck up hahahahaha.

One of the things that appealed to me about the journalism program at UCLA is that part of it involves an internship.  In my opinion, this lends it some credibility.  It helps students get real-world, applied, hands-on experience, and helps them network and discover potential post-graduation career options while allowing them to develop relationships and accumulate references.

That was how I felt before I learned that taking a quarter-long internship costs $660.

The idea of paying to work was laughable to me, especially since so many internships are unpaid.  The school is not providing any real value here; it's up to the students to seek out and get an internship, and up to the company providing the internship to set expectations and provide work experience.  All the school is doing is simple administrative work, which does not cost $660.


But I had little choice in the matter, so I spent the summer making cold calls to any and every publication I could think of in the area, trying to get my hands on an internship, aware that I was competing with dozens of other UCLA journalism students trapped in the same position as I was.

Happily, I found one.  My trust in "networking" paid off.  I had maintained a friendly relationship with one of my professors from spring quarter and he had put in a good word for me at the Santa Monica Daily Press.  I e-mailed the editor-in-chief multiple times before getting a reply back, and hastened to meet with him and establish seniority over any other potential applicants.

"Whose dick do I have to suck to get an internship?"  
- desperate college students everywhere

In the newspaper industry, internships have long been an entryway to the newsroom.  Many of my professors (who have previously worked at such publications as the New York Times and the Hollywood Reporter) got their start as interns.  I plunged into the Santa Monica Daily Press with the starry-eyed optimism plucky young interns always have in sitcoms.  My impression was mixed.  The building was run-down, the hallways stark, and the newsroom musty with the air of a place that struggles to pay rent and keep up with repairs.  On the other hand, I espied two recycling bins and a dog bowl on the floor, indications of a progressive and chummy atmosphere.  I discovered there were only two full-time staff reporters - reading between the lines, I understood this meant there would be no job for me at the end of the internship.  I was told I would be writing bylines.  On one hand, a fantastic opportunity.  On the other, a curious reveal.  Was print news being kept on life support by a cadre of unpaid interns?
If so, they are in truly dire straits, since interns are universally stereotyped as 
having no clue what they're doing and getting their ties stuck in the copiers.

My internship doesn't begin until next month, leaving me with nothing but the initial impression I got during the interview, which was one of a dying industry.  But it led me to a lot of questions about internships themselves.

One of the major differences between an internship and an apprenticeship is that an internship is for a fixed amount of time, while an apprenticeship is for a fixed amount of skill.  Apprenticeships have been around since time immemorial, with master craftsmen passing down their skill sets to apprentices, but they peaked in the Middle Ages with rules and regulations set by guilds or town councils.  Curiously, apprenticeships were not, like modern internships, typically unpaid.  Apprentices enjoyed food and lodging under their masters, and might also receive a small stipend or portion of profit from their work, if it was good enough.  Apprenticeships typically lasted seven years, almost like an indentured servitude, except with a major payout at the end.  If you, like me, read Grimm's Fairy Tales, then you're aware that most apprentices didn't just graduate automatically into masters, but first became journeymen, who traveled around looking for work and bumping into witches and woodland nymphs who would try to challenge their moral righteousness for some reason.
 
"So long as no one challenges my scruples as I journey through these magic woods, 
Imma be a blacksmith in no time!"

Turns out, "journeymen" weren't called journeymen because of their nomadic lifestyles.  The word derives from the French journée, meaning "day."  They were day-laborers who worked for a wage; they held licenses and could sometimes be admitted to a guild, but they weren't allowed to work as self-employed, established "masters."  Masters were guild members who had created a masterpiece, and often, achieving "master" status in a craft required several years of working as a journeyman first.

I, for one, had no idea that the whole apprentice-journeyman-master thing was so heavily supervised and had so many rules.  Especially since, in the United States, apprenticeships were historically poorly regulated.  (One of the earliest attempts was in the National Industrial Recovery Act of 1933, which was such a massive failure that it was declared unconstitutional within two years of being passed.)

The current regulation for apprenticeships is the Fitzgerald Act, passed in 1937, but it doesn't extend to internships.  "Apprenticeships" have a very narrow modern definition and, as it stands right now, there are only about 750,000 "apprentices" in the entire United States.


Meanwhile, there are over 1.5 million interns - about twice as many.  The word "intern" originated at the beginning of the 20th century.  Before World War I, the term described a doctor who had a medical degree but lacked a license.  During and after the war, doctors-in-training became known as interns, sort of the equivalent of a medieval "journeyman."

Thanks in part to "The Devil Wears Prada," interns are now known 
less for their medical acumen and more for their coffee-fetching abilities.

The modern internship was an outgrowth of co-ops in the '60s and '70s, a way for students to gain work experience and try out jobs while attending school.  But it didn't bloom, not truly, until the 1980s, when business schools and government sectors began using the internship as a recruitment tool, according to Forbes.


The unpaid internship arose thanks to a Supreme Court ruling in 1947 that created a loophole in the Fair Labor Standards Act of 1938 (which established the right to minimum wage).  The loophole was that railway brakemen didn't have to be paid for their week-long training period.  After that, nearly any job could offer an unpaid internship, assuming it met six requirements summarized here by Time magazine:

1. The internship must be similar to training that would be given in an educational environment;
2. The internship must be for the benefit of the intern;
3. The intern does not displace regular employees;
4. The employer derives no immediate advantage from the intern;
5. The intern is not entitled to a job at the end of the internship; and
6. The intern understands that he or she is not entitled to wages. 



Of course, in modern times, dozens of lawsuits have challenged the legality of the unpaid internship, with many interns taking particular aim at requirement #4.  Still, about 60% of internships in the US are unpaid, and more are cropping up every day.  Paid or unpaid, they offer valuable experience; resume audits show that students who held internships are 14% more likely to get a job post-graduation.


For an incredible summary of statistics regarding internships, check out this infographic from UndercoverRecruiter.com:


My own internship is unpaid, but that doesn't mean it might not pan out into something worthwhile.  If nothing else, it fulfills a requirement of my journalism certification, and without the necessity of interacting with any witches or woodland nymphs.

Monday, August 19, 2019

Crying Drunk: A Tale of Frozen Beer

[Author's Note: This article was originally written for a journalism class for summer quarter 2019.  Names have not been changed.  For a list of beers currently available at the D&M, click here.  The freezing problem is resolved and the drinks are now flowing freely... huzzah!  A note of interest: my two favorite beers, Cali Creaming vanilla ale and The Patsy coconut rye stout are both on tap as of the time of posting this.]


Crying Drunk:
The behind-the-scenes woes of a new bar dogged by setbacks will help you appreciate your next drink.


The customer has ordered the most popular beer on tap, the mango hefeweizen, and it’s up to J.T. to pour it. He tilts the glass and pulls on the tap. Foam erupts from the spout. J.T. winces; the spout sputters, spewing foam into the glass, onto J.T.’s shirt, and all over the unfinished wooden bar. He keeps the lever held down, waiting for the foam to settle into what might be a drinkable beer. The only other option is a lost sale, one that the Dragon and Meeple can’t currently afford.

For weeks, this has been the state of beer sales at the Dragon and Meeple, and it shows no signs of abating.

When most think of beer sales, and the setbacks that plague them, their minds turn to the acquisition and maintenance of a liquor license. This is certainly one of several hurdles the D&M bar has faced since it opened in March.

“It’s quite a lengthy process in the city of Los Angeles to get an authorization to use a liquor license, because there are two separate processes. The first process is applying to the state, and once the state approves you for the sale, the city places the conditions for which you can sell it,” explains the owner, Chris Buskirk. Seated at the bar, Chris perches on a high chair that seems too small for his heavy frame. One hand is curled around a hard-won glass of pale amber beer. In the center of the wall behind the bar is a massive metal sheet cut into the silhouette of a dragon, backlit by red. Chris wears a pale, coppery goatee that matches the warm colors of his beer and the light behind the bar; his flyaway hair frames his slightly reddened face. When he greets new customers with a smile that makes his small, deep-set blue eyes crinkle, he gives off the air of Clement Clarke Moore’s “jolly old elf.”

From his lookout, Chris watches J.T. wiping down a glass of beer. Beside it rests a pitcher, filled with foam. For every beer J.T. pours, three more go down the drain.

Liquor sales account for about half of Chris’s profits. The Dragon and Meeple is a gaming bar and liquor seems an indispensable facet of its business. But the bar floated for months without any liquor sales at all; the beer and wine liquor license was only approved by the city of Los Angeles two months after the business opened its doors. During those two months, operational costs outpaced profits. Like all restaurants and bars, the Dragon and Meeple experienced growing pains unbeknownst to its customers, who flock here in the evenings to sample any number of the dozen drinks on tap.


Two months of operational costs for a bar that is unable to sell beer was a devastating loss in revenue, but Chris’s struggle wasn’t over yet.

“Another round?” asks J.T., approaching a table.

“Just one more,” answers the customer, holding out an empty pint.

J.T. maintains a smile until he turns. The wince returns as he trudges toward the uncooperative tap.

Behind us, at long wooden tables, small groups bow over elaborate fantasy maps, talking in low voices and occasionally throwing a handful of 20-sided dice. They are unaware of the drama unfolding behind the bar. Half-finished beers stand guard at their tables, foam clinging to the sides of the glass. Patrons nurse their beers over the course of an hour or two, more interested in their games, but most finish two or three pints a session.

Those pints come at a high cost - for the business.

“Every restaurant has to face [problems] from time to time. But ours was particularly difficult in that one of the parts in the cold room that stores the beer was defective and every time-- occasionally, instead of the thermostat stopping the cooling, it would continue to cool indefinitely and freeze the room that should never get frozen. So it turns a cooler, a refrigerator, into a freezer,” explains Chris. He picks at a tray of shrimp and grits as he watches J.T. retrieve the foam pitcher. The second glass pours a little better than the first, but still, another beer goes down the drain.

After J.T. returns from delivering the beer, he wrings out his shirt and begins wiping down the splattered foam on the counter with the sufferance of someone who has had to do this many times before.

J.T. was present for most of the freezing incidents. There were eight or nine, depending on who you ask. The cold room’s temperature typically sits between 38 and 42 degrees Fahrenheit, but overnight, it would plummet into the teens, freezing the beer in the kegs and preventing its distribution.

“A lot of foamy beers,” explains J.T. as he runs a rag over the countertop. “They were all equally terrible in their pour, which made every beer a hassle."

J.T. speaks about pouring the beer without mentioning the lengths he went to simply to get the lines unfrozen in the first place. A month prior, J.T. arrived at work to find the cold room’s temperature had sunk to 19 degrees Fahrenheit. He propped open the door with a wooden dining chair, revealing a stark interior; the only contents of this vault were the kegs, which squatted on the floor and lined three of the four walls. They trailed flexible plastic tubing, or “lines,” connecting the kegs to the tap on the other side of the wall. When the room froze, these lines hardened into stalagmites. J.T. spent hours scurrying between the store room and a sink behind the bar, bearing a bucket of warm water and a bundle of rags. He swaddled the lines in blue terrycloth, but in the cold the rags instantly froze, and when he attempted to swap them out for fresh ones, they’d crackle like tundra frost breaking under footsteps. Steam ghosted from the rags into the air, and J.T.’s breath came out in puffs as he worked to relax the beer lines enough to dispense even foamy beer.


“We lost a decent amount of beer. It’s lost in the kegs, because every time the beer froze, as it unfroze, the beer in the lines became foamy, and it would take three times as much beer to pour one sellable glass of beer,” explained Chris.

I asked the other employees for their take on the frozen beer. Though none had as great a stake in the business as Chris, everyone had an opinion, and none of them were positive.

The head chef, Rob, summed it up in practical terms: “Not good. Waste of products.”

But the effects of the freezing cooler extended further than that.

One of Buskirk’s managers, A.J. Harris, says that the repeated freezing of the cold room had “an emotional effect” on the business. “ I mean, you can expect things to go wrong, but it's just, it feels like for me personally, that things are going wrong more than normal,” he says, clearly frustrated. Rob is a reticent man and offers a single nod to this comment, which prompts A.J. to continue.

“We get into this idea that things are just gonna go wrong around here, and they're gonna go wrong all the time, and I was in a state where I'm just walking in assuming something has already happened while we weren't in here, you know, because of all the other things that have also gone wrong in this place just in the first, like, month. We had all kinds of problems!” he exclaims, gesticulating wildly. Rob offers another silent nod of agreement.

J.T. sums it up neatly: “The beer froze, and so did my heart. It was difficult; it was not a fun time.”

Entire kegs were lost in the series of freezes that hit the cold room. For this fledgling business, such setbacks were devastating. As a small, independent business owner, Chris is hyper-aware of the statistics against him; according to CNBC, 60 percent of new restaurants shutter within the first year. The Dragon and Meeple is now five months old, but its liquor sales only stabilized last month.

A few weeks after the most recent freezing incident, I ask everyone how much product they think was wasted over the three-month series of failures.

“More than I’m comfortable saying,” replies Chris with a pained smile.

“At least half the cans,” says J.T. “They were pressurized and the cans cracked, so we ended up with a beer lake in the fridge, and a bunch of warped cans.”

“A lot,” says Rob in his usual restrained manner.

“A lot,” echoes A.J.

The Dragon and Meeple has now been open for six months. A delivery of kegs, wheeled in through the front door on dollies, is greeted with enthusiastic cheers by the patrons. The delivery man offers Chris a clipboard, and he signs for the kegs as J.T. rolls them from the bar to the cold room.

The Dragon and Meeple is the last delivery of the day, and the man sits at the bar to order a brown ale. J.T. grabs a glass, tilts it, and pulls on a tap.

The tap, courtesy of the installation company, 21st Draft Systems.

The liquid pours out heavy and uniform.

“If and when we succeed… it'll certainly have been through effort and hard work, because of all these terrible things that have happened,” says A.J. According to him, it’s been three weeks since a defective solenoid in the cold room was replaced; since then, there have been no new freezing incidents. Still, the staff remains wary.

“It seems to be resolved now. Knock on wood,” says Chris, rapping his knuckles on the bar as J.T. forks over the brown ale. Two fingers of foam crown the glass, the perfect amount. The driver drinks it, tips $2, and then bids everyone good-bye.

The shift change is demure and unremarkable. J.T. confirms with A.J. and Chris that he’s finished, and taps an ID number into a computer behind the bar. A time-stamped ticket prints on receipt tape; J.T. takes it, and then grabs a frosted glass from the cooler below the bar. He reaches up to pull down one of the taps and pours himself a sparkling amber ale. Going around to the other side of the bar, he pulls up a stool beside Chris and settles his beer onto a cardboard coaster.

J.T. typically enjoys a beer or two after work. With Chris’s blessing, he doesn’t pay for them. But he, better than anyone, appreciates the cost of a pint.

Monday, August 12, 2019

On Living Forever

 In the constant quest for content ideas, I occasionally turn to my partner, Andrew, who offers ideas that rarely pan out.  Most recently, for example, he came to me, unprompted and excited, with an idea.

"I was just thinking about how people talk about living forever as this, you know, super power, like flying or being invisible.  But how you think of it like a curse.  And how it's been examined a lot in mythology and everything.  You should write an article on that," he said.

"I think that article has been written several times over," I said.

"Yeah, but you could do a fresh take.  You know, like... good," he said.

With Monday upon us and no better ideas, I figured I might as well.  (Whether or not my take is good is to be determined.)

Ask someone how long they want to live, and the answer is telling.  Some people have a number; others just a general, subjective answer.  ("As long as I'm healthy.")  Some people (about one in five) will say "forever."

 Those people are annoying B-characters who we wish would go away.

Since time immemorial, long life has been considered desirable, and it has been a recurring theme in a variety of myths.  In the Bible, the ten "Antediluvian patriarchs" all lived past 900 years old, with Methuselah making it to 969.  (These patriarchs were the ancestors between Adam and Noah.)  The "Philosopher's Stone" of alchemical lore was thought to promise immortality (as well as unlimited gold), and an entire field of study (alchemy) was dedicated to creating it, with the term "Magnum Opus" coined to describe the work that would someday reveal the secrets of its creation.  The search for an "elixir of life" ironically poisoned at least one Chinese emperor, Zhu Houcong of the Ming dynasty, who drank toxic amounts of mercury.


In Persian mythos, the Cup of Jamshid was said to contain the elixir, while in European culture, people sought (and killed, and died for) the "Holy Grail."  Some sought out a more natural spring; Ponce de León crossed the Atlantic, ending up in either Florida or the Bahamas, and many conquistadors followed, searching in various locales around South America in their quest for the famous "Fountain of Youth."  In Greek myth, the gods ate ambrosia to become immortal; the Norse gods, on the other hand, ate golden apples.  In nearly every culture, the concept of living forever existed.

And in modern times, we continue to try to ply old people for their "secret" to long life, with answers ranging from wholesome to delightfully perverse.  (Jeanne Calment, the oldest recorded person, lived 122 years and 164 days, and smoked.  Kane Tanaka is the oldest living person at the time I'm writing this; she is nearing 117 and drinks three cups of coffee, as well as soda, daily.)




But there's a dark side to the "living forever" myths.  Many immortal beings are cursed.  Look no further than the undead.  Vampires, for example, are cursed creatures doomed to live forever.  Hydra and T. dohrnii, the "immortal jellyfish," are real-life animals with biological immortality who may or may not be miserable.

Don't be.
Their mouths and anuses are the same orifice.

Certainly, for every tale of someone being bestowed immortality as a blessing, there's another of it being bestowed as a curse.  One of my favorite examples comes from Jonathan Swift's book Gulliver's Travels, in which he describes the "struldbrugg," a race of humans who live forever but age normally, growing increasingly weak and feeble-minded and turning into nightmarish walking corpses who are utter nuisances to society.  A far more modern example of immortality as a curse comes from the Disney movie Hocus Pocus, in which Thackery Binx is cursed to live as a cat forever.  (Although the cat's death in the movie is supposed to be a happy moment, I cried my eyes out as a kid when it happened.)

So what's the deal with aging, anyway?


No one knows.  Theories range from mutation accumulation to telomere shortening to programmed cell death (or "planned obsolescence") to antagonistic pleiotropy (which more or less states that the energy involved in maintaining life is eventually the very thing that kills you) to Gypsy curses.  There are a lot of ideas, but no one is entirely sure.  If we knew, we would have cured progeria by now.  Progeria is a genetic disease that causes rapid aging, and no one is 100% sure how it works, but it probably holds some clues about how aging works.  And we know that aging is, at least in principle, something that could be cured.  Horrifyingly, the Wikipedia page for immortality states, "as the existence of biologically immortal species demonstrates, there is no thermodynamic necessity for senescence."


Of course, this is all only talking about the physical limitations of living forever.  There are awful mental and social consequences to such a thing, many of which are the reasons people give for not wanting to live forever.  These include having to watch all their loved ones die, seeing the world's population explode, and the terrible irreversibility of a single decision.

For me, the biggest question has always been the mental implications.  Though the brain has ample storage space (millions of gigabytes), that translates practically into only a few hundred years of memories.  And what happens then?  Does your brain simply start overwriting old memories, making everything you experience impermanent and therefore meaningless?  Or does your brain stop making new memories, rendering you stagnant and senile but still physically youthful?  And what about time relativity?  The longer you live, the more time speeds up from your perspective.  Wouldn't a 1,000-year-old being find that all interactions became brief and meaningless, and even a "lifelong" relationship passed in the blink of an eye?

These questions haven't deterred some scientists, who fear death more than they do immortality.  A growing movement of transhumanists have been trying to figure out how to "upload" human consciousness into a storage device.  Some believe "immortality" might be possible by 2050.  This has raised the question of what is "self," whether an immortal being could even be considered a single entity, or whether it would end up in an endless Ship of Theseus cycle, like the immortal jellyfish.  (There's a really excellent New Yorker article on this very topic.)


The subject of "ideal" age goes hand-in-hand with the question of whether or not we "should" live forever.  Some studies have determined that the "best" age is somewhere in the mid-thirties, factoring in both the wisdom that (supposedly) comes with age along with physical decline.  Others have said that the best age is 50, though this discounts those who are going purely by physical prowess or fertility, which peaks in the twenties.  Ultimately there's no correct answer because different people value different things as individuals; if your goal is wealth accumulation, then 50 might be a better age for you than 27, but if your goal is to run a really solid marathon, then 27 is the best place to be.

It's worth noting, though, that no one's ever said that the "ideal age" is higher than 100.


So immortality.  My take?  It's not something I would ever want; I believe death is a very natural part of life and it's not something I trouble myself with.  For me, life has been on an upward trend, getting gradually better as it's experienced.  But that doesn't mean it has to last forever.  How long would I choose to live, if I could?  Maybe 90, maybe even 100.  Not longer.  Life is meant to be lived and experienced, not hoarded.  And among all this talk of longevity, people are missing a very important factor: quality.  Quality of life should be our focus, not quantity.  "Youth" and "health" are, ultimately, means to an end.  That end is the enjoyment of life.  Don't waste your life searching for more of it; love what you have and what you're given, and beware any elixirs that seem too good to be true.  They might just contain mercury.

Monday, August 5, 2019

Gender Reveal Parties Can Get Bent

"Do you want to know the gender?" asked the technician as she moved the wand over the flesh-balloon-mound encasing the baby.

I started to explain to her the difference between gender and sex but instead ended up blurting, "Oh, it's a boy," as the fuzzy black-and-white picture on the screen in front of us scanned over a small baby butt, a pair of legs, and a very obvious penis.  There was no hiding the penis.  The baby is unambiguously anatomically male.

In other words, it's a boy!

What the fuck kind of cisheteronormative nonsense is this.

For weeks now, people have been asking what we wanted.  They seem incredulous when we say there's no preference.  There really isn't.  After all, it's the first kid; there's no pressure on the first kid.  That all gets put on the second kid to "complete the set."  When we found out it was a boy, a surprising number of people said, "Oh, I bet Andrew's happy, huh?"

I don't know where they got the idea that Andrew wanted a boy.  (He was leaning toward girl.)

Before finding out the sex, everyone weighed in.  Everyone had an opinion.  My stupid reiki friend insisted he could somehow psychically tell it was a boy.  (You have a 50/50 chance of guessing correctly, so his accurate prediction was not that impressive to me.)  My previously-pregnant friends had opinions on what signaled it was a boy.  Boys "kick more."  (Compared to fucking what?)  Some people threw well-meaning but ultimately useless literature at me saying things like boys are "more likely" to be born if you eat a high-sugar, high-fat diet.  (This one is true, but the statistical shift is mild.)  (Also, girls are very slightly more likely to be born if you experience severe morning sickness.  But not always.  It's still pretty much a 50/50 crapshoot.)

People's obsession with sex has been a minor annoyance to me because it feels like they're already shoehorning the baby (which is less than a pound and doesn't even have eye pigment yet) into social roles and expectations.

 In case you wanted to know what the baby's face looks like, here ya go.
I made this.

But you know what?  I kind of get it.  And today I'd like to talk about the strange 21st-century obsession with "the gender reveal" and my thoughts on the matter.

Let's start at the beginning.  The year was 2008 and an Angeleno blogger by the name of Jenna Karvunidis had made a somewhat hideous duck cake.
 
Jenna was pregnant.  When she cut into the cake, a baby fell out.

No, wait.  It was pink frosting.  Inside the cake, there was pink frosting, signifying that Jenna's baby was a girl.  Jenna had just unwittingly invented "the gender reveal."

 The baby was born healthy but they had to put the cake down.

It's only been 11 years, but gender reveals have taken off, often with disastrous consequences.  These include a 47,000-acre wildfire, injuries from errant baseballs and fireworks, and one party that involved harassing a live alligator.


Funnily enough, Jenna decided to opine on gender reveals after ten years, stating: "Who cares what gender the baby is?  I did, at the time, because we didn't live in 2019 and didn't know what we know now - that assigning focus on gender at birth leaves out so much of their potential and talents that have nothing to do with what's between their legs."

 But how will the child know what's expected of it if you don't start a fucking wildfire?

The phenomenon has since gotten a liiiittle out of hand, with people coming up with elaborate ways to reveal "gender."  Of course, they mean sex, but no one wants to say the s-word, because this (correctly) implies that we're color-coding infants so that strangers know what their genitals are.


In researching gender reveal parties (i.e., looking for memes) I found a lot of borderline aggressive defenses of pink = girls and blue = boys.  My favorite bullshit explanation was that girls are "naturally" drawn to pink because, as evolutionary "gatherers" who hunted for berries, pink would appeal to them.  (This article does not explain the fact that there are many, many, many poisonous pink and red berries, including holly berries, yew seeds, and Jerusalem cherries.)

The author also seems to have forgotten that "pink for girls" only became the rule after the 1940s.  A hundred years ago, a 1918 article from Earnshaw's Infants' Department said, "The generally accepted rule is pink for the boys, and blue for the girls. The reason is that pink, being a more decided and stronger color, is more suitable for the boy, while blue, which is more delicate and dainty, is prettier for the girl."  Store catalogues that claimed pink was better for boys included Filene’s (Boston), Best & Co. (New York), Halle’s (Cleveland) and Marshall Field (Chicago).


This means in his baby photos, Captain America was probably in pink, since he was born in 1918.
Adorbes.

Ultimately it's stupid to try to ascribe any color to infants because all infants, no matter how you dress them up, are just pooping, crying, sleep-deprivation machines that fail to do anything interesting for the first year they're alive, unless you count having dangerously floppy necks as "doing something interesting."

So what's the big honking deal with gender reveal parties, anyway?

The short answer is, if you don't care about the baby's sex, being able to discover the baby's sex on an ultrasound is a milestone.  The body scan is typically done at 20 weeks, the halfway point of a human's 40-week pregnancy.  At this stage, the baby is distinctively baby-like.  It is no longer a blob but a "person."  It has graduated from embryo to fetus, and the chances of miscarriage are drastically reduced.


For many, the 20-week ultrasound is the first ultrasound, and the first time they are seeing their baby.  The sex of the baby, once unknown, is now known.  It's the first time you're discovering something about your baby.  What's not to celebrate?

If we wanted to get rid of the "gender reveal" party, we'd be hard-pressed to find something to replace it with.  Infant eye color doesn't become permanent until about nine months after birth, and other things, like having ten fingers, tend to be so universal that there's no mystery or anticipation.

But I have an idea and I think it's a good one.

Pro: it doesn't involve ruining lasagna.

Why not have a name reveal party?  Names can be (but are not necessarily) gendered, and week 20 is probably a good time to stop calling your fetus "Gumwad," since it no longer resembles a wad of gum.  You could still do something stupidly extravagant, like writing the name in the sky with gender-neutral and probably-toxic green smoke, or painting the name onto the side of a wildebeest stampede.  I don't know.  I'm just trying to meet you halfway, Pinterest nerds.

The reason people have gender reveal parties is that they want a party to celebrate their child, and the sex of the baby is the first of many variables they learn.  The baby suddenly feels more like a person and less like a parasitic blob of cells.  And unlike baby showers, gender reveal parties let the parents get their party-goers excited by creating "stakes."  (Certainly, all of my friends weighed in on whether they thought it was a boy or a girl, and got somewhat competitive in their predictions.)

Oh... oh no...

I think gender reveal parties are stupid.  But I get it.  People are celebrating their baby.  Knowing something previously unknown gives you a sense of "knowing" the baby, which doesn't initially feel like it's knowable.  Babies start out as abstractions and slowly evolve into actual human beings.  It's a head trip, and having a way to give the baby something concrete (i.e., assigning it a gender) is shorthand for moving the baby from abstraction to reality.

I personally hate the way we pussy-foot around the fact that we're having a whole event that revolves around an infant's genitals, which is why, if I were to have a party, I would call it a "baby's sex organs party," forcing everyone present to ask themselves why the hell they are my friends.

Normal friend: "I brought a cute blue bear for the baby."
Me, thrusting this picture at them: "Look at it."

But let people have their gender reveal parties.  So long as they don't start wildfires, and so long as they understand that their child's interests, hobbies, styles, and tastes may not be informed by their gender at all, and so long as they love their child for the content of their character regardless of their genitalia, then who cares?  Let people eat their disgusting blue lasagna.  The most important thing is for parents to love their kids for the people they grow into.  But at twenty weeks, kids haven't grown into anything yet, so I don't see the harm in finding an excuse to celebrate their existence as it is.  The true test of parenthood will come much later, and so long as parents can keep an open mind about the kind of person their child may (or may not) become, they're hurting no one by having a tacky party.

That being said, I will not color-code my infant if I can help it, nor will I ever try blue lasagna.

Pinterest idea:
have a gender reveal party that's just a 30 minute PowerPoint presentation about how gender is different than sex.

Monday, July 29, 2019

The Far-Reaching Effects of the World's Fairs

You've probably heard that the world isn't fair.  I don't know who said it had to be, but people are obsessed with this idea of "fairness."  Of course, they're less obsessed now than they were in 1939.

1939 was the year of the largest World's Fair on record (by attendance).  World's Fairs were a cultural phenomenon spanning about 150 years, and while we don't think of them much today, we are still affected by the legacies they've left.

The first World's Fair is largely considered to be the 1791 Prague exhibition on industry held by King Leopold II of Bohemia.  (Not to be confused with King Leopold II of Belgium, who reigned in the late 1800s and was better known for being a total fucking monster responsible for the genocide of 10 million people in the Congo.  For more info about what a horrible person King Leopold II of Belgium was, I highly recommend the book King Leopold's Ghost: A Story of Greed, Terror, and Heroism in Colonial Africa by Adam Hochschild.  I came to know about King Leopold II from an infamous picture online of a Congolese man contemplating his child's severed hand and foot, delivered to him as punishment for not meeting a rubber harvest quota.  View at your own risk.)

World's Fairs erupted in popularity, with over 100 fairs or "expos" being held throughout the 1800s.  They reached their heyday in the early 1900s, but tapered off after WWII.  Massive affairs showcasing industry, technology, and architectural prowess, they faded into obscurity, leaving behind monuments taken for granted in the modern era.

The Unisphere was commissioned for the 1964 World's Fair in Flushing Meadows, NY.
You might remember it from Iron Man 2; Stark Expo 2011 was held in Flushing Meadows.

The reason I have been thinking of World's Fairs is that I recently re-read a book that wasn't, thankfully, about King Leopold II of Belgium.  The book is called Devil in the White City: Murder, Magic, and Madness at the Fair That Changed America, by Erik Larson, and much like King Leopold II of Belgium, it's terrible.  This book was part of the curriculum for one of my journalism classes and I can't stress enough how much I loathe it.  (The book, not the class.)  It's a sensationalist piece of garbage that's full of wild speculation and unwarranted medical diagnoses from an author wholly unqualified to give them.  The book focuses on two stories in tandem: the story of H.H. Holmes, considered America's first true "serial killer," and the building of the 1893 World's Fair.  




The parts about the fair were dry but informative.  The parts about Holmes were titillating only in their morbidity.  In fact, for our first assignment, here is an abridged version of what I wrote: 

I would like to begin my analysis with a disclaimer: I had already previously read this book, and loathed it.  As such I have a strong bias against the material.  (On re-read, I continue to loathe it.)

The titular "Devil" referenced in "The Devil in the White City" is H. H. Holmes, the infamous doctor, con artist, and serial killer.

My opinion of the characterization of Holmes (born Mudgett) is that, frankly, the author did a poor job.  As far as physical description goes, if Larson were taking this class, I suspect he would get a lot of feedback reminding him to "show, don't tell."  Larson constantly describes Holmes as "handsome" without stating why (unless you count reminding us that Holmes has blue eyes every three pages or so).  We are treated to virtually no physical descriptions whatsoever; the only emblem we see through part 1 is Holmes's surgical valise on page 12.  Aside from this, we get vapid descriptions such as "he was handsome" or "he was charming," and little to substantiate them.  My favorite parts were mentioned above, wherein we see Holmes doing something instead of simply being told how he was.  On page 35 we get a strange physical description written by Capen that actually seems at odds with Larson's descriptions of handsomeness, in which Holmes is said to have thin lips, very wide eyes, a very thin frame, and tiny ears.  On page 63, it states Holmes is "handsome, warm, and obviously wealthy" without any more info (except, of course, reminding us that his eyes are blue).  Again on page 87 Larson states that Holmes is "handsome and clean and dressed well," without elaboration, unless you count the bit about how his gaze is "blue and forthright."  And on page 91 Holmes is described as "good-looking, almost delicate, who conveyed an air of confidence and prosperity."  The very next line?  "He had striking blue eyes."

Holmes isn't the only character who gets this treatment by Larson.  The other main character, Burnham, is constantly being described in such vague terms as "candid, direct, and exuding an air of leadership" (page 94) without explaining how or why.

Frankly a lot of it seems like conjecture on the author's part.  In the incident with the skeleton, for example, the author spins a completely imagined tale in which a young Holmes gazes coolly upon a skeleton and then back at his bullies, causing them to run away in terror.  That little piece of fiction had me rolling my eyes.  (At least the author, in this portion, restrained himself from reminding us that Holmes's eyes were blue.)

I believe if the author had spent half as much time describing the characters as he did the city, it would have been a much better book.  The city of Chicago is treated to beautiful, fantastic, detailed descriptions of color, smell, sound, and movement.  The characters are little more than talking heads.  At first, I thought the lack of descriptions of Holmes's outfits, movements, or voice was down to the author's unwillingness to speculate.  But then we get the skeleton scene, and it struck me that the author doesn't mind speculation whatsoever.  (We also get speculation about Mudgett killing animals and keeping skulls as trophies, and speculation about the photography incident.)  The author simply doesn't care to describe any of the people aside from occasional flights of fantasy.  His first and only love is the city of Chicago, which gets ample description and proves he can describe when he wants to, making it all the more frustrating that he fails to do so every time he introduces a character.
This book would have better been titled "The Blue-Eyed But Otherwise Nondescript Devil in the White City (Descriptions of Which Account for 60% of the Text)."

It is my belief that this book would never have become a best seller if not for the subject matter.  (Who doesn't love a good serial killer story?)  The sad part is, there were a lot of incredible people who attended the fair who (probably) weren't serial killers, including Thomas Edison, Buffalo Bill, Susan B. Anthony, Jane Addams, Clarence Darrow, George Westinghouse, Archduke Francis Ferdinand, Nikola Tesla, Helen Keller, Alexander Graham Bell, Marshall Field, and others.


The other shame of the book is that, if you can get past the piss-poor writing, the book contains a wealth of incredible information on the World's Fair, the accomplishments of which are really overshadowed by Holmes's gleeful killing of bright-eyed young women.

 There is truly a meme for everything.

My familiarity with the concept of a World's Fair, before reading this book, was severely lacking.  In fact, I could think of only two real things associated with it, and both were fictional portrayals.  One was the Stark Expo in the first Captain America movie, where Steve and Bucky go see a flying car, and the other was a short cartoon from my childhood about the World's Fair that was included on a video cassette called "101 Cartoons."  I hadn't thought about that fanciful little cartoon for years, but it came flooding back to me while I was reading.  The cartoon was made in 1938, and I was struck by how clearly it was influenced by the famous 1893 Fair.


The 1893 Fair came on the heels of the Paris Fair of 1889, and there was a lot riding on its shoulders at the time.  Long before the Space Race of the mid-20th century, countries competed in World's Fairs trying to outshine each other with their exhibits.  As the Olympics was to sports, so these fairs were to industry and architecture.  The 1889 Parisian Fair gave us the Eiffel Tower; the 1893 Chicago Fair gave us the Ferris wheel; the 1962 Seattle Fair gave us the Space Needle.

I had had no idea that these monuments had come out of fairs, but indeed they did.  The 1893 Chicago Fair (officially titled "The World's Columbian Exposition") resulted in over 200 buildings being built and drew a crowd of over 27 million people during its six-month run.  (It's also the reason we have Columbus Day.)  Though all the buildings were originally meant to be temporary, some still stand today; Chicago's Museum of Science and Industry, which is the largest in the Western hemisphere, was originally the Palace of Fine Arts.  To give you some idea of how goddamn big the fair was, here are modern pictures of the Museum of Science and Industry.



 Jackson Park today.

Jackson Park, the site of the fairgrounds, is now a major hub in Chicago, and it was completely revitalized for the fair.  The lagoons were all man-made; the 200+ buildings all went up in less than a year; Chicago's L train system was expanded, and an intramural rail system was built solely for the fair, one of the first to use electric traction motors.  AC electricity was showcased, helping push the eventual adoption of AC over DC, and that came down to money: General Electric's final bid for the rights to illuminate the fair was $554,000, and Westinghouse undercut them with a bid of $399,000.  The world, and Chicago in particular, were never the same after the 1893 fair.

And then there's the Ferris wheel, a staple of pretty much every modern fair.  One of dozens (if not hundreds) of proposals, the wheel was suggested as a centerpiece for the Chicago Fair as an answer to the Eiffel Tower.  Also considered were a 9,000-ft. tower you could toboggan off of, a 1,500-ft. log cabin tower with a restaurant on top, and a 4,000-ft. tower with a bungee-jumping car. I'll let the stupidity speak for itself:

"... R.T.E. envisioned a tower four thousand feet tall from which he proposed to hang a two-thousand-foot cable of "best rubber." Attached at the bottom end of this cable would be a car seating two hundred people. The car and its passengers would be shoved off a platform and fall without restraint to the end of the cable, where the car would snap back upward and continue bouncing until it came to a stop. The engineer urged that as a precaution the ground "be covered with eight feet of feather bedding." (Larson, p. 156-7)

The late 1800s were wild.  Thankfully for the attendees of the fair, the Ferris wheel was chosen over the bungee car proposal.  It turned out to be the right decision; the wheel raked in a gross profit of $726,805 in 1893 money (about $20.5 million today).  After the fair, it was relocated to the 1904 St. Louis World's Fair, where it was later demolished using dynamite, because people back then knew how to fucking party.

Squint.  That's the original wheel at the 1904 St. Louis Fair, in the far distance.

Nowadays you'd be hard-pressed not to find a Ferris Wheel at a fair, World or otherwise.  Like the Midway, the Ferris Wheel became an iconic staple of fairs thanks to its impact at the 1893 Fair.  (The iconic midway of this fair was the first in which "amusement" attractions were separated from more educational "exhibit hall" features.) 

It would be easy to criticize World's Fairs as opulent displays of wealth, their extravagance a waste of time, money, and brilliant minds.  But I would argue the opposite.  Fairs inspire and influence and educate, and are perhaps one of the most fantastic peaceful achievements of our modern era.  The 1893 Fair provided tens of thousands of men with work in the midst of a massive recession or "Panic," as it was then called, and their work was limited to eight-hour days and included healthcare (to prevent them from unionizing, God forbid!).  The 1893 Fair had far-reaching effects on architecture, industry, technology, and labor that we are feeling even today, even if we don't know about it, and I find that a hell of a lot more interesting than some blue-eyed serial killer.

 There was a ghostly beauty to the construction of the fairs that had nothing to do with serial killers.

Why aren't we still having World's Fairs today?  Perhaps because of globalization.  The world is a smaller place, and you can get a lot of it from the internet.  But I don't know... is the internet really a good substitute for the awe-inspiring, inconceivably grand scale of a World's Fair in real life?  I doubt it.  No, I think that the arguments against World's Fairs are unsubstantiated.  In my humblest of opinions, competitive, peacetime displays of industrial leadership and technological dominance are a good thing.  Like the Space Race, they have unintended and far-reaching effects that tend to benefit mankind.  (The Space Race gave us memory foam, freeze-dried food, ear thermometers, and a superior ballpoint pen.)

 Another shot of the 1904 St. Louis fair.

So what gives?  I say we should have another World's Fair here in America.  If we don't have one in my lifetime, I'll really feel that I missed out.  (The last one in America was four years before I was born, in Louisiana.  There was supposed to be one in 1992 in Chicago, but it was cancelled.)  I guess it's true what they say: life really isn't fair.