
I’m an Old Millennial

This fantastic piece (copied below) by Jesse Singal of New York Magazine is of the same mind as my 2014 rageblog I am NOT a Millennial! Hope you enjoy.

 

There’s a sensation you get when you hear the name of a group you’re a member of. If someone says “Bostonian” or “liberal” or (sorry) “Patriots fan,” my brain perks up a little. Oh, they’re talking about me. Over the last few years, though, I’ve found I’m getting less and less of that ping from the term millennial.

Technically speaking, I’m definitely a millennial. I was born in 1983, which means I’m part of the generation, whether one uses the Census Bureau’s definition (born 1982–2000) or Pew’s (about 1981–1997). But the more I hear about millennials, the less I recognize myself. And I’m not alone on this front: In 2015, for example, Juliet Lapidos — born the same year I was — may have put it best in a column for the New York Times headlined “Wait, What, I’m a Millennial?” “I don’t identify with the kids that Time magazine described as technology-addled narcissists, the Justin Bieber fans who ‘boomerang’ back home instead of growing up,” she writes. And I’ve had plenty of conversations with other people my age who feel the same way. Many, many people who are in their late 20s and early 30s simply don’t feel like they are a part of the endlessly dissected millennial generation.

As it turns out, there are good reasons for this. Old Millennials, as I’ll call them, who were born around 1988 or earlier (meaning they’re 29 and older today), really have lived substantively different lives than Young Millennials, who were born around 1989 or later, as a result of two epochal events that occurred around the time when members of the older group were mostly young adults and when members of the younger were mostly early adolescents: the financial crisis and smartphones’ profound takeover of society. And according to Jean Twenge, a social psychologist at San Diego State University and the author of Generation Me: Why Today’s Young Americans Are More Confident, Assertive, Entitled and More Miserable Than Ever Before, there’s some early, emerging evidence that, in certain ways, these two groups act like different, self-contained generations. (“Early” because there’s still a fair amount we don’t know about the youngest Young Millennials given how, well, young they are.)

Let’s start with technology. Millennials, we hear over and over again, are absolutely obsessed with social media, and live their entire social lives through their smartphones. I tweet too much, sure, but I’ve never blasted a ’gram (did I say that right?); even thinking about learning how to Snapchat makes me want to take a long, peaceful nap; and I still feel bad whenever I haven’t heard a distant friend’s voice on the phone for a while. I miss out on nothing, in terms of real-world socializing, by sticking to Facebook and texting. I still prefer to read things — particularly long things — on paper. And again, almost all my friends (there are a few social-media-obsessed exceptions) feel similarly. On this front, we are decidedly different from Young Millennials, and to the extent the social-media-obsession stereotype is accurate, it simply doesn’t apply to us in the same way.

Then there’s the more substantive issue of how millennials (supposedly) live and structure their lives, and how they relate to the prevailing economic tides. Millennials are way less likely to follow “traditional” trajectories with regard to careers and marriage, both anecdotes and some data suggest. They often flit from job to job without staying in one place too long — they’re “The Job-Hopping Generation,” says Gallup — and are much more likely, relative to previous generations when they were in their 20s, to live at home and to put off family formation for a long time. (It should be said that there’s some controversy here — just last week Pew released some numbers suggesting millennials aren’t any job-hoppier than Generation X was at the same age.)

Again, this just doesn’t resonate, either for me or for most of my friends who are my age. We’re so normal! Yes, some of us have been hit harder than others by bad career luck or missteps, or by the massive national catastrophe of student debt, but for the most part we’ve had very “traditional” career paths. Now in our 30s, those of us who have had the most successful career trajectories are taking on many of the same young management roles that similarly privileged, middle-class boomers and Gen-Xers did when they reached those ages. I’m not married, but I’d say that more than half of my good friends are. Everyone’s having kids; those who can afford it are buying houses. It’s just bizarre to hear countless accounts of the unique nature of this generation — my generation, supposedly — and to then log onto Facebook and see so many people settling into exactly the lives expected of people in their 30s. Nothing about our collective experiences as adolescents and young(ish) adults, overall, feels that different from the stories we’ve heard about how members of past generations grew up and carved out their personal and professional niches. (I’ve already used the term privileged in this paragraph, but it’s worth pointing out that privilege colors this entire discussion: Suffice it to say there are plenty of economically disadvantaged people who never have a fair shot at a good, remunerative career of any sort. In terms of my own life and the lives of my friends/colleagues, I can only speak to one, mostly middle-class slice of the millennial experience.)

To be sure, the dissociation I’m feeling from my own generation is partly an inevitable artifact of the artificial way we construct generations in the first place. Generations are usually defined as anyone who was born within a span of about 18 years or so, and a lot happens in 18 years. The baby-boomers, for example, consist of those who were born from 1946 to 1964, or thereabouts — their oldest members were born not long after America’s world-historical triumph in World War II, while their youngest grew up during the 1960s, a period of crescendoing turmoil in American civic and political life. The youngest and oldest boomers grew up in very different worlds.

But this time around might be different. When I emailed Twenge to ask about the possibility of meaningful differences between older and younger millennials, she quickly highlighted those two events: the financial collapse of 2008 and the rise of smartphones around that same time (the iPhone was introduced in 2007). Their impact can’t be overstated, and because of precisely when they hit, it really might be the case that in 2017 a 33-year-old is more different from a 23-year-old than at any other point in recent history. (That could explain why Twenge is working on a book about those born in the 1990s, and how they’re “vastly different from their Millennial predecessors,” as the publicity language puts it.)

Take the financial crash. Many Old Millennials were either already in the workforce by then, or close enough to entering it that we were able to “sneak in” before the crisis had fully unfurled itself. Which means we were raised and educated during a period in which we were promised that if we followed the rules in certain ways, there would be gainful employment waiting for us in our early or mid-20s — which there often was. The same definitely cannot be said of Young Millennials. The crisis permanently rejiggered the world for them. They grew up, like us Old Millennials, assuming that things would more or less work out if they followed the rules laid out by adults, only to have the rug pulled out from under them entirely during a very formative period in their lives.

This is a big deal, to have your expectations about your life so violently reoriented as a teenager or young adult. And while plenty of older millennials were affected, too — especially as the ramifications of the crisis rippled outward — the crisis really did hit Young Millennials in a different way. “Early millennials grew up in an optimistic time and were then hit by the recession, whereas late millennials had their worldview made more realistic by experiencing the recession while in their formative years,” explained Twenge. According to Twenge, this has led to certain differences between older and younger millennials that manifest in the data. For example, she’s found some evidence from survey data that younger millennials “are more practical — they are more attracted to industries with steady work and are more likely to say they are willing to work overtime” than older ones. We Old Millennials could afford to develop views on work and work-life balance that were a bit more idealistic.

Then there are smartphones and social media, which hit the two halves of the generation in massively different ways. “Unlike [Young Millennials],” wrote Lapidos, “I am not a true digital native. The Internet wasn’t a fact of nature. I had to learn what it was and how to use it. I wrote letters home when I was at summer camp. I didn’t have a mobile phone until I was 19.” For us Old Millennials, the social aspects of our middle- and high-school years were lived mostly offline. Sure, AOL Instant Messenger was a pretty big deal when it first caught on, but most of us didn’t even have cell phones until college, and smartphones until after. Think about all the stuff you go through between the ages of 12 and 22 in terms of your development as a person. Now think about how many of those experiences are affected by the presence or absence of a cell phone and social media.

According to Twenge, there’s a bit less hard data on how smartphones drove an intragenerational wedge than there is on the subject of the Great Recession — she’s working on this question, but doesn’t yet have a hard answer. But it would be shocking if this technological revolution didn’t carve out some important differences between Old and Young Millennials. While there are certainly plenty of overhyped, underscientific opinions about how social media affects people, there’s little question that it has some effect (there is some evidence that extensive Facebook usage is correlated with unhappiness, for example, including some fairly meaty, recently published research). Twenge said that she thinks the fact that younger millennials spend so much more time on social media might be able to explain, for example, why they seem to be more susceptible to certain forms of psychological distress, including depression. That said, “What we don’t have yet is research connecting these two areas of research” — that is, research making a stronger, more rigorous connection between generational differences in social-media use and rates of psychological distress.

What all this suggests is that there’s very little to be gained from lumping together all millennials in one group. Again, to a certain extent you can say this about any generation, but some genuinely unique and unusual stuff helped create the current divide. While the Old and Young Millennial categories aren’t carved in stone, and there is certainly some overlap (especially for those who were influenced by older siblings), it doesn’t benefit anyone to act like a 33-year-old and a 23-year-old came up in the same general climate, or with access to the same types of world-altering technology. No: These are profound differences. For the good of both us Old Millennials and our Young Millennial siblings and friends, let’s stop acting like we’re all in the same boat.


The adventures of that perennial herbalist, Mr Nicholas Culpeper

If you didn’t know I love herbs, you do now!



P-L-U-M Spells Relief!

Usually when I read, I try not to do two thematically similar books in a row unless they’re a series–and sometimes, not even then. This time, though, even putting a different book in between wasn’t enough.

Since Alison Weir’s A King’s Obsession came out a month or so ago, I thought it would be a good idea to reread The True Queen first. So I read both, and got through Three Dark Crowns before starting into Pour the Dark Wine. I feel like I didn’t even read 3DC, though, so my head is all flubbed up with stuff from the Tudor era.

Definitely time for Takedown Twenty! I thought. (If you’ve never read Janet Evanovich’s books, they’re light and funny–great even if you don’t like mysteries!)

First page, a giraffe goes galloping down the street.

First chapter, someone is mysteriously murdered with Stephanie and Lula nearby.

Oh Goddess, is it good to be back in the Weird World of Stephanie Plum!


That’s Kat Heigl from the 2012 movie


Small Squee

Ahh! Look at Auntie Jane! (First picture.) That cover looks great! Only…*counts on fingers*…ten months to go! (May 2018)

I guess we’re not going to see Road to Somerset before then. Not only is it not yet listed on Amazon, but Janet Wertman hasn’t had much to say except “working on it!”


Small Rant

Now that I’ve read nineteen of the books, I’d have to say that Kat Heigl is not my first choice for Stephanie. The character is definitely more Sara Rue’s cup of tea.

(Speaking of Sara, look at that picture! Dayum!)

I don’t mean to typecast her, but I can just imagine her kicking ass, taking names…and messing up several times per book. (*gigglefit*)

But if you’re going to have Sara, you should definitely bring back her Less than Perfect costar, Sherri Shepherd!

(Look at that costume from Beauty Shop! If you’ve read even one Stephanie book, you know that Lula would love that pink hair!)

I mean, I discovered how great Yvette Nicole Brown was in The Odd Couple, but–

Oh. Wait. Yvette is also 5’1″. Neeever mind!

Either way, I could totally imagine Sherri in the role.

Who would play Ranger? I don’t know, but I can confidently say, “Not Daniel Sunjata!”

I’m sure he’s a nice guy and everything, but I imagine Ranger to be a mix of Mark Consuelos, a tiny bit of Mario Lopez, and maybe some other Latin cuties that I’ve never seen before–and Daniel is not it!

 


A Grammatical Mess

Do you know what kind of writer I am? The kind that reads The Elements of Style and thinks, “Fuck that!” Fortunately, I’m not alone.

An article from The Chronicle of Higher Education. {Geoffrey K. Pullum is head of linguistics and English language at the University of Edinburgh and co-author (with Rodney Huddleston) of The Cambridge Grammar of the English Language (Cambridge University Press, 2002).}


50 Years of Stupid Grammar Advice
By Geoffrey K. Pullum APRIL 17, 2009

 

April 16 is the 50th anniversary of the publication of a little book that is loved and admired throughout American academe. Celebrations, readings, and toasts are being held, and a commemorative edition has been released.

I won't be celebrating.

The Elements of Style does not deserve the enormous esteem in which it is held by American college graduates. Its advice ranges from limp platitudes to inconsistent nonsense. Its enormous influence has not improved American students' grasp of English grammar; it has significantly degraded it.

The authors won't be hurt by these critical remarks. They are long dead. William Strunk was a professor of English at Cornell about a hundred years ago, and E.B. White, later the much-admired author of Charlotte's Web, took English with him in 1919, purchasing as a required text the first edition, which Strunk had published privately. After Strunk's death, White published a New Yorker article reminiscing about him and was asked by Macmillan to revise and expand Elements for commercial publication. It took off like a rocket (in 1959) and has sold millions.

This was most unfortunate for the field of English grammar, because both authors were grammatical incompetents. Strunk had very little analytical understanding of syntax, White even less. Certainly White was a fine writer, but he was not qualified as a grammarian. Despite the post-1957 explosion of theoretical linguistics, Elements settled in as the primary vehicle through which grammar was taught to college students and presented to the general public, and the subject was stuck in the doldrums for the rest of the 20th century.

Notice what I am objecting to is not the style advice in Elements, which might best be described the way The Hitchhiker's Guide to the Galaxy describes Earth: mostly harmless. Some of the recommendations are vapid, like "Be clear" (how could one disagree?). Some are tautologous, like "Do not explain too much." (Explaining too much means explaining more than you should, so of course you shouldn't.) Many are useless, like "Omit needless words." (The students who know which words are needless don't need the instruction.) Even so, it doesn't hurt to lay such well-meant maxims before novice writers.

Even the truly silly advice, like "Do not inject opinion," doesn't really do harm. (No force on earth can prevent undergraduates from injecting opinion. And anyway, sometimes that is just what we want from them.) But despite the "Style" in the title, much in the book relates to grammar, and the advice on that topic does real damage. It is atrocious. Since today it provides just about all of the grammar instruction most Americans ever get, that is something of a tragedy. Following the platitudinous style recommendations of Elements would make your writing better if you knew how to follow them, but that is not true of the grammar stipulations.

"Use the active voice" is a typical section head. And the section in question opens with an attempt to discredit passive clauses that is either grammatically misguided or disingenuous.

We are told that the active clause "I will always remember my first trip to Boston" sounds much better than the corresponding passive "My first visit to Boston will always be remembered by me." It sure does. But that's because a passive is always a stylistic train wreck when the subject refers to something newer and less established in the discourse than the agent (the noun phrase that follows "by").

For me to report that I paid my bill by saying "The bill was paid by me," with no stress on "me," would sound inane. (I'm the utterer, and the utterer always counts as familiar and well established in the discourse.) But that is no argument against passives generally. "The bill was paid by an anonymous benefactor" sounds perfectly natural. Strunk and White are denigrating the passive by presenting an invented example of it deliberately designed to sound inept.

After this unpromising start, there is some fairly sensible style advice: The authors explicitly say they do not mean "that the writer should entirely discard the passive voice," which is "frequently convenient and sometimes necessary." They give good examples to show that the choice between active and passive may depend on the topic under discussion.

Sadly, writing tutors tend to ignore this moderation, and simply red-circle everything that looks like a passive, just as Microsoft Word's grammar checker underlines every passive in wavy green to signal that you should try to get rid of it. That overinterpretation is part of the damage that Strunk and White have unintentionally done. But it is not what I am most concerned about here.

What concerns me is that the bias against the passive is being retailed by a pair of authors so grammatically clueless that they don't know what is a passive construction and what isn't. Of the four pairs of examples offered to show readers what to avoid and how to correct it, a staggering three out of the four are mistaken diagnoses. "At dawn the crowing of a rooster could be heard" is correctly identified as a passive clause, but the other three are all errors:

"There were a great number of dead leaves lying on the ground" has no sign of the passive in it anywhere.

"It was not long before she was very sorry that she had said what she had" also contains nothing that is even reminiscent of the passive construction.

"The reason that he left college was that his health became impaired" is presumably fingered as passive because of "impaired," but that's a mistake. It's an adjective here. "Become" doesn't allow a following passive clause. (Notice, for example, that "A new edition became issued by the publishers" is not grammatical.)

These examples can be found all over the Web in study guides for freshman composition classes. (Try a Google search on "great number of dead leaves lying.") I have been told several times, by both students and linguistics-faculty members, about writing instructors who think every occurrence of "be" is to be condemned for being "passive." No wonder, if Elements is their grammar bible. It is typical for college graduates today to be unable to distinguish active from passive clauses. They often equate the grammatical notion of being passive with the semantic one of not specifying the agent of an action. (They think "a bus exploded" is passive because it doesn't say whether terrorists did it.)

The treatment of the passive is not an isolated slip. It is typical of Elements. The book's toxic mix of purism, atavism, and personal eccentricity is not underpinned by a proper grounding in English grammar. It is often so misguided that the authors appear not to notice their own egregious flouting of its own rules. They can't help it, because they don't know how to identify what they condemn.

"Put statements in positive form," they stipulate, in a section that seeks to prevent "not" from being used as "a means of evasion."

"Write with nouns and verbs, not with adjectives and adverbs," they insist. (The motivation of this mysterious decree remains unclear to me.)

And then, in the very next sentence, comes a negative passive clause containing three adjectives: "The adjective hasn't been built that can pull a weak or inaccurate noun out of a tight place."

That's actually not just three strikes, it's four, because in addition to contravening "positive form" and "active voice" and "nouns and verbs," it has a relative clause ("that can pull") removed from what it belongs with (the adjective), which violates another edict: "Keep related words together."

"Keep related words together" is further explained in these terms: "The subject of a sentence and the principal verb should not, as a rule, be separated by a phrase or clause that can be transferred to the beginning." That is a negative passive, containing an adjective, with the subject separated from the principal verb by a phrase ("as a rule") that could easily have been transferred to the beginning. Another quadruple violation.

The book's contempt for its own grammatical dictates seems almost willful, as if the authors were flaunting the fact that the rules don't apply to them. But I don't think they are. Given the evidence that they can't even tell actives from passives, my guess would be that it is sheer ignorance. They know a few terms, like "subject" and "verb" and "phrase," but they do not control them well enough to monitor and analyze the structure of what they write.

There is of course nothing wrong with writing passives and negatives and adjectives and adverbs. I'm not nitpicking the authors' writing style. White, in particular, often wrote beautifully, and his old professor would have been proud of him. What's wrong is that the grammatical advice proffered in Elements is so misplaced and inaccurate that counterexamples often show up in the authors' own prose on the very same page.

Some of the claims about syntax are plainly false despite being respected by the authors. For example, Chapter IV, in an unnecessary piece of bossiness, says that the split infinitive "should be avoided unless the writer wishes to place unusual stress on the adverb." The bossiness is unnecessary because the split infinitive has always been grammatical and does not need to be avoided. (The authors actually knew that. Strunk's original version never even mentioned split infinitives. White added both the above remark and the further reference, in Chapter V, admitting that "some infinitives seem to improve on being split.") But what interests me here is the descriptive claim about stress on the adverb. It is completely wrong.

Tucking the adverb in before the verb actually de-emphasizes the adverb, so a sentence like "The dean's statements tend to completely polarize the faculty" places the stress on polarizing the faculty. The way to stress the completeness of the polarization would be to write, "The dean's statements tend to polarize the faculty completely."

This is actually implied by an earlier section of the book headed "Place the emphatic words of a sentence at the end," yet White still gets it wrong. He feels there are circumstances where the split infinitive is not quite right, but he is simply not competent to spell out his intuition correctly in grammatical terms.

An entirely separate kind of grammatical inaccuracy in Elements is the mismatch with readily available evidence. Simple experiments (which students could perform for themselves using downloaded classic texts from sources like http://gutenberg.org) show that Strunk and White preferred to base their grammar claims on intuition and prejudice rather than established literary usage.

Consider the explicit instruction: "With none, use the singular verb when the word means 'no one' or 'not one.'" Is this a rule to be trusted? Let's investigate.

Try searching the script of Oscar Wilde's The Importance of Being Earnest (1895) for "none of us." There is one example of it as a subject: "None of us are perfect" (spoken by the learned Dr. Chasuble). It has plural agreement.

Download and search Bram Stoker's Dracula (1897). It contains no cases of "none of us" with singular-inflected verbs, but one that takes the plural ("I think that none of us were surprised when we were asked to see Mrs. Harker a little before the time of sunset").

Examine the text of Lucy Maud Montgomery's popular novel Anne of Avonlea (1909). There are no singular examples, but one with the plural ("None of us ever do").

It seems to me that the stipulation in Elements is totally at variance not just with modern conversational English but also with literary usage back when Strunk was teaching and White was a boy.

Is the intelligent student supposed to believe that Stoker, Wilde, and Montgomery didn't know how to write? Did Strunk or White check even a single book to see what the evidence suggested? Did they have any evidence at all for the claim that the cases with plural agreement are errors? I don't think so.
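(The kind of check Pullum describes here is easy to automate. The few lines of Python below are my own illustrative sketch, not part of his article: they assume you have saved plain-text copies of the novels from gutenberg.org under the placeholder filenames shown, and they only look at the verb immediately after "none of us," so a sentence like "None of us ever do" would need a looser pattern.)

import re

def none_of_us_agreement(path):
    """Tally 'none of us' followed by a singular vs. a plural verb form."""
    text = open(path, encoding="utf-8").read()
    singular = re.findall(r"\bnone of us (?:is|was|has|does)\b", text, re.IGNORECASE)
    plural = re.findall(r"\bnone of us (?:are|were|have|do)\b", text, re.IGNORECASE)
    return len(singular), len(plural)

# Placeholder filenames for plain-text copies downloaded from gutenberg.org:
for title in ["earnest.txt", "dracula.txt", "anne_of_avonlea.txt"]:
    s, p = none_of_us_agreement(title)
    print(f"{title}: singular={s}, plural={p}")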

There are many other cases of Strunk and White's being in conflict with readily verifiable facts about English. Consider the claim that a sentence should not begin with "however" in its connective adverb sense ("when the meaning is 'nevertheless'").

Searching for "however" at the beginnings of sentences and "however" elsewhere reveals that good authors alternate between placing the adverb first and placing it after the subject. The ratios vary. Mark Liberman, of the University of Pennsylvania, checked half a dozen of Mark Twain's books and found roughly seven instances of "however" at the beginning of a sentence for each three placed after the subject, whereas in five selected books by Henry James, the ratio was one to 15. In Dracula I found a ratio of about one to five. The evidence cannot possibly support a claim that "however" at the beginning of a sentence should be eschewed. Strunk and White are just wrong about the facts of English syntax.
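(This count, too, is simple to reproduce. The sketch below is mine, not Pullum's or Liberman's: it approximates "at the beginning of a sentence" as "However" following sentence-ending punctuation or a paragraph break, lumps every other occurrence together rather than checking whether it follows the subject, and uses a placeholder filename for a plain-text copy downloaded from gutenberg.org.)

import re

def however_placement(path):
    text = open(path, encoding="utf-8").read()
    # Sentence-initial: "However" at the start of the text, after . ! ?, or after a blank line.
    initial = len(re.findall(r"(?:^|[.!?]\s+|\n\s*\n\s*)However\b", text))
    total = len(re.findall(r"\bhowever\b", text, re.IGNORECASE))
    return initial, total - initial

first, later = however_placement("dracula.txt")  # placeholder filename
print(f"sentence-initial: {first}, elsewhere: {later}")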

The copy editor's old bugaboo about not using "which" to introduce a restrictive relative clause is also an instance of failure to look at the evidence. Elements as revised by White endorses that rule. But 19th-century authors whose prose was never forced through a 20th-century prescriptive copy-editing mill generally alternated between "which" and "that." (There seems to be a subtle distinction in meaning related to whether new information is being introduced.) There was never a period in the history of English when "which" at the beginning of a restrictive relative clause was an error.

In fact, as Jan Freeman, of The Boston Globe, noted (in her blog, The Word), Strunk himself used "which" in restrictive relative clauses. White not only added the anti-"which" rule to the book but also revised away the counterexamples that were present in his old professor's original text!

It's sad. Several generations of college students learned their grammar from the uninformed bossiness of Strunk and White, and the result is a nation of educated people who know they feel vaguely anxious and insecure whenever they write "however" or "than me" or "was" or "which," but can't tell you why. The land of the free in the grip of The Elements of Style.

So I won't be spending the month of April toasting 50 years of the overopinionated and underinformed little book that put so many people in this unhappy state of grammatical angst. I've spent too much of my scholarly life studying English grammar in a serious way. English syntax is a deep and interesting subject. It is much too important to be reduced to a bunch of trivial don't-do-this prescriptions by a pair of idiosyncratic bumblers who can't even tell when they've broken their own misbegotten rules.


The Goddess of Cake

by Allie, the gal who created the Alot Monster

 

My mom baked the most fantastic cake for my grandfather's 73rd birthday party. The cake was slathered in impossibly thick frosting and topped with an assortment of delightful creatures which my mom crafted out of mini-marshmallows and toothpicks.  To a four-year-old child, it was a thing of wonder – half toy, half cake and all glorious possibility.

 
 
But my mom knew that it was extremely important to keep the cake away from me because she knew that if I was allowed even a tiny amount of sugar, not only would I become intensely hyperactive, but the entire scope of my existence would funnel down to the singular goal of obtaining and ingesting more sugar.  My need for sugar would become so massive, that it would collapse in upon itself and create a vacuum into which even more sugar would be drawn until all the world had been stripped of sweetness.
 
 
So when I managed to climb onto the counter and grab a handful of cake while my mom's back was turned, an irreversible chain reaction was set into motion.
 
I had tasted cake and there was no going back.  My tiny body had morphed into a writhing mass of pure tenacity encased in a layer of desperation.  I would eat all of the cake or I would evaporate from the sheer power of my desire to eat it.
 
My mom had prepared the cake early in the day to get the task out of the way.  She thought she was being efficient, but really she had only ensured that she would be forced to spend the whole day protecting the cake from my all-encompassing need to eat it.  I followed her around doggedly, hoping that she would set the cake down – just for a moment.
 
 
My mom quickly tired of having to hold the cake out of my reach. She tried to hide the cake, but I found it almost immediately. She tried putting the cake on top of the refrigerator, but my freakish climbing abilities soon proved it to be an unsatisfactory solution.
 
 
Her next attempt at cake security involved putting the cake in the refrigerator and then placing a very heavy box in front of the refrigerator's door.
 
 
The box was far too heavy for me to move.  When I discovered that I couldn't move the box, I decided that the next best strategy would be to dramatically throw my body against it until my mom was forced to move it or allow me to destroy myself.
 
Surprisingly, this tactic did not garner much sympathy.
 
 
I went and played with my toys, but I did not enjoy it.
 
 
I had to stay focused.
 
I played vengefully for the rest of the afternoon. All of my toys died horrible deaths at least once. But I never lost sight of my goal.
 
My mom finally came to get me. She handed me a dress and told me to put it on because we were leaving for the party soon. I put the dress on backwards just to make her life slightly more difficult.
 
I was herded into the car and strapped securely into my car seat.  As if to taunt me, my mom placed the cake in the passenger seat, just out of my reach.
 
 
We arrived at my grandparents' house and I was immediately accosted by my doting grandmother while my mom walked away holding the cake.
 
 
I could see my mom and the cake disappearing into the hallway as I watched helplessly.  I struggled against my grandmother's loving embrace, but my efforts were futile.  I heard the sound of a door shutting and then a lock sliding into place.  My mom had locked the cake in the back bedroom.  How was I going to get to it now?  I hadn't yet learned the art of lock-picking and I wasn't nearly strong enough to kick the door in.  It felt as though all my life's aspirations were slipping away from me in a landslide of tragedy.  How could they do this to me?  How could they just sit there placidly as my reason for living slowly faded from my grasp?  I couldn't take it.  My little mind began to crumble.
 
And then, right there in my grandmother's arms, I lapsed into a full-scale psychological meltdown. My collective frustrations burst forth from my tiny body like bees from a nest that had just been pelted with a rock.
 
 
It was unanimously decided that I would need to go play outside until I was able to regain my composure and stop yelling and punching.  I was banished to the patio where I stood peering dolefully through the sliding glass door, trying to look as pitiful as possible.
 
 
I knew the cake was locked securely in the bedroom, but if I could just get them to let me inside… maybe.  Maybe I could find a way to get to it.  After all, desperation breeds ingenuity.  I could possibly build an explosive device or some sort of pulley system.  I had to try.  But at that point, my only real option was to manipulate their emotions so they'd pity me and willfully allow me to get closer to the cake.
 
When my theatrics failed to produce the desired results, I resorted to crying very loudly, right up against the glass.
 
 
I carried on in that fashion until my mom poked her head outside and, instead of taking pity on me and warmly inviting me back inside as I had hoped, told me to go play in the side yard because I was fogging up the glass and my inconsolable sobbing was upsetting my grandmother.
 
I trudged around to the side of the house, glaring reproachfully over my shoulder and thinking about how sorry my mom would be if I were to die out there.  She'd wish she would have listened. She'd wish she had given me a piece of cake.  But it would be too late.
 
 
 
But as I rounded the corner, the personal tragedy I was constructing in my imagination was interrupted by a sliver of hope.
 
 
Just above my head, there was a window.  On the other side of that particular window was the room in which my mom  had locked the cake.  The window was open.
 
 
 
The window was covered by a screen, but my dad had shown me how to remove a screen as a preemptive safety measure in case I was  trapped in a fire and he couldn't get to me and I turned out to be too stupid to figure out how to kick in a screen to escape death by burning.
 
I clambered up the side of the house and pushed the screen with all my strength.
 
 
It gave way, and suddenly there I was – mere feet from the cake, unimpeded by even a single obstacle.
 
 
I couldn't fully believe what had just occurred.  I crept slowly – reverently – toward the cake, my body quivering with anticipation.  It was mine.  All mine.
 
I ate the entire cake.  At one point, I remember becoming aware of the oppressive fullness building inside of me, but I kept eating out of a combination of spite and stubbornness.  No one could tell me not to eat an entire cake – not my mom, not Santa, not God – no one.  I would eat cake whenever I damn well pleased.  It was my cake and everyone else could go fuck themselves.
 
..
 
Meanwhile, in the kitchen, my mother suddenly noticed that she hadn't heard my tortured sobbing in a while.
 
 
She became concerned because it was unusual for my tantrums to stop on their own like that, so she went looking for me.
 
When she couldn't find me anywhere, she finally thought to unlock the bedroom door and peek inside.
 
 
And there I was.
 
 
I spent the rest of the evening in a hyperglycemic fit, alternately running around like a maniac and regurgitating the multi-colored remains of my conquest all over my grandparents' carpet.  I was so miserable, but my suffering was small compared to the satisfaction I felt every time my horrible, conniving mother had to watch me retch up another rainbow of sweet, semi-digested success: this is for you, mom.  This is what happens when you try to get between me and cake – I silently challenged her to try again to prevent me from obtaining something I wanted.  Just once.  Just to see what would happen.  It didn't matter how violently ill I felt, in that moment, I was a god – the god of cake – and I was unstoppable.