I.

For my sixth birthday, my family bought a house. It was 1990, and we Campbells already had the first-generation minivan, the gas grill, and the first kid; all we needed now was a piece of property to call our own. And our custom-built, 2,500-square-foot monstrosity, complete with Asian pear trees in the front and a gigantic deck in the back, officially marked our entry into the lower-upper-middle class. Maybe we weren’t partaking in the “irrational exuberance” of the Wall Street crowd, but to a family that just three years earlier had to forgo Happy Meals on our European vacation in order to save money, it seemed like we had reached the suburban Promised Land all the same.

At night, the television confirmed that we were, economically speaking, in the right place. On weeknights, my parents and I would spend an hour watching the news, where segments like “The Fleecing of America” warned us about the dangers of a government with too much money to spend. After that, we’d watch sitcom families like the Tanners, Huxtables, and Taylors face every sort of problem imaginable, except those having to do with real financial insecurity or a lack of living space. On Sunday afternoons, I did my homework while listening to preachers of the prosperity gospel such as Joel Osteen and Joyce Meyer, who told me that God had a lot to offer me instead of the other way around.

But I received the strongest doses of fantasy in school, where I was subjected to the much-maligned “culture of praise.” It must have seemed like a good idea at the time; why not bring into childhood the same sense of well-being that the adult world was enjoying? As far as official recognition went, doing at all seemed to matter at least as much as doing well: I built an impressive collection of trophies, certificates, and glowing report cards, many of which reminded me that “Everyone’s a Winner.”

All this was supposed to boost my adolescent ego, but it only ended up confusing me. Humans are creatures of comparison, after all, which meant that in spite of my awards, I always remembered my actual standing in each event, whether I’d come in fourth place overall in the state geography bee or fifteenth out of twenty at a local chess tournament. I couldn’t really believe that everyone deserved a reward just for existing; instead, I assumed that such a system judged everyone’s personality and worth as a human being according to rules that were inscrutable, if not completely arbitrary.

As a result, I lived in complete fear of judgment from teachers, coaches, and even Mom and Dad, who, I’m sure, must have enjoyed my complete obedience. But I was quickly learning how to be paranoid. Although I didn’t understand how I could fail, I did understand that it would be social death to do so. And so I did everything I could to avoid the shame of failure. I never asked questions in class to avoid the appearance of challenging authority. I did not volunteer for show-and-tell, because I didn’t want to have my objects compared to everyone else’s. I tattled on anyone who had broken the rules even a little bit, and shunned those who had gotten into trouble even once. Like many of my peers, I understood failure not as a necessary and inevitable part of life, but as a cancer of the personality, like ugliness, laziness, or basic stupidity: it struck without warning, took an expert to diagnose, and was almost always a permanent condition. And all this from a system designed to increase our self-esteem!

II.

Fast forward a decade or so, and most of us kids have learned that the culture of success, which saddled Millennials with plenty of psychological baggage, has made us its victims in much more concrete ways.

First of all, our spending habits have not really evolved from the excess of the late nineties. In March 2008, TIME magazine declared that Millennials were poised to be “the next great luxury consumers”: nine out of ten Millennials surveyed agreed with statements like “I love wearing designer clothing, jewelry and watches,” or “I work hard, so I reward myself by splurging.” Apparently, we are also twice as likely as our parents’ generation to want to own a yacht, a private airplane, or luxury sports equipment—and we spend more than twice as much as our parents on underwear, hair and skin care, and fragrances. And to pay for those luxuries, the average Millennial accrues $2,300 in credit card debt by age 18 and $4,400 by age 35. Clearly, those of us who have gone into debt to buy the latest iPhones need to re-learn a thing or two about fiscal responsibility.

Those suburban single-family dwellings in which many of us grew up have also caused a problem or two. By late 2007, the heady cocktail of financial innovations, risky investments, and people spending beyond their means, which had fueled the boom of the ’90s and 2000s, left the United States with a collective hangover in the form of the worst financial disruption since the Great Depression. Another ghost from our past, come back to haunt us.

The fallout has been brutal, and it has happened just when the eldest of the Millennials, myself included, were starting their professional careers. As early as August of 2008, eleven months after the first major bank collapse of the recession (and, in fact, even before most people were calling the economic downturn a recession), business writers warned that the job market made it a “bad time to be young.” A year later, the picture was even worse, and you didn’t have to be an analyst to realize it. Younger Millennials who sought short-term employment for the first time in the summer of ’09 found that both seasonal work and minimum-wage retail or service jobs had all but dried up. Only one-fifth of the class of 2009 had found a job by the time they graduated. And friends of mine who had graduated that year from top-tier law schools like Harvard, Yale, and NYU were having trouble finding and keeping associate positions. By September of 2009, the share of 16-to-24-year-olds without jobs had climbed to 53.4 percent, more than double what it was just three years earlier, and the highest level since the Labor Department started tracking this particular statistic in 1948.

Economists, psychologists, and sociologists alike have started to worry much less about our consumption habits and much more about whether or not Millennials will become a latter-day “Lost Generation.” Historically, people who are lucky enough to begin careers during recessions still tend to earn less over their professional lives than their counterparts who start in more fortunate times—sometimes as much as 10 percent less. Those who instead remain jobless for long periods in their teens and early 20s tend to drink more heavily, become depressed more frequently, and die much sooner than those who start their working life in stabler times. If this trend of prolonged unemployment and career false-starts continues, we will be the first postwar generation to have a standard of living that’s lower than that of our parents.

So the expectation of success with which we grew up, and which gave us all a reputation as self-deluded spoiled brats, has all but disappeared. In the space of two years, we’ve watched the economic security promised us by our parents’ society crumble beneath the weight of mortgage-backed securities and credit-default swaps, and we’ve learned that no amount of self-confidence can insulate us from the brutal realities of the job market. We have, for the first time, had to take seriously the prospect of failure, and we are not taking it well, by any measure.

In fact, our acculturation to the outside world has turned into something of a public spectacle. We’ve been examined and re-examined hundreds of different ways in every media type imaginable, from books to blogs, TV shows to newspaper columns, and always with the puzzled disappointment of frustrated parents. And even though our critics have yet to agree on our sobriquet (to some, we’re the Millennials; to others, Generation Y; to still others, the Me Generation, the Latchkey Generation, the Trophy Kids…the list goes on), they have drawn up a rap sheet that they all seem to agree on.

The charges? We have outsourced our social lives from the real world to the virtual one thanks to Facebook, Twitter, and the myriad other platforms with which we can do online what people used to do in person (chatting, gaming, dating). We demand constant praise from our superiors in the workplace, just as we demanded automatic A’s for effort in the classroom. Many of us still live with our parents – and those who do are called a variety of flattering names that include “kidults,” “boomerang kids,” KIPPERS (Kids In Parents’ Pockets Eroding Retirement Savings), and YUCKIES (Young Unwitting Costly Kids).

Worse still, so the commenters say, is that we Millennials suffer from generational malaise. We have no real desire to shake up the political status quo and agitate for meaningful social change, except as the political establishment demands. According to New York Times columnist Thomas Friedman, Millennials are “not paying attention” and “so much less radical and politically engaged than they need to be.” New York criminal defense attorney Stephen Greenfield is much more pessimistic (and much pithier); at a Chicago law conference, he said that “Generation Y is entitled, lazy, selfish, tech savvy, and incompetent.” Mark Bauerlein, a professor of English at Emory University, has re-dubbed Millennials “the Dumbest Generation.”

It’s hard to find fault with this evidence: we do seem to be a generation that is self-absorbed and socially and politically maladjusted in unique ways, and we have grown so accustomed to the culture of consumption and success that we cannot realistically think our way out of it. Clearly, we have a complicated relationship with authority, recognition, and praise; we learned to expect constant approval and success, but not to understand why we should deserve it. And now that our world has been turned upside-down, we seem to be striving to piece together some semblance of the emotional bubble in which we were raised.

III.

But is this worthy of all the public hand-wringing? For one thing, the overpowering generalizations about Millennials are a bit unfair. One could just as easily say that the Boomers deserve a generational censure even more. After all, Boomers bought houses they couldn’t afford and ignored the long-term risks of the financial instruments that they sold; Boomer politicians turned a blind eye to the worst corporate excesses; and two Boomer presidents – Bush and Clinton – promoted reckless federal spending, the dismantling of public assistance programs without paying heed to their social costs, and a dangerous scaling-back of government oversight that has led to economic and environmental disasters. Millennials’ failure to launch seems like a venial sin in comparison.

In any case, the narrative of Millennial malaise is not so tidy as it seems. When journalists or commentators talk about Millennials, they never discuss those people under 30 who work in low- or middle-skill jobs like manufacturing, construction, or retail. They are instead mostly interested in that small subset of young people – suburban, middle-class, college-educated, more likely to be white or Asian American than black or Hispanic, and bound for a managerial or professional career – who were supposed to be the next generation of world-beaters but who have amounted to exactly nothing. Thus the resentment (YUCKIES and KIPPERS) and the name-calling (“the Dumbest Generation”).

Even more galling must be the fact that those same Millennials who have ended up back on mom and dad’s couch are living reminders of the failure of the Baby Boomer lifestyle. Millennials were, remember, supposed to be made in the Boomers’ self-image: to put career ahead of all other considerations, to aspire to the suburban lifestyle, to value individualism and economic gain above all things. Instead, we have turned out exactly wrong. We crave praise like Gollum craves the One Ring; we have no work ethic; we only know how to spend, spend, spend.

As surveys from the Pew Research Center and the Center for American Progress show, the majority of Millennials also have an entirely different political outlook than their parents do. They are collectivists rather than contrarians; they prefer social and economic justice to complete individual liberty; they trust authority and expertise; they see government as a potential good rather than as a useless parasite. When combined with all of the other seismic shifts that Boomers have had to face in the last twenty years, from demography to technology to fashion, it must seem like the world has turned itself upside-down. And leading the way have been the Millennials, who have resisted or rejected their parents’ goals, values, and very way of life (as well as no small amount of psychological manipulation) to become the anti-Boomers.

But none of this should be all that surprising. The years between adolescence and adulthood are called the “odyssey years” for a reason, and given the set of circumstances that Millennials are living through, no wonder many young Americans are rethinking their curriculum vitae in ways that their parents cannot immediately understand. After all, didn’t the Boomers themselves pull the cultural rug out from under their own parents, forty years ago?

In other words, we’re dealing with variations on a very old theme. In fact, it seems like the only thing different about Kids These Days is that, thanks to statisticians, sociologists, and market researchers, there is much more data to show what’s wrong with them, and why.

To my mind, this is cause for celebration: Millennials, we are not as screwed up as our elders want to believe (and want us to believe), and Boomers, your kids are doing just fine, even if they might not be following exactly in your footsteps. After all, if Millennials’ problems can be solved by public nagging in newspapers and magazines, then they must not be that bad in the first place; if not, then no amount of cynical commentary is going to help, and we need not worry about the inter-generational censures.

When I began my junior year of college, I believed that I had mastered the art of the essay. By that point I had taken three semester-long courses on academic writing, and knew the necessary components of a good paper by heart. For every essay assignment, I spent hours, if not days, crafting the right thesis statement, making sure each of my paragraphs had topic sentences, balancing the ratio of analysis to evidence, and so on. I was convinced that I was doing all the right things.

I’ve since learned that I was wrong, however. It’s embarrassing to go back and read my old papers, not because they are incoherent, but because they are inelegant. They contain pretentious language, paragraphs that are slightly out of place, sentences with as many as twelve clauses, and general bloat. For example, an essay that I wrote about The Communist Manifesto began with the sentence, “A salient feature of Marx’s vision of history is its deterministic view; that is, history is a pattern of class struggles as the expanding means of production begin to be inhibited by societal structures which are then broken down and replaced by new ones.” The paper goes downhill from there.

I wrote this way because I did not understand that writing true sentences was just the first step toward writing comprehensible ones. In other words, I did not appreciate the power of revision.

Up until that point, no one had taught me how to revise. My freshman writing course dedicated exactly one class period to a discussion of editing and revision, and my other teachers insisted that revising was a process that I needed to do, and not much else. Style guides were no help, either. Strunk and White devote exactly one unhelpful paragraph to revising, much of which is about when “scissors should be brought into play.” Although the introduction to William Zinsser’s On Writing Well asserts that “rewriting is the essence of writing,” there is less than a page on revision and editing in the body of the text. Even William Kelleher Storey’s Writing History: A Guide for Students simply encouraged me to ask myself questions like “Will readers find this to be an interesting and significant argument?” before moving on to comma splices.

This breezy attitude toward revision led me to believe that it was an intuitive process that demanded far less effort and direction than writing a rough draft. So I was not worried by the fact that I could only intuit about twenty or thirty minutes’ worth of corrections—moving a paragraph here, inverting a pair of clauses there, and finding the occasional place where I could throw in a bit of alliteration. Such efforts were not enough to salvage my Communist Manifesto paper, among many others, from stylistic barbarism.

Where instruction and instinct failed, however, poetry succeeded. In the fall of my junior year, I had lunch with a friend of mine who was taking a class on modernist literature. She came straight from class, and as she sat down she produced, with a flourish, her reading assignment: a working draft of T.S. Eliot’s The Waste Land, complete with Ezra Pound’s annotations. This was the first time that I’d ever been privy to someone else’s writing process, and I was amazed to see this intermediate step between the poem’s embryonic stages and its published form, which I vaguely remembered from high school. More importantly, I realized that Pound and Eliot were not making one or two cursory cut-and-pastes or substituting a few words here and there, but thoroughly re-evaluating almost everything about the poem. Absurd as it seems in retrospect, only then did I realize how much revision can shape a piece of writing.

My next mentors were novelists, who taught me that my relationship with writing was exactly wrong. First, they emphasized that writing well is an unpleasant business. Joyce Carol Oates, for example, described a typical day of writing as “long, snarled, frustrating and sometimes despairing,” while George Orwell likened writing to “a long bout of some painful illness.” And second, they insisted that the key to writing well was ruthless revision. As Ernest Hemingway put it, “I write one page of masterpiece to ninety pages of shit. I try to put the shit in the wastebasket.” Ninety to one? Wastebaskets? For someone whose editing process usually involved the addition of fluff in order to hit page limits, the suggestion that I ought to throw out more than I keep was absolutely terrifying.

Nevertheless, I resolved that, for my next writing assignment, I would make the editing process as long and unpleasant as possible. So I forced myself to revise for half a day instead of half an hour. And, miraculously, the more I stared at my rough draft, the angrier and more ruthless I became. I excised sentences, I deleted whole paragraphs, I re-phrased and re-wrote and ended up with something that did not look very much like what I had originally put on the page. Then I produced a third draft, and a fourth, until finally I was satisfied with draft number five. For the first time, I felt like I had written a paper that was actually good, instead of merely competent.

It was only this foray into the world of literary art that taught me what revision really involves, and I realized that the well-meaning advice of teachers and style guides—which usually boiled the process down to unhelpful metaphors (“it is just a matter of making repairs,” wrote Zinsser) or lists of questions, tips, and tricks—had actually led me astray. Editing has little to do with developing an instinct or following a checklist; it has everything to do with time, effort, and the willingness to cut writing down to its bare bones—even if it means, to paraphrase Sir Arthur Quiller-Couch, murdering your darlings.

All of this hand-wringing isn’t mere melodrama, or a kind of histrionic half-apology from those writers who are a bit embarrassed about their thoroughly bourgeois occupation. Rather, as Mark Twain said, “the difference between the right word and the almost right word is the difference between lightning and the lightning bug.” Editing is to the writer what practice is to the musician, what exercise is to the athlete. It is the only thing that can elevate “correct” prose into readable prose, weed out triteness and laziness and showboating, and quiet—however temporarily—the self-doubts that plague every writer from the moment he or she sets pen to paper, or fingertip to keyboard.

When I was little, my parents and I would watch television together. Usually, our TV diet consisted of the evening news, Jeopardy! at 7:30, and – if I was lucky – an 8 p.m. show like MacGyver or Full House. But for a few nights in September of 1990, my parents let me stay up a little bit later than usual to watch something entirely different: a PBS documentary about the Civil War.