Thursday, July 29, 2010

Becoming Human

Regular readers of this blog may notice my tendency to write about topics at least a month after I’ve encountered them. Oh well, here I am again.

Close to two months ago I read a column by David Brooks called “History For Dollars.” In it Brooks makes a perfectly logical and necessary argument for pursuing a liberal arts education in the humanities. He begins by pointing out how struggling labor markets and unwelcome economic forces have caused a nearly 50 percent drop in liberal arts majors over the past generation. Instead of exploring areas like English and history, students today choose more vocational paths like accounting, information technology, and even journalism in the hope that they will lead to a job.

Yet by doing so, Brooks believes, we run the danger of depriving ourselves of an essential part of human existence, a friend he calls “the Big Shaggy.” He says, “Over the past century or so, people have built various systems to help them understand human behavior: economics, political science, game theory and evolutionary psychology. These systems are useful in many circumstances. But none completely explain behavior because deep down people have passions and drives that don’t lend themselves to systemic modeling. They have yearnings and fears that reside in an inner beast you could call The Big Shaggy.”

Let me start off by saying that for the most part, I agree with Brooks’ argument. Though I majored in journalism, I did lay claim to the rare and likely pointless title of “dual-minor” in English and History. While these classes might have started out as subjects I simply excelled in, they turned into a whole lot more. The classes I took, the books I read, the people I learned about, and the teachers I studied under all cultivated a worldview I now find indispensable. And as trite as it sounds, I wouldn’t trade it for anything.

Much like my intellectual superior David Brooks, I do believe these lessons played a vital role in my understanding of the feelings, desires, and actions of those around me. I can recall vivid moments comparing my friends with characters from stories I read, or comparing today’s public figures with historic ones, and viewing it all as part of human nature.

Yet unlike Brooks, I also believe there’s another component to this “Big Shaggy” character worth mulling over. What I’m talking about isn’t anything you can read or learn about, but interactions that are tangible, live and well…human. He seems to touch upon it when he says, “It’s probably dangerous to enter exclusively into this realm and risk being caught in a cloister, removed from the market and its accountability.” Yet the topical examples he gives prove that he’s missing this part of the picture.

He says, “this tender beast [the Big Shaggy] is also responsible for the mysterious but fierce determination that drives Kobe Bryant, the graceful bemusement the Detroit Tigers pitcher Armando Galarraga showed when his perfect game slipped away, the selfless courage soldiers in Afghanistan show when they risk death for buddies or a family they may never see again.”

Maybe it’s just me, but it sounds somewhat presumptuous to conclude that the only way to understand the meaning behind such actions is to immerse ourselves in the humanities. While doing so may help us to draw comparisons and create analogies, in the end there is nothing that compares to firsthand experience. The reason we seek so desperately to understand why people act a certain way is so that we can try to harness their abilities in our own lives. This is a large part of the reason why many of us feel that public figures should act as role models. But it also may be shortsighted to think this is the only way to learn.

Most of us will agree that, as talented and cultured as he is portrayed to be, Kobe Bryant probably did not gain his “mysterious but fierce determination” from reading Thucydides’ account of the Peloponnesian War. Maybe a better Kobe historian than I can say for sure, but his desire to win more likely stems from childhood playground battles, where losing ate him up inside more than anything else.

So if Kobe didn’t receive his distinctive attributes by reading books, why should it follow that this is how we should receive ours? It isn’t wrong to point out the many laudable qualities of the liberal arts, but it is wrong to say they are the only way to get in touch with the inner feelings of those around us.

Though I would never detract from the merits of liberal arts teaching, I will say that understanding behavior is about more than anything you can learn in class. Part of why I enjoyed journalism was because I had the chance to witness people acting determined or benevolent or courageous every day. It provided the sort of real-life insight into the human condition that the humanities could only supplement or fortify.

Ironically, all this talk about humanity and firsthand interaction brings me back to something I did glean from my liberal arts education. It makes me think, for a number of reasons, of Ralph Ellison’s “Invisible Man.” When we read the book in one of my English classes this past year, I remember the professor telling us that “humanity” was Ellison’s favorite word. In my brief investigation of this claim I stumbled upon a quote of his that I feel best sums up what I’m trying to say. He said, “The understanding of art depends finally upon one's willingness to extend one's humanity and one's knowledge of human life.”

I know it sounds circular, but I believe he is saying that we can only learn to appreciate the art behind books, movies, all-star athletes, and heroes once we possess that humanity ourselves. Indeed, in this and many other respects, this Ellison guy seems to have been onto something. You can never truly relate to anything unless you are relating it to experiences ingrained in your memory and your emotions. It’s chiefly out of those experiences that we gain the capacity to understand the beauty, the passion, and even the despair in others as well as we understand our own.

Tuesday, June 8, 2010

Planning the Future, One City at a Time

About a month ago I shared a turbulence-ridden plane ride with a consultant for an urban planning firm in Madison, WI. After talking briefly about my intemperate weekend there – evidenced by my visible hangover and anachronistic “Mifflin 2009” T-shirt – our conversation turned to her plans. She told me that after she landed in Chicago she was heading straight to Amsterdam (which would have been interesting enough for me) and from there flying directly to Abu Dhabi, in the United Arab Emirates, to conduct business.

Learning about urban planning in Abu Dhabi? If there was to be any redeeming aspect of my stay at UW, this was certain to be it.

Flooding her with an array of questions, I soon found out that she was part of a team studying the city in order to set up bike lanes that would make it safer and more convenient for residents. She told me that aside from being very densely populated, Abu Dhabi currently has crucial transportation issues to deal with, including road congestion, on-street car parking, and widespread commuting troubles. She and her team were traveling to the Middle Eastern city to redesign the roadway system to alleviate some of these problems as part of an overall urban revamping effort called Abu Dhabi 2030.1

After discussing both her excitement and apprehension about her inaugural visit to the Middle East, I decided to share some of my admittedly limited knowledge of her field. I told her that a few months earlier I had attended a lecture in Syracuse by sustainability expert Alex Steffen, whom I had the privilege of interviewing beforehand. Listening to Steffen, I was shocked to learn many things about our current environmental failures. Yet what surprised me most was his belief that people should actually move into cities to limit carbon emissions from cars and other forms of transportation. Urban planners like the one sitting next to me could then implement structures to prevent congestion and allow city-dwellers to get around with smaller carbon footprints, rather than driving vast distances from their suburban homes.

What urban plans such as these point to is the mindset we need to adopt if we want to start thinking about real environmental change in this country. As Mr. Steffen aptly put it during his lecture: it’s not about doing things differently, it’s about doing different things. Chiefly this means re-thinking and re-designing the way we live to provide for a sustainable future. While continued efforts like Abu Dhabi 2030 and New York’s “PlaNYC” can provide the blueprint, we still must convince a vast majority of Americans that it’s possible. Urbanization should no longer be looked at as a mayor’s pipe dream to create money and business for the city, but as a practical way to achieve minimal carbon emissions.

Apart from the traditional techniques listed on the website above, there are countless measures we can take to make cities more sustainable and desirable places to live. While many of these are certain to work, the issue still seems more psychological to me. We have to rid ourselves of the notion that urban, suburban, and rural are interdependent entities we can all be a part of at once. Instead we should focus our resources on urban centers, letting only the necessary agricultural population live rurally while almost everyone else lives in or close to compact, neatly regulated cities.

Of course radically changing Americans’ beliefs about anything, much less their comfortable suburban lifestyles, will never be easy. One of the greatest hurdles to this may just be our country’s physical expanse. When I lived in Copenhagen for 5 months, it amazed me to see how far the urban area seemed to extend outside the heart of the city even though the city itself wasn’t that big. Those living on the outskirts could easily navigate their way into town by bike (a preferred method) or train without feeling the suffocating barrage of cars, traffic, and pedestrians everywhere.

For some reason in this country we are captivated by the ideal of the urban metropolis. We think the presence of huge office buildings somehow proves our economic and international potency. This may be fine from an urbanization standpoint – building up isn’t always a bad thing – but not when the people who work in those buildings drive their SUVs 40 miles home every day to their centrally air-conditioned mini-mansions. While Americans take pride in sacrificing for a war or providing aid, they seem to care less when the sacrifice is for something less tangible and perhaps more important: our future.

Any political measures to disrupt this corrosive cycle will of course come with some backlash from the auto and energy industries and their respective lobbying ilk. But like so many of the other issues we face today, this is one that requires a staunch sense of vision from our political leaders. This means not succumbing to the pressures of economic and political capital, but finally standing up for what needs to be done.

As many great leaders have noted in the past, sometimes people need to be told what is right rather than decide for themselves. It’s time to put aside our selfish indulgences and show that we can make this work, both for the greater good of coming generations and one another.


1The population of the city is expected to rise dramatically by then, soaring from its current level of 900,000 to more than 3 million by 2030.

Friday, May 21, 2010

From Robert to Eldrick

Many questions have been asked lately about Tiger Woods' neck injury and subsequent withdrawal from the Players Championship: Was it related to the car accident in November? How long will it take him to recover? Is there a connection to the infamous Dr. Galea? Yet one concern that’s flown a little under the radar during this whole mess is Tiger’s history of back and neck problems, particularly related to his swing change in 2003.

As my adoration of Tiger began to take hold at an early age, I vividly remember my dad scoffing at the notion that Tiger would become the greatest golfer of all time. Though he conjured up a variety of reasons for his skepticism – primarily stemming from his own adoration of Jack Nicklaus – there was one in particular he believed would hold Tiger back: his swing. “No one can sustain a career with that type of swing,” I’d hear him say. “It puts too much pressure on his spine. He won’t make it 10 years on the PGA Tour.”

Well, clearly that didn’t happen. Clearly Tiger is already widely considered to be the greatest golfer to ever live. But let’s just pose a hypothetical for a second and say that Tiger’s career ended in 2010. Due to residual damage from his old swing, Tiger could no longer put that same physical strain on his body and was forced to retire at the ripe old age of 34. If that were to happen, would he still be considered the greatest golfer of all time?

Of course there’d be arguments on all sides of the debate. I’m not going to stoop down and say what I think – even though it’s probably obvious at this point – but I will share a little piece of history that makes things a little more interesting.

Before TW came along, Jack Nicklaus was unequivocally hailed as the greatest man ever to pick up a set of sticks. He shattered almost every record in the book, at a time when the competition was as stiff as ever. But what if the Golden Bear really wasn’t the best? What if there was someone who retired early from the game – like my Tiger hypothetical – who had all the same potential? Well, lo and behold, there was someone. And he went by the name of Robert Tyre Jones Jr.

Dutiful golf historians like myself will remember that, other than Tiger Woods, Bobby Jones is the only man to hold all four major tournaments at one time (and he did it in a single calendar year). If Tiger somehow doesn’t eclipse Jack’s record of 18 major tournament titles, this may turn out to be a fitting coincidence.1 Jones unfortunately retired from the game voluntarily at the age of 28 to pursue a law career in Atlanta, but his record of 13 majors stood for more than 40 years after his retirement.

By all indications, Tiger likely won’t pull a “Bobby Jones” any time soon. Aside from his prolific talent, Tiger is also the most physically fit and fiercely determined golfer ever – which means he will essentially force his body to recover in order to demolish every record in golf history.

What the Jones parallel really points to, however, is the over-emphasis we put on records in today’s sports culture. While records are no doubt important and useful, they also don’t always tell the whole story. Worse yet, records can ruin our appreciation and connection with the game itself. Certainly Bobby Jones understood this when he said, "It (championships) is something like a cage. First you are expected to get into it and then you are expected to stay there. But of course, nobody can stay there."

Unfortunately, the attention on Tiger’s recent mishaps all boils down to an expectation. If Tiger weren’t poised to break golf’s most coveted record, he obviously wouldn’t receive half as much attention as he does. The sheer glimmer of possibility that he actually won’t do it makes everyone that much more interested.

Over the years Tiger has made it evident that he cares about nothing more than winning majors. He plays in far fewer tournaments than the average player, and the thought of missing the U.S. Open is probably eating him up inside. Yet if I were Tiger, I’d probably try to take a page out of the Bobby Jones book and view those championships a little less seriously.

If he starts to have some fun, if he focuses less on titles and numbers and more on the game that he grew up loving, maybe he will reach that peak a little sooner. And who knows? Maybe somewhere along the way he will develop that fleeting human quality he’s never seemed to have – that now more than ever, he desperately seems to need.


1It makes it all the more coincidental that Jones’ return to golf was halted by a spine condition called syringomyelia, a fluid-filled cavity in his spinal cord that eventually led to his paralysis.

Tuesday, May 4, 2010

On Hippies

Check out my discussion with Bank about hippies on his blog. Read the comment to see my response.

Monday, March 8, 2010

Facebook's Foibles

When I first created my Facebook profile a little under 4 years ago, neither I nor anyone else could’ve ever imagined it would come to this. Call it naïve, innocent, shortsighted even, but I never thought that my Facebook interests and information would some day direct what advertisements I saw online. Though maybe I should have.

My decision to post “Phish” and “The Big Lebowski” as my only 2 interests may have been a subtle harbinger of the advertising juggernaut we see today – an early inclination against sharing information that might come back to haunt me (though it was most likely due to the fact that I thought I was one witty 18-year-old).

Before I get into the legitimacy of my early Facebook instincts, however, I want to talk for a second about the dirty little business known as advertising. As most of us know, advertisers try to convince us we want things that most of us probably don’t even need. Whether by word of mouth, “effective” ad campaigns, or plain reputation, companies brand their products to stand out from competitors that can usually provide the same desired result. Just as a quick example, sitting in my room right now are a box of Nutri-Grain bars and a box of Pop-Tarts. At some point, either in my childhood or more recently, I got it in my head that both foods satisfy my on-the-go breakfast needs. Even though other foods can do so just as capably, I stick with these two brands because I’m simply comfortable with what I know.

All boring anecdotes aside, ads that cater to what we know (or at least what they want us to know) are most effective. They don’t want us to think that all breakfast foods or all T-shirts or all flower bouquets will do the job. Instead, they want us to think that only their products can do the job because they match our interests, our knowledge, or our demographic. It follows then that ads don’t exist to meet our needs (apologies for the Marxist rhetoric), but to capture our attention, either emotionally, graphically, or in any other way they can think of.

The more advertisers understand about where that attention lies and how it’s attracted, the better positioned they are to capture it. This explains why Facebook is in many senses an advertiser’s wet dream. Instead of bothering with psychographic surveys and experiments, advertisers now have a direct tap into our emotions and desires. All someone has to do is put the Yankees or the Giants under his interests and he automatically sees an ad linking to StubHub or Ticketmaster for NY sports tickets.
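Just to make the mechanism concrete, here is a toy sketch (in Python) of the kind of interest-to-ad matching I’m describing. The ad inventory and function names are invented for illustration; this is a guess at the general idea, not anything resembling Facebook’s actual system:

    # Hypothetical, simplified interest-based ad targeting.
    # The inventory below is made up purely for illustration.
    AD_INVENTORY = {
        "yankees": "StubHub: NY Yankees tickets",
        "giants": "Ticketmaster: NY Giants tickets",
        "phish": "LiveNation: Phish summer tour dates",
    }

    def ads_for_profile(interests):
        """Return every ad whose keyword matches one of the listed interests."""
        return [AD_INVENTORY[i.lower()] for i in interests
                if i.lower() in AD_INVENTORY]

    print(ads_for_profile(["Yankees", "The Big Lebowski"]))
    # -> ['StubHub: NY Yankees tickets']

List “Yankees” and the ticket ad follows you around; list only “The Big Lebowski” and, at least in this crude keyword-matching picture, the advertisers come up empty.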

While the formula they’ve concocted seems perfect – especially for us elusive and attractive Millennials – it doesn’t always work out that way. Part of this has to do with people like me who don’t publicize their interests, but an even greater part, I’d say, has to do with Facebook as a mode of communication. For all the information advertisers have on their subjects’ interests and feelings, they don’t really have a viable way to reach them. All they have is a small, peripheral picture along with 3 to 4 lines of text, which pales in comparison to the impression left by a 30-second televised ad or even a full-page print ad.

For the past half-century or so, advertising has been something of a cat and mouse game, an effort to foster a sense of emotional attachment with items that ostensibly have none. While advertisers can know everything about what we like, who we are, and what we do, they can’t always gauge our reactions. The degree to which an advertisement fails to stir our emotions is usually the degree to which it fails as an advertisement.

As consumers we don’t always relate to what’s overt or obvious (like an online poker ad served because we list poker under our interests), but sometimes to what’s more subtle and engaging. The reason comedy is so prevalent in advertising – which has something to do with why online advertising in general hasn’t been successful – is that it communicates a feeling that can’t be captured by a static medium. We don’t want to be hit over the head with simple, blatant, or in some cases downright creepy directives. As with all other art forms, we appreciate the grace it took to capture our attention, and with advertising we have the capacity to reward the ones that succeed.

While I don’t know what the future of online advertising will hold, I do know some of its brief history. Little niches have succeeded here and there – like the Wall Street Journal, which serves a finance audience needing news in real time – but most have failed. I don’t think Facebook will fail necessarily, but I don’t think it will turn out to be the advertising haven some expected either.

As much as it’s time to embrace the Internet and social media – particularly for me as an aspiring journalist – it’s also time to be wary of overstating their potential. For now, those fearing the death of television and print media can rest easy knowing that they aren’t going anywhere anytime soon. Advertisements like this one have proven successful for a reason (though this year’s Super Bowl ads might indicate that they won’t be for much longer).

As sneaky and deceptive as it often is, advertising will always remain more of an art than a science. Lagging behind fields that have made the leap more successfully – journalism, photography, video – advertising has yet to see a successful online transition. How long that will take, no one really knows. But there is one thing we do know for sure: with the ever-increasing number of Internet users, once the transition comes there will be a whole lot of money being made. And all I hope is that I’m on the other end.

Saturday, February 13, 2010

Vocation Vacation

Somewhere along the course of my college application experience, I had a conversation with my father in which he plainly asked me, “Do you want to go to college?” Apart from his seeming desire to be relieved of the financial burden, he was hinting at the more trenchant observation that college isn’t necessary for a “successful” life. Growing up where I did and when I did, however, that concept seemed preposterous. This wasn’t because I didn’t think it was true (we’ve all heard success stories of people who didn’t go to college) but because I had never even considered the possibility. Almost everyone around me, down to my lowest-achieving peers, went to college, and we all accepted this as the norm. Obviously, I followed suit.

A couple of times over my 4-year tenure at Syracuse I’ve thought back to this decision and reconsidered it in light of what I know now. What I think about has little to do with the fact that I’m looking for a job in the midst of a severe economic downturn, though you could say that plays a role. What I really brood over is the time I might have wasted here when I could have been (1) earning a living on my own, (2) learning the vocational skills necessary for success in my career, likely for a lot less money than I’m paying now, and (3) becoming a productive, contributing member of society.

Like most of my high school classmates, I didn’t exactly “know” what I wanted to do with my life at that point, and I attended college in part to find out. And to some extent it worked; I now have a better understanding of how I want my career to take shape, if only in a general sense.

But let’s suppose for a second that I didn’t grow up under the circumstances I did. Let’s suppose I lived in a district where unemployment was at 28%, where an average of 40% of students attend high school classes regularly, and where just 36% of college-qualified students complete college. If I lived there and achieved the same grades I did in high school, would I still think my father’s question was preposterous? Likely not. Alongside searching for college scholarship money, I would probably be looking for a job. And if those scholarships fell through, I would hope that I was prepared to work somewhere requiring more than a minimal skill set. I would hope that I had learned skills in high school I could fall back on to earn a respectable living. This brings me to the Hunts Point High School for Sustainable Community Initiatives.

The school, proposed by local teacher Steve Ritz, aims to integrate job training into conventional academic study to prepare students in the South Bronx for careers in “green” technologies. The initiative would foster expansion of important areas like urban farming and natural resource management, areas that my tree-hugging, hippie ass obviously loves (read more about it here). It would also take root in the very district those statistics I mentioned above describe: Hunts Point, in the Bronx.

By all estimations, there is no reason why this high school shouldn’t already exist. Vocational and post-secondary schools are prevalent throughout the country, many in high-income suburban areas. Yet despite a Bronx Community Board voting unanimously to approve it, the proposal has been rejected twice by the Department of Education. It’s not hard to guess the reasons why – starting with a lack of funding and adequate teachers – but guessing isn’t what I’m here to do. Nor am I here to advocate for this school in particular, even though I agree it should pass.

Rather, I want to look at Mr. Ritz’s rejected initiative as an emblem of the systemic failures in our country. Above all, it represents government’s inherent reluctance to approve any idea that requires money, no matter how innovative that idea actually is. Here we have an idea that’s beneficial on at least three fronts: it will create “green” jobs (which Obama recently called “the driver of our economy over the long term”); it will raise graduation rates among low-income students by teaching them on-the-job skills; and it will stimulate the kind of environmental progress this country desperately needs.

That it hasn’t yet passed is a shame. That ideas no longer stand alone – instead subjected to a stifling rigmarole determined by bureaucrats who care only about the number of zeros behind them – is a shame. The status quo is so fixed that it seems to actually hinder independent ingenuity.

While there is no easy solution to this labyrinthine dilemma, there is one thing I have learned these past few years that might help: all companies, and green businesses in particular, need specialized vocational skills. Whether they build biotech fuel cells or simply handle recycling, employers need all the fresh thinking and help they can get. Students, particularly those who might otherwise end up on the streets or worse, can lend invaluable manpower and innovation to the burgeoning race over green technology.

Knowing what I do now, it would’ve been tough to pass up an opportunity to learn a valuable green technology in high school, especially when the field was just emerging. We must capitalize on an opportunity to teach this legion of young minds about a field they can truly get behind. If green business is really the twin economic and environmental boon society needs, we shouldn't let it be thwarted by an outmoded education system, but should instead make every effort to let it grow.

Saturday, January 30, 2010

The Fight Against Early Millennial Extinction

A developing but increasingly popular debate today deals with how the Internet, with its seemingly immeasurable wealth of information, affects the minds of the young people who depend on it. While another popular debate, about kids using electronic devices – which the NY Times hosted an online discussion about only yesterday – concludes that scaling back their use is ultimately beneficial for a child, the one I’m interested in doesn’t have such a clear answer. The biggest reason is that, for all its diversion and superfluity, the Internet has quickly become our generation’s most valuable tool for finding information, whereas most electronic devices don’t carry such value.

An article I read last May in New York Magazine called “The Benefits of Distraction and Overstimulation” provided perhaps the most extensive investigation into this issue to date. With greater panache than I will ever have, author Sam Anderson takes a deep look into the many problems plaguing us so-called Millennials: our inability to focus on and complete tasks from start to finish; our relatively weak tolerance for distraction; and our desire to remedy these problems by “doping” our minds into focus (my college audience shouldn’t have any trouble comprehending this).

In spite of all these hurdles impeding our ability to truly focus (if we even can), Anderson concludes that we will emerge with a unique ability to multitask and bring disparate pieces of information together to create something original. Here’s a little summary: “More than any other organ, the brain is designed to change based on experience, a feature called neuroplasticity. As we become more skilled at the 21st-century task [David] Meyer calls “flitting,” the wiring of the brain will inevitably change to deal more efficiently with more information. The neuroscientist Gary Small speculates that the human brain might be changing faster today than it has since the prehistoric discovery of tools. Research suggests we’re already picking up new skills: better peripheral vision, the ability to sift information rapidly.”

While I don’t doubt that our brains are adapting to meet the ever-changing demands of an ever-expanding Internet, I do doubt that this change is necessarily good for everyone. All it really means is that those with an inherent ability to multitask will benefit while those with less natural ability will struggle to keep up. Obviously this is all part of the Darwinian1 selection process, but it also raises a number of interesting questions about its implications. Chief among these, I think, is whether some of us are sacrificing our “natural” talents in favor of Internet-related skills that society tells us we should be good at.

Now, as a senior getting ready to graduate, I recognize my own subjectivity and stake in what I’m about to say (especially since I’m saying it on a blog). Yet I can’t help but notice how employers, particularly media employers, look for an increasingly varied skill set: HTML coding, proficiency in Adobe programs, proficiency in Microsoft programs, knowledge of social media and networking sites, or plain Internet “savvy,” just to name a few. I understand that having a diverse range of skills is helpful in today’s job market, but I also think cursory knowledge of many skills leads to less-than-expert knowledge of any one skill.

I know that many people are multi-talented and can learn to be proficient at more than one thing. I also know that people can adapt their talents to the requirements of online culture. What I’m really arguing is that not everyone who has talent is talented at more than one thing, and pushing such a person to try may only inhibit his or her ability to pursue that one thing.

The Internet culture we’ve become so accustomed to has, I believe, forced us into the mindset that we are losing if we can’t do everything, and that we are stupid if we don’t know everything. Devoting real attention to any one aspect of our lives starts to seem like an exercise in futility, given everything else we may be missing out on. Recently I read that, due to the acceleration of the media, the amount of information produced up until 1940 equals the amount of information produced since 1940.2 While this wasn’t the first time I came across this statistic, it was the first time I thought about what it actually means for our cultural psyche. I thought about the competitive, almost ruthless way in which we, as self-respecting cultural citizens, are expected to digest this galaxy of information. We assume that because access to information has increased, we should all know more. Yet we can’t possibly make ourselves know more. We can know about a lot more things, but the more things we know about, the less we know about each of them – and that gets to the heart of the problem with our new information age. We simply know less about more.

I suppose this might not be a problem that I have with culture (who am I to argue for or against a trend?) but more of a problem that culture has with me. I’ll be the first to admit that I’m not the type of person who likes to load my plate with thousands of different stimuli. Whereas many of my peers listen to music or watch TV while doing work, I typically have no such distractions. Whereas the average person in my generation has 8 tabs open in his or her Internet browser, I rarely have more than 4, and I even become anxious when I have more because I think I forgot to do something or left a tab there for a reason (self-diagnosed “browser-tab anxiety”). Yet I am the type of person who likes to think through almost everything I encounter, who won’t stop until I understand something to its absolute core, and who lies awake at night thinking about concepts like “post-irony” until I can wrap my head around them.

Does this mean I’m a dying breed? Well, not exactly. I used to lament the fact that I wasn’t born 100 years earlier, when everything seemed so much simpler and I wouldn’t have had to worry about these types of problems. But I now realize that it’s people like me who are forming the nascent backlash against such lofty ideals as “cultural globalization,” where people adopt universal and/or random ideas into the way they act because there’s simply nothing new. We form the link from new-age eclecticism back to specialization and individuality. We will usher in a new era where people will turn away from 30-second sound bites and 500-word blog posts in favor of depth that requires a greater and more nuanced attention span.

So yes, my societal value may be down for now, but as we all know history moves in cycles. I’m no dinosaur; I just hope I subsist long enough to witness my revival.

1I am loath to say “natural selection,” since we are talking about something as diametrically opposed to nature as the Internet.
2I also read this in a book published over 20 years ago, which means that, due to the advent of the Internet, the latter has now probably far surpassed the former.