

Facebook Goes Back to Basics

For the vast majority of Facebook users, there is no notion of Facebook without News Feed and its inevitable stream of political polemics, cat photos, and here’s-me-on-Mount-Everest status updates. But Facebook existed for 2.5 seemingly interminable years without it. Plus, its introduction in September 2006 sparked a strong backlash that, in retrospect, looks weird and out of step with the “progress” of social media, but also predictive of the byzantine privacy maze that would ultimately drive numerous Facebook users away to Snapchat, Tumblr, and Instagram (and yes, I know that Instagram “is” Facebook, technically, but all of its value-add was created by the original Instagram team, and it would never have succeeded if mobile-addled Facebook had conceived it).

I began using Facebook in the late summer of 2004 on a Dell desktop running Windows XP (almost everyone at my university had a similar machine; if I were to return today, I suspect I would struggle to find any student who had a desktop running any OS). It’s startling to think about what a cesspool consumer Wintel computing was at that time. Microsoft’s blasé attitude toward security meant that your machine could contract a terminal virus anytime you ventured into the wild west of Internet Explorer. Meanwhile, the wide-open, custom-wallpapered world of MySpace was the default “open” (very much so) tool for networking with friends online, and it possessed a similar “enter if you dare” air. In this context, Facebook was startling. A walled garden with a minimalist look, limited to other university students, and safe: it was an oasis. It was a harbinger of the end of Wintel dominance (which incidentally peaked that year) and the rise of newer, more closely controlled platforms.

Opening up pre-News Feed Facebook took you to your profile, which included such oddities as a fully editable “wall,” a laundry list of your attributes (birthday, last update, etc.), and a modest “Quick Search” box in the upper left, which let you find your friends or perform general stalking (just kidding). During this time, the careful pruning and customization of one’s profile consumed most of the time a user spent on Facebook.

News Feed changed that in 2006. It was perhaps Facebook’s last innovation, the one that made Facebook look like it does now, and it was very forward-looking for a company that would in subsequent years so often find itself playing catchup. Twitter had launched only a few months earlier, but wouldn’t really take off for several more years. News Feed was, for ~8 million users, the first look at an easily navigable timeline of status updates and posts. Facebook was now as much about reading the brags of others as it was about bragging on one’s own behalf. It was a flood of information, like a mini-Google for one’s own contacts.

But on the back of Facebook’s runaway user growth in the late 00s and its attendant drive toward monetization, News Feed’s value as a way to get, well, news about friends, began to deteriorate. The informally dubbed, completely opaque EdgeRank algorithm began to prioritize certain types of posts and traffic, meaning that you might never see a friend’s post without having to go to her actual profile. Facebook Ads further muddled the News Feed with “targeted” garbage, making AdBlock all but necessary for viewing the site, but even that slight fix didn’t address the eccentricities of EdgeRank.
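To make that prioritization concrete, here is a minimal sketch of how an EdgeRank-style score is commonly described: each post accumulates a score from its interactions (“edges”), weighted by your affinity with the person involved, the type of interaction, and how recent it is. All of the names, weights, and the decay half-life below are illustrative assumptions, not Facebook’s actual values or code.

```python
# Minimal, illustrative EdgeRank-style scoring:
# score(post) = sum over its edges of affinity * edge_weight * time_decay.
from dataclasses import dataclass
from time import time

# Hypothetical per-edge-type weights (e.g., comments counted more than likes).
EDGE_WEIGHTS = {"comment": 4.0, "like": 1.0, "share": 6.0, "tag": 8.0}

@dataclass
class Edge:
    kind: str          # "comment", "like", "share", ...
    affinity: float    # how close the viewer is to the actor (0..1)
    created_at: float  # Unix timestamp of the interaction

def time_decay(created_at: float, half_life_hours: float = 24.0) -> float:
    """Older interactions count exponentially less (assumed decay curve)."""
    age_hours = (time() - created_at) / 3600.0
    return 0.5 ** (age_hours / half_life_hours)

def edgerank_score(edges: list[Edge]) -> float:
    """Sum of affinity * type weight * decay across a post's edges."""
    return sum(
        e.affinity * EDGE_WEIGHTS.get(e.kind, 1.0) * time_decay(e.created_at)
        for e in edges
    )

# A feed would then sort candidate posts by this score, descending:
# feed = sorted(posts, key=lambda p: edgerank_score(p.edges), reverse=True)
```

The opacity complained about above stems from the fact that the real weights and decay curves were never published, so users could only guess why a given friend’s post never surfaced.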

Basically, Facebook (in its defense) matured at an odd time in computing. Its explosion in growth neatly dovetailed with the rise of the iPhone and Android. Along the way, it clung to the desktop-born paradigm of having a “profile” while also trying to keep up with new mobile app usage paradigms. It’s funny to look back at it now, but Steve Jobs’ demo of something as completely-taken-for-granted-now as a scrollable contacts list (when he showed off the first iPhone) forced Facebook’s hand. It needed better, more scrollable content in a continuous stream, rather than the discrete profiles that had been its original bread and butter.

The obvious solution to this need has been more focus on images and videos, both from a content curator’s perspective (for Facebook) and a user’s perspective (since sharing images/video feels a bit easier than typing out long textual updates on a mobile device). Facebook has been forced to keep up with both Google+ and Tumblr on these fronts, two networks that came of age once mobile was already in full swing and hence had more time to accommodate image sharing and streams/feeds from the ground up.

Facebook is too late in making these changes. At the latest, it could have implemented them several years ago when Flipboard launched, since Flipboard’s ability to aggregate your Facebook feed is already a far more progressive view of how one’s News Feed and Wall can be translated into a mobile-friendly, images-first format. Of course, the number of Flipboard users is small relative to the number of Facebook users, so Facebook hasn’t been hurt by its dallying. And that’s part of the issue: Facebook, like the horrible XP cesspool I mentioned at the outset, has so many users that it can almost afford to be lazy or careless with the quality of its product, since critical mass was reached so long ago and the costs of leaving can be painful.

Today’s Google+-inspired updates to the News Feed – which now permit multiple feeds and give greater priority to photos and videos – have a ton of minuses for users, including louder/more prominent ads (for the sad souls who don’t use AdBlock) and more opacity in terms of the EdgeRank algorithms. But they may have a slight plus (no pun intended), too. With the gradual pollution of the News Feed and the concomitant rise of Graph Search (even if overrated), there may be an opportunity for individual profiles to shine again.

Tired of News Feed? Then use Graph Search to get away and find profiles more easily. It’s perhaps ironic that Facebook’s efforts at mobile-centric modernization may take it back to its roots as a profile-based service (with an assist from Graph Search), but I think it’s a predictable consequence of Facebook’s monetization. It isn’t part of a wider ecosystem like Google+, nor well-defined by a particular demographic (artists) like Tumblr, so it has to be increasingly forward in how it tries to get revenue from its incredibly varied users. With these stabs at making more money from larger photos and larger ads, user fatigue may continue to rise and drive users back to the profile basics, whether they were there in the 2004-2006 ancient-history era or not.

But will that profile-checking occur on Facebook or elsewhere? A major part of Facebook fatigue is that Facebook has too many opaque mechanisms – byzantine privacy controls, EdgeRank algorithms, inconsistent/unpredictable search results – which get in the way of actually connecting comfortably and safely. For many users, these obstacles can be overcome, in the way that enterprise software users often overcome terrible products and continue being productive, albeit at some cost. The relative anonymity of Tumblr or the clean feeds/profiles of Google+ provide real alternatives, but as much as I’d like to hope that better products will win out, Facebook will sputter on, ’til at least 2023 or something.

-The ScreenGrab Team


10 Basic Chrome OS “Apps” to Get You Started

Chrome OS has come on strong in 2013, thanks to the proliferation of cheap but reliable machines from Samsung and Acer, as well as the meaningless (for now) glitz of the Chromebook Pixel. While some people may easily embrace Chrome OS’s continuous, Web-based model of computing, others may balk at a platform that has no native apps except for a browser and a file manager.

Fortunately, Chrome OS still gives us the illusion of having discrete apps that can be docked and clicked to open their own webpages. Here’s a list of some easy-to-use apps to get you started:

Evernote

Recent hacks aside, Evernote is a reliable tool for storing or creating just about any type of content (text, photos, quotes, videos, screenshots). The Web-based version is fast and lightweight but still highly functional, making it a great counterpart to the native Evernote apps elsewhere.

New York Times

This “app” takes you to the Chrome-optimized version of the NYT site, which is far less cluttered than its standard page. It also supports Chrome OS’s desktop notification system, which is handy for keeping track of breaking news.

Gmail Offline

Gmail can sometimes be slow, an issue further compounded by limited resources on many Chromebook models. Gmail Offline solves two main issues for Chromebooks: it lets you manage your email more quickly, and it gives your device some real (and rare) offline functionality.

NPR Infinite Player

Infinite, free, customizable listening to NPR stations.

PicMonkey

I’ve actually gotten GIMP to run (albeit painfully slowly) on my ARM Chromebook, but this is a solution much more suited to Chrome OS’s style. It allows for some light photo editing and sharing, with the option to upgrade for more sophisticated features. It also has a handy extension for detecting, capturing and editing images on the current page.

Google Play Music

This one actually comes bundled with Chrome OS. Its Web app is one of the easiest ways to listen to music online, and a must-have in the absence of a fully functional Spotify Web app. You can listen to any of the songs stored in your Google Play Music locker, or songs purchased from the Google Play Store.

TweetDeck

A Web app that runs in a native app-style standalone window, TweetDeck is the best way to use Twitter on Chrome OS. Luckily, it also seems to be getting even greater attention from Twitter now that the iOS, Android, and AIR versions of TweetDeck are being retired.

Write Space

My favorite text editor for Chrome OS. White on black, simple, and fast.

IMO Messenger

No native IM apps? No problem! IMO lets you manage all your major IM accounts (AIM, Skype, Jabber, Google Talk) from its Web app.

Pandora

Perhaps a stretch, since this app is just a link to the usual Pandora website, but it’s free music (or paid, higher-quality music, if you have a subscription) nonetheless.

-The ScreenGrab Team

Twitter’s TweetDeck Tweaks a Validation for Chrome and Chrome OS

Quick entry, since the last one is a bit longer.

Twitter has announced that it will abandon the iPhone, Android, and Adobe AIR versions of TweetDeck, a powerhouse Twitter client that it acquired two years ago. It’ll also cease Facebook integration, which I find unsurprising given Facebook’s gradual decline. It’ll maintain the native Mac and PC apps, but it really wants people to use the Web and/or Chrome versions.

The focus on Chrome is interesting. TweetDeck for Chrome has gotten access to some features (like notifications) more quickly than other versions, and its sleek, almost Holo-esque column layout is seemingly made for Chrome’s aesthetic.  On Chrome OS, TweetDeck is indispensable: it maintains its beautiful Web aesthetic while also running in its own application window, a trait reserved almost exclusively for the Chrome OS Files app, which manages your downloads and Google Drive storage. Other than possibly Falcon Pro for Android, it’s the best Twitter experience I’ve had on any platform.

For Twitter power users and Chrome OS fans, there’s something validating about TweetDeck migrating to a largely Web-based existence. For one, it validates Chrome OS’s approach to software, that is, that software can serve most people’s needs by simply running from within a well-designed browser. But Twitter’s move here also demonstrates the subtle divide between mobile and desktop OSes – Twitter’s statement said:

“Over the past few years, we’ve seen a steady trend towards people using TweetDeck on their computers and Twitter on their mobile devices,”

meaning that native remains the way to go for mobile while the Web is good enough for desktops. Chrome OS’s subtle blend of a Web TweetDeck app that runs in its own window as a pseudo-native app really is a perfect distillation of how Chrome OS nimbly straddles the mobile/desktop OS boundary.

In any event, Twitter seems to be emphasizing that it is a platform and not an “app.” It’s been a rough go lately for 3rd-party Twitter apps, thanks to the new API rules that limit those apps to only 100,000 user tokens per app, and now this scaling back of TweetDeck. The migration away from app-centric computing bodes well for Chrome OS, although I doubt that the native Twitter apps for iOS/Android are going away anytime soon.

-The ScreenGrab Team

Rebutting David Wong’s “6 Harsh Truths”

No article has been shared with me more this year than the pseudonymous David Wong’s “6 Harsh Truths That Will Make You A Better Person” entry for Cracked. It is exactly what it sounds like: a list, written in an aggressive “face-the-facts” voice, of “truths” that purport to improve your worth as a human being. It’s hard to know whether its reference to the Four Noble Truths is intentional or coincidental, though I’m guessing the latter in light of the author’s literary and cultural carelessness.

The problem with “The World”
Jason Pargin (Wong’s real name) traffics in broad, ill-defined terms such as “the world.” The assumption appears to be that all persons experience the same reality, a state of affairs that does not hold even for individuals in the same culture, let alone persons separated by thousands of miles and linguistic barriers.

Along those lines: have you ever stopped to think that “globalization” is just a bullshit word for “wouldn’t it be great if the whole world spoke English and adopted neoliberal economic policy?” “Globalization,” or “cultural imperialism” as I think it should be termed, seems so reductive and small to me in light of the default cultural setting – for millennia – being separation. Have you ever tried to compare a Turkish book with its English translation? There’s tons of info being lost there, and “the world” created by “globalization” is not going to reclaim it or make it intelligible to the lowest common denominator.

Anyway, Pargin says:

“If you want to know why society seems to shun you, or why you seem to get no respect, it’s because society is full of people who need things. They need houses built, they need food to eat, they need entertainment, they need fulfilling sexual relationships. You arrived at the scene of that emergency, holding your pocket knife, by virtue of your birth — the moment you came into the world, you became part of a system designed purely to see to people’s needs.”

Here, “the world” appears to mean “21st century Western society in the wake of neoliberalism,” which is a strange default choice for human experience. Moreover, Pargin is addressing persons who want things – how ridiculous is it to say that someone living in the prosperous West “needs” entertainment? Most of all, “they” (and it really is an issue in Pargin’s text, trying to figure out exactly who this vast, gray sea of “they” is) want money and the convenience that comes with having more money.

There is no “world” that is constant across all time, nor any singular truth that issues from it, since truths are socially constructed. Look at how mankind has constructed different narratives – The Fall, progress, evolution, disruption – to explain the same phenomena throughout history. People from the 17th century may as well have lived in a different world entirely. Don’t take my word for it, though; take that of philosophy professor Ron Yezzi, who succinctly summarized Michel Foucault’s outlook on truth:

“Claims for a neutral, objective pursuit of knowledge that culminates in truth independent of the pursuit and exercise of power is an illusion.”

It seems that Pargin realizes this, if only accidentally, by identifying multiple items as “truths,” rather than the more semantically absolute “Truth” or “the truth.” But he hedges by making sure that we know that said truths are “harsh,” which strikes me as an aesthetic term that doesn’t carry the right weight for describing subjective thought. Do truths smack you in the face, drag you across the pavement, and leave you with rashes?

What Does it Mean to be “Harsh”?
The degree to which they are “harsh” depends only on how fervently one believes in and worships them. The person who craves monetary success as the “true” form of happiness, for example, will make sure that everyone knows about and feels the consequences of his truth. The entire ordeal, all the way to death, will feel “harsh” as he vigorously pursues a small notion of happiness through a tiny tunnel, using “truths” as a coping mechanism for some emptiness. David Foster Wallace once said:

“There is no such thing as not worshipping. Everybody worships. The only choice we get is what to worship. And the compelling reason for maybe choosing some sort of god or spiritual-type thing to worship–be it JC or Allah, be it YHWH or the Wiccan Mother Goddess, or the Four Noble Truths, or some inviolable set of ethical principles–is that pretty much anything else you worship will eat you alive. If you worship money and things, if they are where you tap real meaning in life, then you will never have enough, never feel you have enough.”

But Pargin’s worship isn’t religious (or maybe it is; see below), and it seems to fall into the “eat you alive” camp, as evinced by the syntactical and argumentative madness (maybe it’s just the Cracked in-house voice at work) to which it has driven its author.

Pargin himself is a winner of sorts. He made it through the gauntlet of bullshit and corporate money that doubles as “the writing world” to publish the successful John Dies at the End, which also became a film. Here’s how he described his own ordeal:

“I was the world’s shittiest writer when I was an infant. I was only slightly better at 25. But while I was failing miserably at my career, I wrote in my spare time for eight straight years, an article a week, before I ever made real money off it. It took 13 years for me to get good enough to make the New York Times best-seller list. It took me probably 20,000 hours of practice to sand the edges off my sucking.”

I don’t care about his self-admitted shittiness; I care about the words “real money” and “best-seller list,” which denote him having “made it,” in American Dream parlance, independent of quality.

What do Pargin’s truths look like to someone who doesn’t worship the same things? Pretty odd, in the same way that Wiccanism looks odd to a Pentecostal.

It’s hard to parse the “harsh truth” mantra – how is any given truth, in a world in which multiple truths are available for subscription and belief, “harsh?” If I don’t believe in Wiccanism, then does the continued existence of Wicca’s principles (they aren’t being disproven, after all) constitute a “harsh” truth for me in the eyes of a true believer?

It’s as if the multiple truths that Pargin introduces – many of them tied to certain periods in history, like his screed against hippies – are not things that societies constructed and formulated out of observation and attention to need, but rather Mosaic tablets, inscribed and remembered for all time after being discovered fully formed in the wild. It’s odd that the “rationalist” Western society to which Pargin is so beholden, so particularly enamored of science and the iterations, changes, and evolution that are central to its form, would construct such bizarrely religious and dogmatic truths.

Each society must act as if its own set of truths is the “right” one, borne out by resort to latter-day terms such as “progress” and “innovation.” Pargin’s society is one that favors masculinity, prestige for persons like himself who indulge the West’s fascination with confessing “how things really are,” and well-defined jobs that tie people’s livelihoods to corporate well-being. It’s important to realize just how specific his “world” is, despite its pretenses to some kind of universal morality.

He Doesn’t Know What Glengarry Glen Ross is About
His affinity for Alec Baldwin’s speech in Glengarry Glen Ross is a case in point:

“It’s brutal, rude and borderline sociopathic, and also it is an honest and accurate expression of what the world is going to expect from you. The difference is that, in the real world, people consider it so wrong to talk to you that way that they’ve decided it’s better to simply let you keep failing.”

Pargin is misunderstanding Glengarry Glen Ross. The Baldwin character was inserted into the film version of David Mamet’s play as a parody of the overzealous, bullshitting alpha types that take hold of organizations and run them into the ground. Neither the play nor the film exactly ends happily; the result of all these guys “closing” is the ruination of lives and the collapse of business. That sounds a lot like what happened to “the world” after 2007.

A comment on this post contained a link to an excellent takedown of Pargin’s interpretation of the play/film. For someone who is crafting a vast argumentum ad auctoritatem, the inability to parse and understand literature is a fatal blow, but it’s symptomatic of the writing community’s shortcomings.

“6 Harsh Truths” is the Product of the Dreary Contemporary Writing Community
Like a clichéd undergrad paper, this is where I should swoop in to tell you that it’s more complicated than that. It is... sort of. You see, there’s a “war” on the humanities in America. I’ve been told by a friend (who I wish would guest blog here one day) that there is also one occurring in China. I’m not surprised, as many industrial societies in North America and Asia seem to have given up on the value of humanistic studies. The resort to confessions of a “harsh” world out there is a product of the weird current obsession with “disruptive” and violent worlds in which everything is precarious – especially if it has no obvious relation to capital – for reasons that no one seems to want to examine.

Karen Michalson has an excellent series of blog posts on the war on the humanities, which she attributes to the right wing, the corporatization of American higher education, and the attitudes of artists themselves. Of the three causes (all of which I believe are accurately identified), the third has the most destructive potential, because, well, if a confident sense of destiny can sustain even the most seemingly hopeless enterprises against their scores of external enemies (Christianity’s struggles against Rome come to mind, or the Vietnamese struggle against France, Japan, China, and the United States over much of the 20th century), then what happens when the afflicted parties instead begin believing what their opponents say about them? Michalson’s post about artists is full of virtuosic passages.

On the intersection of the social media age and the arts:

[T]he creative industries (publishing, music, film, etc.) put far more value on “platform” and how many people (dollars) an individual can “bring in” than they do on quality.  Actually, they are utterly disinterested in quality.  So the competition today is not so much which minstrel creates better, more compelling, mind-blowing work than the next minstrel, but which minstrel is better at self-promotion.  As a result of this corporate colonization of the creative arts, which in many respects resembles the corporate colonization of higher education…artists find themselves competing not on the basis of what insights and critiques their work can grace the culture with, but on how effectively they can advertise themselves as viable businesses.  Scholars, intellectuals, and scientists are also forced to compete on the basis of their ability to generate money rather than intellectual capital.

Yup, it’s an extrovert’s world, and the always-on, am-I-really-better-than-everyone-else mentality that the creative industries enforce (and about which another person wrote a stellar entry vis-a-vis Pargin in particular) has made artists less into aesthetes than into would-be titans of industry, titans who have produced nothing but still feel entitled to critical and commercial immortality borne on the back of +1s and Likes.

I mean, just look at Tao Lin. Gawker described him as “an overbearing self-publicist with a literary career attached.” Having read some of his work, I tweeted my thoughts on Gawker’s subsequent quasi-review of a review of Lin (ugh, long train of removals), saying that I thought calling anything “interesting” was the lowest form of criticism. But looking back, it’s a level of criticism appropriate to this exact brand of “public appeal first, quality second” literature.

But, back to Michalson.

It isn’t that artists hate the humanities.  What they hate is other artists.  Particularly other artists who happen or threaten to get more recognition than themselves.  This is true at every level, and results in an unintentionally tragic assault on all creative endeavors that bruises and scars serious humanities studies in the public eye.

Since nothing can be judged or turned into a moneymaking machine without massive approval, the production of art is now a business. As a “writer,” you compete with other “writers” in the same way that Pepsi competes with Coca-Cola. I put “writer” in quotes because, well, everyone is a “writer” now: there’s no special training or guild membership or specific technical skill that sets one apart, no more than there’s an entrance exam for being a blogger. That reality alone is a major part of why the writing competition is so spiteful, so hateful: it’s full of everymen who don’t care about art.

On rage:

I’m not blaming external forces for the decisions these individuals made, but it’s clear to me that the creative industries have worked very hard to create conditions under which these kinds of responses are inevitable.

The more important point is that there is a tsunami of rage out there, and it’s worth examining why. Because even though the humanities encompass more than the critical study of the creative arts, that study is such a vital part of the humanities that the public easily equates the humanities in general with the arts in particular, and throws the creative arts into the current maelstrom of popular contempt for humanities studies.

[N]ow that artistic recognition has been severed from artistic ability (an Orwellian feat that took tons of corporate money to accomplish), and artists are told to put far more time into self-promotion efforts than art, things are much much worse.

Specifically, it’s the “winner takes all” credo that the corporations who own publishing, music, and (increasingly) universities are foisting on an ignorant public. You see, under “winner takes all” the only artists/thinkers/scholars that matter to the public are the ones in the media.

That’s a lot to take in, I admit, and this is a long post. So, just a simple question:

What kind of life lessons do you think Pargin can impart after “making it” as a writer in corporatist America? (And what does his ascent, despite his poor interpretative abilities, say about the state of literature?)

Anyway, for someone who has scaled this mountaintop of sorts (I’m recalling a John Oliver/Andy Zaltzman bit about “a glorious pan across a mountain of bullshit” but I can’t quite think of how to incorporate it here), there’s a certain amount of self-justifying debriefing to be done. E.g. (n.b.: don’t use “i.e.” if you don’t know what it means), “I made it because I did ____,” “I graduated from grad school because I could do research better than anyone else,” etc. This is an understandable urge; everyone wants to feel in control of his destiny.

But in light of Michalson’s utter dissection of how the humanities have been decimated by capitalism (which does NOT care about quality!), is there really any question of “good” or “better” here, or “harsh” or “gentle” (“6 Gentle Truths That Will Make You A Better Person” would be something I would read, if someone would write it)? I mean: how weird is it to pronounce as universal truth the recent political realities of an economic system that has existed for but a fraction of all human history? How many hands created the circumstances under which a writer like Pargin “makes it” while others do not?

Life isn’t Completely about Résumés and Job Hunts

“But make no mistake: Your “job” — the useful thing you do for other people — is all you are.”

What corporatist dreck. Your intellect, your own truths, your private life – that’s just fluff, apparently, and what really matters is how you tap into the highly specified corporate/industrial world of the 21st century. Pargin seems not to realize that the idea of associating one’s job with one’s status and self-worth isn’t some universal concept, and would be incredibly foreign to societies like Classical Greece.

But in his attempts to argue for universality, Pargin trips up when comparing different professions:

“There is a reason why surgeons get more respect than comedy writers. There is a reason mechanics get more respect than unemployed hipsters.”

Surgeons get more respect because they enjoy prestige in a society that has decided to favor medicine over entertainment – despite Pargin’s earlier assertion that “entertainment” (usually in the form of comedy) is a need as pressing as any other. Medicine enjoys prestige not simply for the services it provides, but for the enormously lucrative position it occupies in the United States in particular (it also helps that doctors are a cartel in the U.S.). And even looking back to other historical epochs, how many famous surgeons can you name relative to famous comedians? Are the latter really “less respected”?

It seems that their particular societies, and even “ours” (21st-century America), have granted comedians considerable respect. Prestige and favor can and do shift from one society to another, and not just across history, but among contemporaneous societies, too – would a society that grants less capitalistic glamor to medicine (it’s telling that medicine is Pargin’s first resort in his comparison, indicative of the hold that his society has over him) hold surgeons in similarly high regard?

In trying to fend off “hippie” protests against his self-elevated “harsh” truth, Pargin goes on to say:

“Or think of it this way: Remember when Chick-fil-A came out against gay marriage? And how despite the protests, the company continues to sell millions of sandwiches every day? It’s not because the country agrees with them; it’s because they do their job of making delicious sandwiches well. And that’s all that matters.”

This is a terrible example, due to the particular, narrow demographics that Chick-fil-A serves. Moreover, the labor and economic markets in the West are so beaten down that people don’t have time to consider fast-food ethics in between working endless hours for ungrateful employers. But Pargin seems to argue that it “has to” be this way – as if the idea of a, well, “world” in which corporate employment wasn’t a badge of status, in which there was time to do truly enjoyable things supported by a basic income, were unthinkable, and the current inequality-ravaged state were the only possible outcome.

It didn’t have to end up like this, to paraphrase the Killers. Keynes once posited that automation would reduce the working week to 15 hours by the end of the 20th century – and it probably could have. Instead, political and economic actors invented tons of useless jobs to soak up resources and ensure that most individuals had to spend most waking hours toiling away at occupations that frankly don’t do anything for humanity. To give in to this state of affairs – which came about not because of some objective, deterministic force of “progress” (yuck) but because of political contingencies – is to sadly admit powerlessness. Being an “unemployed hipster”…well, I’m not exactly sure what he even means by that, as it feels like a rhetorical crutch, but clearly this type of position could have value in different “worlds” in which luftmenschen (a great word once aptly applied to the revered Aaron Swartz) contributed to better cultural awareness.

To his credit, Pargin seems to realize that his argument is money-centric, and tries to fend off critics by saying:

“If you protest that you’re not a shallow capitalist materialist and that you disagree that money is everything, I can only say: Who said anything about money? You’re missing the larger point.”

The “larger point” is that, apparently, you have to “close” (Pargin again uses his odd misinterpretation of Alec Baldwin’s imaginary character as a crutch – I wonder in what degree of “respect” Pargin holds the theater aficionados and writers who spent years studying “hippie” arts like drama in order to compose first the play and then the film based upon it), and by “close” he means “benefit other people.” The association between those two terms isn’t clear, and if nothing else, the conflation undercuts his attempt to distance his allegedly universal truths from their narrow capitalist context. Do you want to date someone? Then it’s about what you can “offer,” according to Pargin – the language of the job interview and the résumé-reviewing robot underpins the entire article.

Depression and Suicide are no Joke

Pargin’s entry about individuals hating themselves because they don’t “do enough” is confusing. Do enough for whom? The same society that marginalizes many of them? I think it often isn’t so much a matter of not doing “enough” as it is a matter of not realizing how much one already does, every day. Creative people are unbelievably hard on themselves, even if they do produce quality work. And the difficulty of getting through a normal day for many people (people who occupy the current “world” to which Pargin keeps alluding) is a task worthy of plenty of self-regard. It’s also the case that many persons – more than anyone would like to acknowledge, given that depression ruins more lives than many high-profile diseases – feel bad about themselves because they are depressed.

Pargin uses a confusing fruit/plant metaphor to tell readers that nothing they do matters except in relation to what it provides to others. Do you regard an apple tree as just a bunch of apples floating in mid-air, such that the rest of the tree is invisible, useless, and doesn’t even “exist”? Pargin’s insistent focus on products made for other people isn’t only capitalistic/industrial bias, but bias against interiority or introversion in general – one long screed against persons who prioritize thought over the actions often expected of them by society. That’s not to deny that some people are lazy and obnoxiously pretend to occupy a certain profession (note I didn’t say “existence” or “lifestyle”) when they don’t have the expected set of skills for it. But Pargin again too narrowly frames everything within the confines of capitalist economies run by extroverts.

The end of the article makes a slight recovery by telling people not to fear detractors and to fight against their own urges to just give up. That’s commendable. But it’s still a swirl of insensitivity (suicide is referenced disparagingly in passing) and overly loud prose. I think the only thing “harsh” about anything in this article is its own tone. If it had been entitled “How to Fight Resistance Against Self-Improvement,” it would be more accurate, and I might even recommend it. But joking about suicide is unforgivable – it’s an epidemic that’s often deadlier than war.

So I’m a detractor, you could say. In Pargin’s analysis, this indicates that I likely: 1) feel bad about myself because I don’t do enough; 2) need to do more. Who knows. But I think that this massively popular story deserved a rebuttal because it gives such a narrow view of life – that all that most of us can hope for is to fit into a recently constructed and transient concept of how skills can be used, in this case only as things to be traded within a capitalist system, or for status betterment in hetero relationships in particular.

Wrapping up

Quieter types, or types who don’t necessarily buy into Pargin’s faultily constructed ideal of the Alec Baldwin boss, may struggle in a “world” dominated by the organizational equivalents of Pargin’s in-your-face writing style. And that’s too bad. Reality is too big for only one personality type to dominate or be given undue favor – when such types do gain overwhelming favoritism (i.e., when they become identified as having “the right” traits), it can damage not only individual lives (ironic, given the article’s title) but also the health of organizations, which can then struggle to adequately serve the differences and diversity that course through their members.

But groups aside, the article is just one long sour outlook on humanity. It barely rises above “people are greedy, deal with it,” which would be disappointing enough from a low-level manager, let alone a “writer.”