I’m writing from the WordPress for Android app, in an apple orchard in Kentucky twilight. My cellular connection is EDGE and there isn’t any light except for this screen and a few stars.
I just blew a bug off my screen. There are jarflies squawking and dogs howling. The flowers from earlier, once striped, are dark for the night.
13 years ago, I started blogging by writing a poem for my school website, on a Windows 98 PC a few yards from here. The site has moved and the poem survives only on a framed scrap of newspaper, in a story announcing its receipt of a $75 prize.
What did that me, 13, think about when making that verse? A book, sure – maybe Moby-Dick, which he had finished a few months before. But this environment, too. An apple just fell to the ground nearby; gravity is a cliché, but inspiration still lives here.
“Information” is a pretentious word. So are its kin, “data” and “data points.” If bad writing is about things that are not concrete, then info-data is its muse.
It’s a fancy word for “stuff,” in the end. Imagine the following slogans recast to show how trite info-data is:
-“the stuff age”
-“the stuff superhighway”
-“big stuff”
Some uses – “mobile data” – are more concrete. But it’s telling that a basic synonym exposes the shallowness of info-data. It’s about air, about ideas festooned with flowery words like “solutions” and “digital” that are themselves blank, yet somehow add more character (“solution” is at least evaluative; info-data is nothingness, less material even than “stuff” and its vivid synonyms).
Why resort to info-data? Because computers and the industries around them lack a clear reason for existing.
The Internet is an outgrowth of the telegraph that has done as much bad (spying, fake social media personae, argument for no reason, stress over minor things like email) as good (new tools for writing, reading, and chatting).
Computers themselves are often justified as “productivity” tools, but “productivity” is a ritual, not a result. New jobs and issues are created to feed the hunger for “productivity,” but it can’t be sated.
Like the Internet, financial services, and 100-hour workweeks, computers keep recreating the need for productivity, rather than satisfying its requirements. We’re solving a problem that isn’t there – maybe that’s why “solutions” is meaningless and a crutch.
Info-data is even more generic and, well, insincere. Something like info-data has always existed for humans, but it has enjoyed a moment now that it is associated with smartphones and PCs. Are “analog” media like books repositories of info-data? Why didn’t the invention of the codex form kick off The Information Age?
Whereas books have clear boundaries and purposes – a novel for leisure reading; a textbook for education – info-data media do not. The Web has no purpose, and computers, when not generating info-data, are little more than extensions of analog tools for gaming and writing.
The info-data lingo makes computers and the Internet seem profound, like clear breaks with what came before. But this language is vague, and it reveals something so ordinary that terms for the most ancient, mundane things – information, data – have to be pressed into service because there’s nothing else there.
I like the word “blog.” It’s earthy – it sounds like “bog” and “log,” so it has the air of the swamp and the forest in it. But I hate the word “solutions” – I think of cleaning and stinky liquids, which isn’t the intent of the writer. Plus, calling a thing a “solution” is pre-judging it – what if it doesn’t work? What is the, um, problem it is solving (or is “solution” now so far from “solve” that asking this question is dumb?)?
Blogging (it stirs up pictures of a lumberjack, writing) is what I turn to when I am not solutionizing for websites. The list of clichés and meaningless strings of words I write for clients is scary. Yes, there’s “core competencies” and “best-of-breed” (are we talking about horses?). But there are even hipster ones like “pure play” (not as fun as it lets on).
Humans adapt quickly, so even this sticky vocabulary is easy to use after mere days of all-day practice. What comes from such handiwork? Well, “writing solutions” – I wasn’t joking when I told an editor that his title should be “director of editorial solutions,” since that’s how he’s seen. I don’t mind making these “solutions” – it’s easy. The phrases are readymade and thinking isn’t always needed, at least not in the same way that it is for creative writing or a letter to the editor.
My own blogging, though, isn’t a “solution,” and I doubt it is for a lot of other bloggers. The downfall of blogging has been foreseen – it won’t survive social media, no one reads long-form anymore, etc. But unlike Facebook et al., a blog is not a facade. It is not kept up to send a political or social message to everyone else – “look how great I am!” “here I am in Rio! Look upon my successes and weep!” – and lots of blogging, especially from the pre-Facebook era, is sad and weirdly (to our eyes now) uncurated. There’s lots of chaff with that wheat – where did it go?
Blogging has its place as the antidote to social networks, “writing solutions” (business papers, Satya Nadella’s wordy emails, LinkedIn profile spam), and general jargon. Why is it so easy to write clear prose and be direct in a blog, while grinding through a paper or even tweeting something suitably amazing is labor-intensive? Maybe we’re more sincere on our blogs, and that’s why we keep updating them, to give us at least one outlet for our real thoughts, for problems and not necessarily…“solutions.”
I’m not sure if I’m a “journalist.” I interview people, do research, and supply regularly updated sites with news stories. But I do it all from an apartment, with only the occasional trip to a minimalist office. Do these details make the “journalist” label unfit?
“Journalism” is a word often in company with “disruption,” one of the most overused and annoying terms to enter the vernacular. We “journalists” are framed as under siege from the Internet, unable to adjust to free Web browsers eating into newspaper revenue.
We’re in such bad shape, apparently, that we pretend disruption doesn’t exist, according to Ben Thompson, in his misinterpretation of Jill Lepore’s virtuosic New Yorker essay. Lepore was not discounting the idea of change, but providing a history of how mankind has explained it, finding issues with Clayton Christensen’s methodology as well as the historically limited range and shelf life of “disruption,” a concept that could only have arisen from the era of 9/11 and cheap Asian manufacturing.
“When all you have is a hammer, everything etc.” – although since Silicon Valley is too digital for something as analog and working-class as a hammer, let’s say that when all you have is no humanistic background and a strong affinity for the violent terminology of “disruption,” everything looks like it is in danger. If I am a “journalist,” am I in trouble?
Again, I don’t know, especially since the label may not even be apt. Maybe I am a “journalist” who has simply evolved (before “disruption,” evolution was nearly as ubiquitous a term for explaining everything, as Lepore pointed out) to use new tools. Why not take this optimistic, even progressive (another preeminent etiology of yesteryear) view, rather than the insecure, cynical stance of “disruption”?
Journalists don’t necessarily serve the interests of the VCs and technical folk who have made “disruption” de rigueur (well, unless they work for TechCrunch). It shouldn’t be surprising that these writers, especially ones like Lepore who don’t toe the line, are construed as not getting “disruption” or, worse, being disrupted. Any piece of confirmation bias – declining ad revenue, the agony of paywalls – then suffices for proving “disruption.”
At the same time, how often do you hear of these professions being disrupted:
-VC – investing is often guesswork, so why not automate it and hook it into something like IBM Watson?
-Programmer – a software engineer is more like a car mechanic than a doctor. Why not move toward less tinkering and customization (as has happened with cars like the Tesla Model S) and make such human involvement obsolete?
-CEO – how many CEOs have been outsourced to China or automated because of “disruptive” forces? I’m not talking about firing one person to hire another, but about eliminating an incredibly wasteful position that is compensated at a crazy ratio to the rest of the organization.
“Disruption,” it seems, is weirdly selective, with class-related and political biases (surprise). As a “journalist” or something close to it, rather than a VP of engineering, I’m not surprised that I’m an actor in other people’s dramas about “disruption” and its impact on everything.
Writers should worry less about nonwriters’ opinions and naysaying – “evolve” and “progress,” don’t “disrupt” or “be disrupted” (whatever that means). I wrote this whole entry on the WordPress app for Android, but I see it as just another tool rather than the ancestor of a robot waiting to take my job.
Yesterday’s theme was Lollapalooza and Vampire Weekend. Today, the pitchforks are out for indie music, at least in Chicago’s Red Eye on the eve of the 2014 Pitchfork Music Festival.
Music criticism has a low bar to entry and is heavily reliant on adjectives (how often have you heard “warm,” “harsh,” “loud,” “ambient” or words in their respective families?), which means that most of it is opinion and little else. Historical knowledge – of the band and its influences, for example – can only go so far in pushing the critique beyond “I like it/don’t like it.”
The Red Eye writers, given limited space, gave their duly opinionated but charming takes on their perceived can’t-miss and must-miss acts at Pitchfork. It was nice to see slapdowns of The Field, Animal Collective, and especially Neutral Milk Hotel (“snore fest of a band”), artists all entered into the indie pantheon by Pitchfork writers of yesteryear.
After reading, I got back to my own mass production of keywords, today cutting and pasting from someone’s copy on hybrid cloud and polishing it as I saw fit. Which got me thinking:
- When I smooth over someone else’s words, one of the biggest tasks is always removing all the first- and second-person pronouns. No, “I” don’t want “you” to talk to the audience about the need for orchestration and cloud management platforms. Yet a lot of professional music criticism is written in a similar style (“If you were around when Neutral Milk Hotel were a working band…”, went Pitchfork’s review of the band’s box set).
- However, even an opinionated and ungrounded argument – overuse of “I” and “you” is a frequent, but not absolute, sign of such – can be useful. It can bypass the grunt work of getting the audience on your wavelength – so much writing is just about using the right buzzwords and not, say, thinking that “orchestrating” is an acceptable synonym for “coordinating” when writing about IT. “I” and “you” are two such words for music criticism and, apparently, for some in-depth white papers trying to get CEOs to buy into specific tech.
- Cut-and-paste has been a more common technique than I expected, since many of the clients I work with basically request it, or imply it by asking for a rewrite. It’s not easy to get the lifted text to align with words I created from whole cloth (strained metaphor, yep), but the process is so instructive – it parallels the challenge of writing in general, especially long form writing, of making sensible transitions and creating discernible story arcs.
I’m not going to Pitchfork this year. But I think that
“It’s the kind of sentiment that teenagers who feel assaulted by their surroundings will continue to discover, and its wide-eyed and wounded view of the world goes a long way toward explaining why they keep returning to this songwriter. Despite its vague and decidedly lo-fi profile… [it] also has its share of experimentation.”
could be a useful paragraph to cut, paste, and rewrite for my future music criticism: sufficiently generic, but a fleshy body with the bones of promise, all the same.