Monthly Archives: January, 2015

“Space Quest 6” and the Internet as space vs. medium

One of the distinctive things about the Internet is that no one used to call it “the Internet.” Throughout much of the 1990s, the act of accessing an IP network (likely over dial-up) was referred to as “going online,” entering “cyberspace,” or encountering “the Net,” “Web,” or just “AOL.” Then there was my favorite: the Information Super Highway.

Networks and history
Computer-driven networks had been evolving for decades by the time that Geocities et al made them directly accessible to consumers. There wasn’t a monolithic, unified network in development that whole time, though; “the Internet,” in all of its broad meaning, was a latecomer to the networking, software and hardware party that had been going on since the first electronic computers appeared in the 1940s.

Speaking of the 40s: I saw Ethernet inventor Bob Metcalfe speak at a conference in D.C. last year, and he half-joked that the Internet began in 1946 with ENIAC, the first general-purpose electronic computer, making it exactly as old as he was. This quip is instructive: on the one hand, it demonstrates how the numerous technological inventions that make today’s Internet possible go back many years; on the other, it shows how all of these developments have been retroactively wrapped up in the homogeneous terminology of the “Internet” (so we have “the history of the Internet,” or “pre-Internet,” instead of “the history of networking technologies and capitalistic decision making” or “pre-chip”).

Card-processing networks and travel reservation networks, for instance, were among the disparate networks that emerged throughout the 1970s, as Evgeny Morozov noted in a recent interview. The discursive convergence on the term “Internet” didn’t happen until much later, and was never inevitable. Infrastructure control had to be handed off to the private sector and specific technologies and protocol stacks (like Ethernet and TCP/IP) had to win out over others.

Medium vs space
These days, the Internet is seen primarily as a medium. One might “use” the Internet in the same way she might use a phone line, magazine, or TV. It serves as a means of getting information; i.e., it has literally become “the media,” in a happy coincidence of terminology. But one doesn’t really occupy it; in the popular imagination, there is no longer a spatial quality to it, and talking about “cyberspace” feels anachronistic.

This wasn’t always the case. There was “cyberspace,” sure, but there were also “chat rooms” (another spatial reference) as well as weird artifacts like Apple eWorld that tried to represent connectivity as a traditional community – with buildings corresponding to different tasks – rather than one giant medium (“the Internet”). Even early browsers like Mosaic and Netscape Navigator had names that were spatial, representing a physical collection of objects and a guide through a cyber-landscape, respectively.

None of these modes of connectivity were strange in the 1990s. My favorite example in this mold was the Sierra On-Line adventure game “Space Quest 6: The Spinal Frontier” (hereafter “SQ6”), released, crucially, in 1995, which was the year that Netscape really began to pick up steam and Windows 95 was released. It was also, if I recall correctly, the first year I actually went online.

Space and “Space Quest 6”
SQ6, like its predecessors and most of Sierra On-Line’s games, was a point-and-click adventure, a genre that involves investigating a world, clicking on things, accumulating inventory, talking to people, and solving puzzles. Generally, the gameplay is slow-paced and intellectual. I grew up with these games in the early 90s, installing them from floppy disks, being stumped for hours on puzzles, and then having to order a hint book since GameFAQs didn’t exist yet.

Many of them also had manuals that contained crucial, proprietary hints to puzzles, as a means of copy protection. “Space Quest V: The Next Mutation,” for instance, had a tabloid that included important tips. With SQ6, there was a pack-in magazine called “Popular Janitronics” that you absolutely had to have to complete one of the game’s hardest tasks (creating a homing beacon).

Unlike, say, “King’s Quest VI: Heir Today, Gone Tomorrow” (also from Sierra), SQ6 wasn’t on the technological cutting edge, although I thought it was at the time since it was the first “Space Quest” game to be built for CD-ROM distribution (to get a sense of how big a deal this was: KQ6 was initially available on 12 floppies and, after a year, on 1 CD). Its graphics were OK and its gameplay standard.

The game’s hero, Roger Wilco, goes to several exotic planets on his quest to save someone named Stellar Santiago. The most memorable sequence for me, though, is when he goes into cyberspace, which looks like this:



Software, networks, and hardware
These screenshots show how a lot of people in 1995 conceived of “the Internet” (which didn’t really have that label at this time; that noun with the definite article is found nowhere in the game’s dialogue or literature): vast spaces, dotted with highways that carried information and ran past virtual buildings that held online accounts and files. The file cabinet screen grab above is accessed through a menu that looks like a dead ringer for Windows 3.1, which is itself housed inside a trailer. That’s about as cyberspace-in-early-1995 as you can get, and not far off from eWorld, albeit in a Windows-centric universe (I played the game on a Windows 95 machine).

While all of this may seem outdated now, it really isn’t. For starters, Roger goes online not by using a phone or even a PC, but by donning a VR headset that doesn’t look much different from Facebook’s Oculus Rift. Everything old is new again; recycling sci-fi ideas is both a fascinating aspect and a potential weakness of the tech industry, which leans strangely on the entertainment industry for ideas whenever its own poverty of imagination shows through. Plus, the idea of the Internet as a space was never “wrong”; it just lost out to the homogenization that eventually grouped disparate histories in hardware, software and networking infrastructure into one story, as Morozov pointed out:

“For most of the nineties, you still had a multiplicity of different visions, interpretations, anxieties and longings for this new world, and a lot of competing terms for it – virtual reality [note: weird how this one has survived and actually flourished in the discourse of wearable technology], hypertext, World Wide Web, Internet. At some point, the Internet as a medium overtook all of them and became the organizing metacategory, while the others dropped away. What would have changed if we had continued thinking about it as a space rather than as a medium? Questions like these are important. The Net isn’t a timeless, unproblematic category. I want to understand how it became an object of analysis that incorporates all these parallel histories: in hardware, software, state-supported infrastructures, privatization of infrastructures, and strips them of their political, economic and historical contexts to generate a typical origin story: there was an invention – Vint Cerf and DARPA – and it became this fascinating new force with a life of its own. Essentially, that’s our Internet discourse at present.”

He’s asking good questions, and I can’t wait for him to write more books on “the Internet” and its history. Since we’re talking about video games here, though, I might note, on the subject of “origin stories,” that this tendency toward a specific, linear history of “the Internet” – one that scrubs out various continuity errors or false starts – is a lot like something from a comic book or fictional universe, which makes sense. The tech industry at present has considerable overlap with geek culture, which has led it to elevate the Maker movement and the sort of artifact-obsessed outlook that loves clean origin stories rather than messy human dramas.

Wilco and conclusions
Roger Wilco never starred in another official SQ after 1995. Like the rest of Sierra’s adventure gaming franchises, which had thrived as PCs became mainstream in the 1980s and 1990s, the series struggled to keep pace with new types of games that sported better graphics, more violence, and online gameplay. The solo, introverted experience of the point-and-click game was no match for attention spans with access to Unreal Tournament and, eventually, Facebook games.

With that transition in mind, it makes sense that SQ6 would see “the Internet” as a bunch of filing cabinets, or an “offline” version of Windows 3.1, for someone to dig through. The notion of the Internet as an actual medium for other people’s information, rather than a quiet library for each individual, implies a broad social connection that computers did not deliver in the mid-1990s and earlier.

It’s too bad, in a way. If the Internet were conceived of as a space today, think of the impact such a mindset might have on data collection and privacy – Wilco would have been overwhelmed had he stumbled across the “F” filing cabinet in that building, stuffed as it would be now with Facebook data. Or the “N” (NSA) or “U” (Uber) cabinets. Maybe it’s time to bring “cyberspace” back, if only as a semantic nod to there being real consequences for data collection and online screeds.

Experts, non-experts and enthusiasts

I had planned a longer entry today – about adventure games and the terminology of the Internet – but I’ve shelved it for this weekend since it still needs some tweaking. I’m also planning to write at least a few entries about D.W. Griffith’s “Intolerance,” which I recently watched on Netflix.

So far this year, keeping this blog more up to date, with a mix of long- and short-form pieces, has been revitalizing. Before this January, I had come to regard all writing as work rather than leisure, which led to long gaps in my output here (there are still some big gaps on my Tumblr, which I plan to fill with some of the creative projects I’ve tested here). Sometimes it was because of the feedback I received on my professional work, which can run the gamut from all-out praise to criticism that seems out of place given the stakes and circumstances.

Experts and non-experts
Writing, like education, is a field that everyone feels entitled to comment on since everyone deems herself an expert, if only subconsciously. Let me explain.

To look at the last 20+ years of education history in the U.S. – which I had a front-row seat to, as the child of two educators – is to see a long string of opportunists from fields such as management consulting, software, and venture capital, not to mention state, local, and federal politicians with no history in education, prescribing what’s “right” for students and teachers. I can’t think of any field as large in which so much power is held by non-experts.

Not just non-experts, either, but people actively hostile to the profession, who want to bury teachers under obligations around standardized testing (useless), “metrics-based” reviews (based on the aforementioned useless standardized testing), treating students as “customers” (another sign of the creeping financialization/corporatization of everyday life), and a sexist emphasis on being “makers” rather than, well, educators. I suspect so much of this bullshit is due to education being a relatively weakly credentialed field.

While certain positions are off-limits to individuals without the appropriate degrees, it’s still relatively easy – if anything in the U.S. job market can be described as such – for fresh college graduates, regardless of background, to get a foothold in education via Teach for America or a similar program. Moreover, some of the most influential figures in education in recent memory – such as Michelle Rhee, who ran D.C.’s school district for years – have been objectively bad educators.

Politicians feel entitled to comment on education at a high level since there’s this notion that educating is easy, anyone can be an educator, and performance doesn’t even matter. Education has become a launching pad for all sorts of nonsensical political speech, from “we’re losing the race against [Japan/China/insert country here]” to “we have a skills gap [nope].”

Now, imagine all of this happening with general practice medicine, law, or dentistry. It’s unthinkable, since these fields have hard credentials – not because they’re superior to other fields but as a result of the immense power of the upper middle class to resist the type of Uber-style disruption that has put cab drivers, musicians, and educators (who face massive open online courses, among other threats) up against the wall in recent years. Educators don’t have the cachet or prestige to hold the non-experts at bay.

Writing is in even worse shape. Virtually everyone has to write in some capacity, which isn’t the case with educating (or performing a dental operation). So everyone fancies herself a writer, even if she doesn’t identify as such. Writing has become so lowly regarded as a profession that it is treated as an incidental trait – like wearing a blue coat or having size 11 feet – that one would never make synonymous with identity, making it sort of an anti-Maker label.

Accordingly, the criticism that can be directed at writers or people like historians whose work is writing-intensive – the backlash against Jill Lepore’s destruction of disruption last year is instructive here – is often intense. It’s sort of like “this is obvious, we’re all writers anyway, what are you not getting?”

An argument or phrasing that one doesn’t like is sometimes met with the intensity that one might expect for driving the wrong way in traffic or walking through Tiffany’s without pants on. The cost of failure is deemed so high, perhaps because everyone shares, at least in small part, in the underlying skill set (being able to write), just as they do with certain social norms. I feel that the insanity of Internet comments is partially rooted in the mentality of everyone being basically a part-time writer (comments have to be written, not dictated, after all).

For someone who writes all the time, though, the type of criticism that might normally be confined to a message board or paper critique can be both uninteresting and strangely bothersome. On the one hand, you get used to it, and there’s a quiet confidence from knowing that your critics are often less qualified than you are. But there’s also the feeling that the basicness of writing – the idea that anyone can and should do it, and so you can be free and natural while doing it – gets lost in the torrent of feedback that channels writing toward some political end.

The long term effect from the latter can be waning enthusiasm on the part of the writer, which must be combated with, well, side projects, like blogging. Part of the appeal of blogging, for me at least, is the absence of feedback. I long ago disabled comments and I just write whatever I want to. The dailyness of blogging requires a certain enthusiasm and ability to shirk self-consciousness, but it also reinforces these traits over time, strengthening the same enthusiasm that it requires. Maybe it ends at some point. I haven’t reached it yet and I’ll always have John Gruber’s observation below in mind as I try to keep my streak up:

“Blogging isn’t hard work in the way that coal mining is, but above all else it demands enthusiasm. There’s no other way to keep going – blogs cease when their authors run out of enthusiasm. For many people, the enthusiasm seems to run out after just a few months, maybe a few years.”

Writing fatigue

The most successful stretch I ever had with this blog was in the months right after I left my first-ever full-time job, at a software startup. I had long days with nothing to do and, crucially, no other writing obligations. Many of my most-viewed entries (which I don’t think correlate with my ‘best’ posts) were from this period.

Since then, I’ve taken on a day job at which I write thousands of words per day, which I think initially sapped some of my energy for writing throughout the second half of 2013 and most of 2014. Part of the rebound in posting frequency and length this year has been willful – writing every day even if I don’t have something pressing in mind, and not being as exacting about achieving a certain aesthetic – and part of it has been a more relaxed attitude toward working.

Working “hard,” I feel, has never been the right approach for me. The sheer effort of making so many decisions and switching between tasks – what source should I use? what phrasing will make it seem ‘ok’ in my head? – leaves me exhausted by mid-afternoon, so I have tried a different tack.

I usually wake up a bit later and then let myself gradually come to my senses – drinking water and eating breakfast along the way – before I start writing. At first, I let myself write pretty freely since I know that I won’t get it right on the first try and will require a few pass-throughs to get the piece how I want it. Then I take a break between each piece instead of just trying to power through a bunch of them in a row – sometimes simply standing up and walking around can ‘reset’ my mindset and make it much easier to resume working when I sit down again.

It’s part of that cliche about working “smart” instead of working hard, I suppose. Today I didn’t follow my advice so well, though, and so I don’t have much left for today’s entry. Tomorrow, though, I’m planning a longer piece that will talk about 1990s adventure games and their influence on the term “Internet,” which I’m excited about.

Strong versus weak dollar (also, Apple)

A fistful of dollars/for a few barrels more
Everyone wants a strong dollar, right? In the U.S., politicians will pay lip service to the notion of a strong dollar – i.e., in their minds, a dollar that trades more evenly against the other major world currencies (sterling, euro, yen) – because A) it sounds good and B) it feels good for American travelers to Europe and Japan who realize that their greenbacks go pretty far.

When I visited Italy in 2008, I remember that the USD–EUR exchange rate was unfavorable to me (I’m American), and accordingly I felt the pinch of 50 EUR cab rides and 14 EUR gelato cones in Florence. At the same time, I remember fuel being expensive that entire year, with oil peaking near $150 per barrel that summer, around the time Russia invaded Georgia.
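To put that pinch in dollar terms, here is a quick sketch; the exchange rate below is my own rough recollection of mid-2008 (around $1.55 per euro), not a figure from this post:

```python
# Rough dollar cost of those Florence prices at a mid-2008 exchange rate.
usd_per_eur_2008 = 1.55   # assumed mid-2008 rate (my estimate, not exact)
cab_ride_eur = 50.0
gelato_eur = 14.0

print(f"Cab ride: ${cab_ride_eur * usd_per_eur_2008:.2f}")  # $77.50
print(f"Gelato:   ${gelato_eur * usd_per_eur_2008:.2f}")    # $21.70
```

At anything close to that rate, a cab ride ran well north of $75 in greenbacks, which is the whole complaint in miniature.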

That all feels like a thousand years ago now. The dollar has strengthened mightily against the euro, and oil trades for less than one-third of what it did the summer before Barack Obama was first elected. King Abdullah bin Abdulaziz al Saud of Saudi Arabia, whom George W. Bush begged that same year to increase oil production as energy costs skyrocketed in the run-up to the Lehman collapse, is dead. Bush himself is reduced to speaking at events in the Cayman Islands, an unmentionable even among his own party. Russia, though still ruled with vim by Vladimir Putin, is in economic and diplomatic free-fall.

But the strong dollar isn’t everyone’s friend. For starters, it is a burden on corporations that sell goods around the world. Tech analyst Ben Thompson recently framed the problem in stunning terms, in ridiculing the widespread perception that Apple is always on the verge of catastrophe:

“It’s difficult to overstate just how absurd this is, but here’s my best attempt: last quarter Apple’s revenue was downright decimated by the strengthening U.S. dollar; currency fluctuations reduced Apple’s revenue by 5% — a cool $3.73 billion dollars. That, though, is more than Google made in profit last quarter ($2.83 billion). Apple lost more money to currency fluctuations than Google makes in a quarter. And yet it’s Google that is feared, and Apple that is feared for.”

I have been trying to wrap my head around this all day. All those seemingly minor variations in currency trading, piled up over an entire quarter at the scale of Apple’s business, ended up taking a cut out of Apple larger than Google’s entire quarterly profit – and Apple still managed the best quarter of all time, with $18 billion in profit.
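As a sanity check on Thompson’s two figures, here is a back-of-the-envelope sketch; the dollar amounts are the ones quoted above, and the variable names are mine:

```python
# Back-of-the-envelope check on the currency figures quoted above.
fx_hit = 3.73          # revenue Apple lost to currency swings, in $ billions
fx_share = 0.05        # that hit was stated as 5% of revenue
google_profit = 2.83   # Google's quarterly profit, in $ billions

implied_revenue = fx_hit / fx_share  # revenue implied by the two figures
print(f"Implied quarterly revenue: ${implied_revenue:.1f}B")  # ≈ $74.6B
print(fx_hit > google_profit)  # True: the FX hit alone tops Google's profit
```

The two quoted numbers imply roughly $74.6 billion in quarterly revenue, which squares with the record $18 billion profit quarter mentioned above.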

Apple’s turnaround over the past 18 years is probably the greatest business story of all time. If you look at a chart of the biggest quarterly results in history, it’s dominated by oil companies (Gazprom, Royal Dutch Shell, etc.) and Apple, and no one else. It’s a neat coincidence that Apple keeps outdoing itself at a time when oil – seemingly its only competitor in terms of product profitability – is taking a nosedive.

For someone who grew up in the 1990s, this is all surreal. For a kid in rural America at the peak of the Windows era (I had just turned 9 when Windows 95 was released), when every class at school was built largely around writing things in MS Works/Word and saving them to a floppy, Apple was nowhere to be seen. I remember reading about Macs when playing some Sierra On-Line games that were built for both PC and Mac, but I never even used one until 1999, in a school in Gallipolis, Ohio. Apple was on the margins.

Not anymore. To quote almost any stat about Apple is to send the mind fruitlessly in search of anything else like it. The company’s iPhone business alone – just the iPhone, without even taking the iPad, Mac, iPod and iTunes into account – brought in more revenue than Google and Microsoft combined in the most recent quarter. Each quarter, it makes more profit than Amazon has ever made. It has enough cash to buy IBM outright at IBM’s current market cap – and still have tens of billions left over.

Paradoxically, the vast complexities of Apple’s supply chains, as well as the efficiency of its manufacturing and marketing processes, have ensured that simplicity wins out. The iPhone and its brethren feel natural and easy to use (despite mounting software issues, which is a topic for another conversation), reinforcing what I have always thought: that one significant part of the success of iOS in particular is that it eliminates the paradox of choice that is so paralyzing with Android or almost any other computing platform. It’s a good design, as John Gruber recently noted:

“Who knows how long Apple’s ride at the top will last, but this is a moment worth savoring. A toast to the value of good design.”

“We Are All We Need” by Above & Beyond

In July 2005, I went to Boston to visit a friend from college. That summer, like the one of 2011, is one that I don’t remember fondly. If ’11 was the peak of my post-grad school malaise and quarter-life crisis, then ’05, coming after my first year of college, was an awakening to a world beyond high school. I had mostly breezed through my freshman year, but by the end I felt like I was breaking down after taking medications for depression and being disappointed with some of the spring semester classes.

My ’05 Boston trip came while I was in New England, I think with my family as they were moving my sister into a summer program at RISD, where she would eventually attend college from 2007 to 2011. Boston was an important city to me throughout college, even though I went to school about an hour away in Rhode Island. I went to several Yu-Gi-Oh tournaments there, saw the Celtics victory parade in 2008, and discovered trance music there, during that first trip in July 2005.

Boston that summer felt like an optimistic place. The Red Sox had won their first World Series in forever the previous season, and the Patriots had won the Super Bowl the following winter. The day I went up there was sunny, in the 80s. My friend and I ate at a Chinese restaurant and walked through Boston Common. We stopped by his apartment to get something to drink and watch some TV. From his room I heard music playing – it was “Air for Life” by Above & Beyond, one of the band’s first and most memorable singles.

From that point on, my music tastes started to shift from rock music to trance and EDM (electronic dance music). By 2007 I was delving into Ultra compilations and listening to Tiesto’s albums. I think my peak was in 2008 when I used to listen to a triple-disc Godskitchen compilation in my John St. apartment in Providence while playing Castlevania III on an NES emulator.

The initial discovery of Above & Beyond was the catalyst, though. My interest in “Air for Life” and, a year later, “Good For Me,” opened the doors to many new sounds for me. The band felt like something bigger than trance or EDM. I remember listening to the King Roc Vocal Dub of “Good for Me” one morning while studying Latin at like 7am and it was a nearly religious experience.

Their first album, “Tri-State,” was the first trance/EDM album I ever listened to, which is perhaps strange since it is not exactly representative of the genre. It has 4/4 beats, sure, but it also has piano-laden instrumentals, beatless songs, and alt-rock trappings like guitars and angsty vocals here and there. Their sophomore effort, “Group Therapy,” came out during my low period in 2011 and I never really grew to love it (or maybe I have just resisted it since I associate it with bad moments) despite memorable songs like “Sun & Moon.” Then their “Acoustic” album from last year showed the depth of their songwriting and their capabilities with traditional instruments.

We Are All We Need
Their newest effort, “We Are All We Need,” has been seeping out track by track in their weekly podcast for months now, so there wasn’t that sense of an entirely unknown world opening up that I felt when I listened to “Tri-State” for the first time. Still, it feels nearly ironic that an EDM band has made such a coherent and listenable album in 2015, in a genre not traditionally known for its artist albums and at a time when streaming services threaten to commoditize long-form listening.


The title track and “Sticky Fingers” have been concert and podcast favorites for some time now, and their hooks aren’t easily forgotten. While there are plenty of tuneful, melodic trance and EDM songs out there, I often think of vocal hooks as the province of rock or pop music. With these two tunes, as well as “Blue Sky Action,” though, I think of how the experience of EDM can sometimes yield the most memorable vocal hooks, stuff like the verses from “Breathing (Airwave)” by Rank 1 or “Satellite” by OceanLab.

There’s a balance between a unity of feel – that distinctive Above & Beyond airiness – and variety, with many guest vocalists (as is typical on many modern EDM albums, granted). For me, the album plays almost like a best-of from their podcast, which they have done each week for 2 hours for the last 10+ years.

When I was in Boston in 2005, the podcast, then called “Trance Around the World,” was just getting started, and by the time “We Are All We Need” was released, the group had surpassed 550 total episodes – including #ABGT 100 in New York, which I summed up here – between TATW and its rebranded successor, “Group Therapy.” There’s a long continuity to everything Above & Beyond does – they’ve been so consistent and also so different from their peers – and their best work, which certainly includes much of this album, always brings me back to that one day in 2005 when I felt good during an overall bad summer.