Silent Film and “the Internet”

In late 2013, we watched our first full-length silent film, “The Thief of Bagdad” from 1924, starring Douglas Fairbanks. Fast-moving, with an endlessly engaging score (a loop of “Scheherazade” by Rimsky-Korsakov), it’s a good “break-in” film for anyone unfamiliar with the silent era. Fairbanks excelled at swashbuckling roles, and “The Thief of Bagdad” is one of the swashbucklingest movies ever made. He dances around with his scimitar and dives into the sea to fight off monsters, too.

Since that time, we have explored a few other silent era films, including the corpus of Kentucky director and Hollywood godfather D.W. Griffith. I recently finished his “Intolerance,” from 1916, the follow-up to 1915’s blockbuster “The Birth of a Nation.” The latter rewrote the rules for feature-length films with a continuous narrative structure documenting the before, during and after of the American Civil War. “Intolerance,” though less famous, may be Griffith’s best work.

A screen grab from “Intolerance.”

Split
I have always liked the idea of split stories and parallel action; “Intolerance” provides nothing but that for its epic 3+ hour duration. There are four stories, each documenting a moment in history when intolerance of other belief systems or moral codes was the preamble to violence: an ancient Babylonian story about the attack on the city by Cyrus the Great, a Judean story about Jesus, a French story about the St. Bartholomew’s Day Massacre, and a modern American story about a mill strike and a group of, well, intolerable moralists.

The variety of “Intolerance” makes its epic running time go by swiftly. Griffith employs many different color prints, a melange of musical samples, and some strange interstitial techniques like a woman rocking a baby in a cradle (representing the passage of time between the film’s chosen eras) and a background shot that includes what looks like the script/screenplay for “Intolerance” itself – how meta. Textual snippets are also given period-specific cards, such as a tablet for the Babylonian story.

Relevance
“Intolerance” is 99 years old this year, but perhaps because of its cosmopolitan subject matter it seems less dated than “The Birth of a Nation,” which represented and embraced the retrograde racial attitudes of its period. Another thing that makes “Intolerance” seem so modern is its ambition. The budget ran well into the millions of USD – in 1916! The sets, such as the Babylonian city that Cyrus besieges, are sprawling and look great almost a century on – behind the color-tinted shots and film crackles, they now seem as old as the times they tried to depict.

Some of the film’s imagery and topics, especially in the Babylonian scene, remain relevant for 21st century viewers. The issue of whose god is mightier – Bel-Marduk or Ishtar – and the shots of people falling to their deaths while large (siege) towers topple have uncomfortable symmetry with 9/11, for instance.

Part of what is so striking to me now, though, about “Intolerance” and silent films in general, is how “Internet”-like the entire experience is. There’s the variable pacing of moving from one card to the next and reading the text, just like one would do with a webpage (with the important and obvious difference of not being in control of the direction – although one could say that people addicted to Facebook or forum arguments are hardly free from inertia in this regard…). There is the card-by-card, shot-by-shot attention to design and layout (“Intolerance” even has footnotes for some of its textual snippets!) as well.

History
Earlier this year, I wrote about how “the Internet” is a term applied retroactively to a bunch of actually separate histories – networking, software, hardware, etc. – with the added current connotation as a medium through which its users receive information. It used to be called by different names – “cyberspace” is perhaps the best example of this class of outmoded labels, as it conceives of connectivity as a space rather than a medium – and, really, if one wants to get technical, the vague principles of “the Internet” go all the way back to the telegraph, which was a much bigger break with what came before it than, say, TCP/IP was with its predecessors.

Before watching “Intolerance,” I hadn’t thought of silent film as a part of “Internet history.” But the design tropes of silent film are if anything becoming more, not less, prevalent in media. Pushing cards or snippets of content – say, Snapchat Discover, Twitter’s “While You Were Away” feature, or the stream of matches on an app like Tinder – is an essential mechanism for many of today’s mobile services in particular. Integration of video with services like Meerkat (which lets users stream live video to their Twitter followers) only makes the lineage from silent film to “the Internet” more apparent.

In a way, “the Internet” hasn’t even caught up to the immersive experience of silent films, which often not only pushed discrete cards and pieces of narration at viewers (ironically, to support a continuous narrative) but also featured live orchestras in grand settings. Videoconferencing (FaceTime, Skype) and the likes of Snapchat and Meerkat strive for that same immediacy that Griffith et al. captured in the 1910s.

One more intersection: For someone used to talking movies, watching a silent film can feel really lonely, because no one is talking. For me, this exact sort of silence and proneness to becoming lost in thought – for better or worse – is endemic to using “the Internet.” It’s strange, really, that in an extroverted society like the U.S., in which silence is barely tolerated in meetings and the like, so much mental energy is channeled into the inaudible actions of responding to emails or skimming BuzzFeed. I would much rather wordlessly watch “Intolerance” again.

Apps and social media fatigue

If I were to graph the number of apps installed on any device I own since I got my first Android phone in the summer of 2011 (an HTC Inspire), it would be left-skewed. A combination of concerns about battery life and storage space, the realization that some websites offer better experiences than their respective apps (especially Facebook), and an overall desire to just have fewer sources of information has led me to delete nearly everything but preloaded apps.

What’s left? Not much.

What’s more, I haven’t actively searched for a new app in a while. I’m not sure if this says more about me being burned out on data and notifications (they feel so distracting, and I know I have written/read less because of them) or about the maturity of the app market.

I remember when “apps” became part of the lexicon some time in the summer of 2008. I had just moved to Chicago and I still had a Motorola RAZR that might have been cutting-edge in 2005, during my first year of college. When I got online for the first time in my first Chicago apartment – via a Dell desktop PC – the App Store was only 2 months old and Google Chrome was less than a week old. On my PC, I didn’t really think of “apps” except for Web browsers and games, and even then I thought of them as “programs.”

In 2009, I had my first brushes with apps like Shazam and Grindr that offered something a lot different from what had been available from a PC or Mac. In 2010, I learned about Instagram and was for the first time jealous of people who had iPhones (I still had a dumb phone of some sort at that time). In 2012, I found out about Uber and was briefly enamored with it before it revealed itself as an ethically challenged organization.

But since then, there haven’t been many “a-ha” moments for me in using mobile apps. The ones I use every day are based on age-old phone conventions like being able to send text messages (starting with SMS and now evolving into iMessage, LINE, etc.) and photos.

There’s also DuckDuckGo (a search engine, one of the oldest forms of exploring the Web), Lyft (since I can’t stand Uber), Flickr (for photo backup) and Tumblr (where I do some of my creative writing). There are ways to pay for my coffee (Dunkin’ Donuts and Starbucks) and then there’s Yo, which is a novel way to get updates on RSS feeds, Twitter accounts, etc. Although it started as a gimmick, I think Yo has a lot of potential. There’s Pocket, my favorite. And 1Password, which simplifies so many headaches.

Part of the reason for the paucity of apps on my phone is that I have never been in love with social networking. With Tumblr, I can just publish from time to time and not worry about my real identity. But I steer clear of Facebook and Twitter on mobile since they just demand too much attention for too little return. I use Snapchat but have never used Secret (I don’t get it) or any dating app like Tinder (I’m married).

What is the future of social networking? Bleak, I hope, since it seems to make so many people anxious or unhappy, worrying about what others are doing and keeping track of when certain people are awake or active. I liked this passage from Tyler Brule:

“I have a theory about social media: that it exists not because people are dying to share everything but because of poor urban planning. The reason these channels have developed on the U.S. west coast stems from millions of people being lonely and trapped in sprawling suburbs. Apparently, the Swiss are among the lowest users of social media in Europe. I’d venture that this is due to village life, good public transport and a sense of community.”

In America, for someone born after 1980, there are so many barriers to meeting up with others unless 1) you have a car or 2) you have access to good public transportation. #1 is an issue for the cash-challenged Millennial generation, yet so much of American infrastructure – from sprawling parking complexes to office parks located in the middle of nowhere – assumes the ownership of one. #2 is surprisingly rare – I would venture that one can comfortably be out and about without a car as back-up in exactly two American cities: New York and Chicago.

What fills the void? Social media and messaging apps. Maybe part of my own gravitation away from social media has been the fact that I have lived in one of these two cities for the past 7 years. Plus, no longer being single has also eroded a lot of the youthful fascination that once made, say, Facebook so exciting to use. It’s hard for anyone who joined Facebook after roughly 2006 or 2007 to know what it was like in the early years, when it was all single college students who sent each other Pokes and edited each other’s Walls at will.

Less social media (and storage space – I settled for a 16GB iPhone 6 Plus) has led to a pretty spartan, utilitarian home screen. But it’s also, I suspect, left me happier since I don’t have to keep tabs on others as part of a lonely suburban existence.

“Space Quest 6” and the Internet as space vs. medium

One of the distinctive things about the Internet is that no one used to call it “the Internet.” Throughout much of the 1990s, the act of accessing an IP network (likely over dial-up) was referred to as “going online,” entering “cyberspace,” or encountering “the Net,” “Web,” or just “AOL.” Then there was my favorite: the Information Super Highway.

Networks and history
Computer-driven networks had been evolving for decades by the time that Geocities et al. made them directly accessible to consumers. There wasn’t a monolithic, unified network in development that whole time, though; “the Internet,” in all of its broad meaning, was a latecomer to the networking, software and hardware party that had been going on since electronic computers were invented in the 1940s.

Speaking of the ’40s: I saw Ethernet inventor Bob Metcalfe speak at a conference in D.C. last year, and he half-joked that the Internet began in 1946 with the first electronic computers, making it exactly as old as he was. This quip is instructive, since, on the one hand, it demonstrates how the numerous technological inventions that make today’s Internet possible go back many years, and, on the other, it shows how all of these developments have been retroactively wrapped up in the homogeneous terminology of the “Internet” (so we have “the history of the Internet,” or “pre-Internet,” instead of “the history of networking technologies and capitalistic decision making” or “pre-chip”).

Card-processing networks and travel reservation networks, for instance, were among the disparate networks that emerged throughout the 1970s, as Evgeny Morozov noted in a recent interview. The discursive convergence on the term “Internet” didn’t happen until much later, and was never inevitable. Infrastructure control had to be handed off to the private sector and specific technologies and protocol stacks (like Ethernet and TCP/IP) had to win out over others.

Medium vs space
These days, the Internet is seen primarily as a medium. One might “use” the Internet in the same way she might use a phone line, magazine, or TV. It serves as a means of getting information, i.e., it has literally become “the media,” in a happy coincidence of terminology. But one doesn’t really occupy it; in the popular imagination, there is no longer a spatial quality to it, and talking about “cyberspace” feels anachronistic.

This wasn’t always the case. There was “cyberspace,” sure, but there were also “chat rooms” (another spatial reference) as well as weird artifacts like Apple eWorld that tried to represent connectivity as a traditional community – with buildings corresponding to different tasks – rather than one giant medium (“the Internet”). Even early browsers like Mosaic and Netscape Navigator had names that were spatial, representing a physical collection of objects and a guide through a cyber-landscape, respectively.

None of these modes of connectivity were strange in the 1990s. My favorite example in this mold was the Sierra On-Line adventure game “Space Quest 6: The Spinal Frontier” (hereafter “SQ6”), released, crucially, in 1995, which was the year that Netscape really began to pick up steam and Windows 95 was released. It was also, if I recall correctly, the first year I actually went online.

Space and “Space Quest 6”
SQ6, like its predecessors and most of Sierra On-Line’s games, was a point-and-click adventure, a genre that involves investigating a world, clicking on things, accumulating inventory, talking to people, and solving puzzles. Generally, the gameplay is slow-paced and intellectual. I grew up with these games in the early 90s, installing them from floppy disks, being stumped for hours on puzzles, and then having to order a hint book since GameFAQs didn’t exist yet.

Many of them also had manuals that contained crucial, proprietary hints to puzzles, as a means of copy protection. “Space Quest V: The Next Mutation,” for instance, had a tabloid that included important tips. With SQ6, there was a pack-in magazine called “Popular Janitronics” that you absolutely had to have to complete one of the game’s hardest tasks (creating a homing beacon).

Unlike, say, “King’s Quest VI: Heir Today, Gone Tomorrow” (also from Sierra), SQ6 wasn’t on the technological cutting edge, although I thought it was at the time since it was the first “Space Quest” game to be built for CD-ROM distribution (to get a sense of how big a deal this was: KQ6 was initially available on 12 floppies, and, after a year, on 1 CD). Its graphics were okay and its gameplay standard.

The game’s hero, Roger Wilco, goes to several exotic planets on his quest to save someone named Stellar Santiago. The most memorable sequence for me, though, is when he goes into cyberspace, which looks like this:

[Screenshots: SQ6’s depiction of cyberspace – the information superhighway and the file cabinet interface]

Software, networks, and hardware
These screenshots show how a lot of people in 1995 conceived of “the Internet” (which didn’t really have that label at this time; that noun with the definite article is found nowhere in the game’s dialogue or literature): vast spaces, dotted with highways that carried information and ran past virtual buildings that held online accounts and files. The file cabinet screen grab above is accessed through a menu that looks like a dead ringer for Windows 3.1, which is itself housed inside a trailer. That’s about as cyberspace-in-early-1995 as you can get, and not far off from eWorld, albeit in a Windows-centric universe (I played the game on a Windows 95 machine).

While all of this may seem outdated now, it really isn’t. For starters, Roger goes online not by using a phone or even a PC, but by donning a VR headset that doesn’t look much different from Facebook’s Oculus Rift. Everything old is new again; recycling sci-fi and fictional ideas is both a fascinating aspect and a potential weakness of the tech industry, which has a strange reliance on the entertainment industry for ideas at times when its own poverty of imagination shows through. Plus, the idea of the Internet as a space was never “wrong”; it just lost out to the homogenization that eventually grouped disparate histories in hardware, software and networking infrastructure into one story, as Morozov pointed out:

“For most of the nineties, you still had a multiplicity of different visions, interpretations, anxieties and longings for this new world, and a lot of competing terms for it – virtual reality [note: weird how this one has survived and actually flourished in the discourse of wearable technology], hypertext, World Wide Web, Internet. At some point, the Internet as a medium overtook all of them and became the organizing metacategory, while the others dropped away. What would have changed if we had continued thinking about it as a space rather than as a medium? Questions like these are important. The Net isn’t a timeless, unproblematic category. I want to understand how it became an object of analysis that incorporates all these parallel histories: in hardware, software, state-supported infrastructures, privatization of infrastructures, and strips them of their political, economic and historical contexts to generate a typical origin story: there was an invention – Vint Cerf and DARPA – and it became this fascinating new force with a life of its own. Essentially, that’s our Internet discourse at present.”

He’s asking good questions, and I can’t wait for him to write more books on “the Internet” and its history. Since we’re talking about video games here, though, I might note, on the subject of “origin stories,” that this tendency toward a specific, linear history of “the Internet” – one that scrubs out various continuity errors or false starts – is a lot like something from a comic book or fictional universe, which makes sense. The tech industry at present has considerable overlap with geek culture, which has led it to elevate the Maker movement and the sort of artifact-obsessed outlook that loves clean origin stories rather than messy human dramas.

Wilco and conclusions
Roger Wilco never starred in another official SQ after 1995. Like the rest of Sierra’s adventure gaming franchises, which had thrived as PCs became mainstream in the 1980s and 1990s, it struggled to keep pace with new types of games that sported better graphics, more violence, and online gameplay. The solo, introverted experience of the point-and-click game was no match for attention spans with access to Unreal Tournament and, eventually, Facebook games.

With that transition in mind, it makes sense that SQ6 would see “the Internet” as a bunch of filing cabinets, or an “offline” version of Windows 3.1, for someone to dig through. The notion of the Internet as an actual medium for other people’s information, rather than a quiet library for each individual, implies a broad social connection that computers did not deliver in the mid 1990s and further back.

It’s too bad, in a way. If the Internet were conceived of as a space today, think of the impact such a mindset might have on data collection and privacy – Wilco would have been overwhelmed had he stumbled across the “F” filing cabinet in that building, stuffed as it would be now with Facebook data. Or the “N” (NSA) or “U” (Uber) cabinets. Maybe it’s time to bring “cyberspace” back, if only as a semantic nod to there being real consequences for data collection and online screeds.

The Internet and The Physical World

A 2004 album reimagined for the iPhone 6 Plus lock screen

Look out: Death From Above
Last year, Canadian band Death From Above 1979 (their name, if you’re curious, was created at the last minute so as to dodge legal action from DFA Records) released a record called “The Physical World.” It came 10 years after their only other record, 2004’s “You’re a Woman, I’m a Machine.” In the intervening years, I had attended college, moved from Providence to Chicago and gone through a slew of jobs en route to my current gig. The band didn’t know these facts, of course; the record sounds like it could have been recorded back during that same autumn as the debut, when George W. Bush was facing off against John Kerry in the U.S. presidential election.

In 2004, if I wanted to explore music, I would take the 30-minute walk from my dorm to the Newbury Comics in the city mall. Peer-to-peer programs like Ares were available for downloading MP3s for free, but I didn’t want to risk it on the university network. I saw a lone copy of “You’re a Woman, I’m a Machine” one day and picked it up, having really only heard the band’s name on Pitchfork, not intending to buy it when I went down there, and only nudged into doing so by seeing it at that moment.

By 2014, this mix of ritual – the walk downtown with iPod in tow – and impulsiveness seems ancient. Finding “The Physical World” on the Internet, legally or otherwise, takes seconds. The only chance to “bump into” it, like one would in a record store, is now limited to seeing it in a YouTube sidebar or having it come up after many other similar-sounding songs on a socially curated Spotify playlist.

If nothing else, the Internet – if there really is any single, organic “Internet,” rather than just an amalgam of the globe-spanning properties of American companies like Google and Facebook, bankrolled by advertising dollars and venture capital, and threatening professional death from above for publishers and artists everywhere – has in such ways offered to replace many of our social experiences with what basically amount to simulations. Often, words like “easy,” “convenient,” and “at your fingertips” justify the change – don’t walk to the record store, here’s everything Death From Above 1979 have ever recorded, right at your fingertips!

“Social”: What came after 1979
But how social is the Internet? The question comes off as both tone-deaf (where have you been during the last 10+ years of social media?) and Ted Stevens-y (he once called the Internet a series of tubes, which was widely lampooned but accurate in a strange way). The social dimension of the Internet – its impact on conversations, sharing, etc. – seems undeniable.

I recently listened to the first episode of the podcast “Upvoted,” from reddit, the self-proclaimed front page of the Internet. The story was about a man named Dante, who had gone to prison for drug offenses, getting a much shorter sentence than he expected after a right-wing judge presiding over his proceedings was injured and replaced by a Clinton appointee. During his time in prison, he mastered drawing and sometimes sketched out what an iPhone looked like for prisoners who had been incarcerated so long that their last experience with a computer was via Windows 95.

Near the end of the podcast, one of Dante’s friends talked about how justice was not meted out equally, not only across demographics but across Internet users. He asserted that kids who were less social and who didn’t have a lot of friends but instead hung out all day on the Internet were somehow at greater risk of punishment. I thought:

  • Isn’t the entire Internet “social?” Isn’t that what has driven so many startups to record-setting valuations and fueled the ambitions of Facebook to connect every last person in the world to a website? Isn’t its difference from the physical world the notion that anyone and everyone is just a tap away, rather than cordoned-off from communications or in a faraway place? Isn’t the presence of these so-called awkward kids on a website like reddit (of all places) just the digital version of an analog community (to use a stupid digital dualism crutch) and somewhat of a problem for labeling these people as “not social”?
  • What if, though, that guy from the podcast was right – that whatever “social” experience the Internet was providing wasn’t ultimately an equivalent of, nor a replacement for, what had come before in terms of “social” – the in-person social activities, or even the private rituals like record buying? What if the Internet had just as much reinforced the positions of the naturally sociable (in much the same way that it has come to entrench huge corporations, the top 1 percent of music artists, and millionaires and billionaires more generally) as it had given introverts/shy nerds/whatever label you like more freedom? What if all of the Internet’s activities really were just simulations that couldn’t overcome issues like inequity in justice?

“Upvoted,” from that same lock screen

The 1979 in Death From Above 1979’s name is the year before the Millennial generation is generally agreed to begin. People born from 1980 onward came of age at the same time as any number of Internet-reliant technologies. For me, born in 1986, it was the Web browser, which came into its own when I was about 10 years old, paving the way for social networks just a few years later.

The first social network I used was naturally MySpace, then Facebook in July 2004, not long before “You’re a Woman, I’m a Machine” came out. I guess I’m one of the earliest users of Facebook and have explored its features more than most (e.g., using Skype to see the entire News Feed, not just the EdgeRank-filtered results). All of this expertise and experience has done nothing to make me a “social” person in the physical world (“real life,” I guess, though I don’t like that phrase since it has so much baggage). My time on Facebook, in other words, hasn’t given me the social high or prestige that I would need to avoid what that one podcast speaker had deemed the demographic disadvantage of shy, Internet-addicted kids.

Everything on Facebook isn’t really the physical me. I don’t make long speeches in person that are equivalent to my Facebook comments. I don’t leer at faces the way I stare at images. I don’t try to find out what news articles, lists, and videos someone at the restaurant I’m in is interested in. I don’t have anything resembling a “network” (in recruiter-speak) of actual, contactable people that maps to my list of Facebook “friends.”

The same mostly holds for reddit. Reading posts in the Bitcoin and Nintendo subreddits is a way to waste time rather than a reflection of what I really think about when I’m out walking or in bed. I would never make some of the comments I have made were the interlocutor standing in front of me (this is the tragedy of Internet comments, which are still good for something, though).

You’re a Man, I’m an Internet Social Network
For someone who is not naturally social or sociable, the Internet – in this case, social media sites and forums like the ones discussed here – can be dispiriting. It’s possible to make new friends or relationships on the Internet (I met my spouse this way, after all) but it’s also possible to have a good email exchange or emailed job application torpedoed once other forms of communication – a phone call or meet-up – enter the picture. The latter example deserves a post all of its own, but I’ll just say that Internet job postings paradoxically give everyone and no one a chance – volume is often so high that candidates who have put in more legwork in the physical world – met the right people, gone to the right seminars – are the ones best able to differentiate themselves.

Likewise, having scores of LinkedIn contacts or Facebook friends doesn’t necessarily give one an advantage in physical world situations in which cronyism, who-do-you-know, it’s-always-been-like-this, and you-can’t-sit-with-us still rule the day. And then there’s the way in which a friend’s Facebook photo at some famous monument makes us feel like we’re missing out (on physical activities and places, mostly), or some listicle about how we all need to be more “spontaneous” (i.e., insane), which of course would require a lot of activity beyond just being on the Internet all day – despite its often-cited deep “social” character.

It feels like the Internet is still a poor map of the physical world and many of the behaviors – secret meetings, hard labor, conversations that involve more than texts and “…” [this person is typing] balloons – that made it the way it was. This includes even “inefficient” processes like walking to some store to buy a Death From Above 1979 album (or, even further back, a copy of Windows 95!) – the time I spent doing that is now “saved” so that I can just waste it straight away on BuzzFeed or getting to the top of the Twitter stream. Moreover, by only giving us, in most cases (not all), simulations, it really can subtly weaken people who aren’t predisposed to being social, by giving them the illusion that they can change (“disrupt” would be the cliché word choice here) things and get ahead, when they’d probably have a better chance of doing so by just taking a walk outside and buying whatever they wanted to.

I can still listen to “You’re a Woman, I’m a Machine” anywhere I go, just as I can do with “The Physical World.” If I hadn’t had the longwinded physical world experience of the former, though, who knows if the band or album would be special to me at all a decade later, or if I would have taken the 30 minutes to write this…

Facebook’s strategy revealed in creepy bus stop ad

I was going to lunch today, walking along Lake St. in Chicago to a Jimmy John’s (conveniently next to a Starbucks, where I would get a coffee afterward) and I saw a Facebook ad. No, not one of those “Save 20% on designer shoes” or “Buy Dawg Pound merchandise here” in-stream shills, but a real, physical banner on a bus stop. It looked like this:

[Photo: the Facebook Messenger bus stop ad]

Kinda creepy. It was for Facebook Messenger, the app that was recently split off from Facebook proper on mobile (though you can still use all the features of Facebook together in the convenient mobile Web app) and that now has 500 million users. There’s not much Facebook branding here, really, which I think is intentional. Messenger is meant to be something as basic and habitual as text messaging or IM clients were just a few years ago. Facebook’s enormous databases – your friends, your profile, your history – are just the back-end, the magic behind the scenes.

I thought about calling this post “Facebook-as-a-service” (you can see it in the slug still) in a cheeky way, since the “as-a-service” moniker is most often applied to resources like servers and software that are delivered to customers on-demand, without the need to install anything. Messenger seems like just another app – a LINE or WhatsApp clone – but it’s being marketed as a way to do Facebook without really being “on Facebook,” i.e., scrolling through News Feed chaos. In that way, it resembles infrastructure- and software-as-a-service, which let you get more computing power and packaged applications without dealing with the mess of equipment management or software downloads.

Also, this ad is one of the only ones I’ve ever seen with a Windows Phone rectangle next to the App Store and Google Play equivalents. The sticker centerpiece is strange but overall the ad seems at least as effective as all those in-stream ones I’m missing out on by using AdBlock.