The Technological Pendulum and the Reconsidered Seventies

From Alex's weekly letter. Sign up here!

For a while, they were a hot item here in the Bay Area, those little "No Glass" stickers. The design was simple: a white background with an eye and a rough Google Glass display piece outlined in black, crossed by a red "no" circle. Though I only saw about a dozen in the wild myself, rumors and banter suggest many more are circulating out there.

I'm just back from giving a talk at SXSW on futurism and staying current, so technology's on my mind. BoingBoing posted a piece the other day, headlined "Google Glass chief 'amazed' by privacy issues that helped kill his project." [LINK] In it, Astro Teller, chief of Google's now-defunct Glass project, delivers this remarkable quote:

"I'm amazed by how sensitively people responded to some of the privacy issues," Teller explains, expressing frustration about the backlash against Glass in public, given the prevalence of mobile video. "When someone walks into a bar wearing Glass... there are video cameras all over that bar recording everything. ā€¦ Iā€™m not making an apology for Google Glass. Google Glass did not move the needle... it was literally a rounding error on the number of cameras in your life."

The disconnect here is stunning. Leaving aside the silly contention that every bar has video surveillance, and even granting that embedded cameras in phones and other tech are commonplace, Teller clearly fails to understand how Glass made people feel.

Putting a camera in front of our eyes and pointing it at someone makes them feel targeted in an unambiguous way that phones do not. It is, within our social contract, socially aggressive. As a piece of design, it turns every user into a paparazzo, invading the lives of people who never sought to be celebrities. It's mind-boggling that no one at Google seems to have noticed this, or cared enough to try to soften that first impression.

All of the negative effects fall on the people the user is looking at (unless they retaliate against the user). Without their permission, the user has changed the social contract in which they exist. That's obnoxious at best. As Jan Chipchase put it: [LINK]

"Today we assume that our conversations and our image is not by default recorded by other people in proximity Not having a persistent record allows us to present a nuanced identity to different people, or groups of people, provides with the space to experiment with what we could be. The risk that what we say will be broadcast, narrowcasted to people we donā€™t know, or may underpin someoneā€™s future business fundamentally changes what we want to talk about. The challenge for Glass is that the costs of ownership falls on people in proximity of the wearer, and that its benefits have yet to be proven out."

From its skydiving unveiling to its selection of beta-owners to its aesthetic, the project screamed classism at a moment when the juxtaposition of precarity and privilege is particularly alarming in America. More: to many, Glass landed as a public "f*** you," at a moment when anger about tech-driven economic inequality and the rise of a selfish and anti-public gentry was just starting to crest (we're nowhere near the end of that, obviously). Add to all that revelations about tech companies' willing participation in NSA mass surveillance, and the whole thing felt to many like the anti-iPhone: not open, but prying; not democratic, but oligarchic; not a platform so much as an occupation.

This is not (merely) a marketing question. If Glass had been rolled out exclusively to an audience of beloved teachers, librarians and social workers, it still would have felt creepy. It is a creepy design.

The problem was one of design intention. It's next to impossible to design great consumer tools if you lack interest in, and insight into, people and culture. No matter how smart you are, assuming the awesomeness of your hack will blow aside social mores and bend people's behavior to your will is a dumb design strategy. Understanding when something's creepy is not rocket science. If you can't sense it, hire (and listen to) someone who can.

Though for business models based on being creepy, that may not help. I think we are in the early stages of a major pendulum swing in the culture of technology, back towards active public engagement with (and intervention in) technological development. I see signs of this everywhere: the slogan "If you don't know what the product is, the product is you"; increasing skepticism of the hype curve around new technologies like robocars and drones; hostility towards the slimy business practices revealed at companies like Uber; growing awareness that employment in "the gig economy," life "below the API," is destabilizing already precarious middle-class lives on a pretty massive scale... the list goes on.

In cities, this is about to reach a crisis point, as people begin to understand that much of the rhetoric of smart cities, the Internet of Things, sharing economies and civic collaboration has been repurposed to support a number of attempted (and soon-to-be-attempted) systemic lock-ins and "land grabs": the claiming of modes of thought and action as private intellectual property. Selling people a piece of infrastructure is not nearly as profitable as owning and licensing the rights to the idea of it. At its worst, we could see a sort of public-works patent trolling. (This is one of the reasons I've written about the importance of prior art.)

Okay, now I'm way off in the weeds. More about something completely different next week...