The title of the book is a bit misleading; it contains a collection of thought-provoking essays rather than a coherent statement of principle. But it’s worth reading closely because it’s an insightful critique of current techno-culture from a prominent technologist who helped create that culture.
Jaron Lanier is one of the pioneers of virtual reality technology. He has worked on human-computer interface research and design in Silicon Valley for decades. When he discusses the dark side of social media, social software, and the Web 2.0 ethos, he does so with an insider’s eye and an enthusiast’s attention to detail.
If you’re looking for a thinker to challenge the current techno-centric zeitgeist championed by futurists like Ray Kurzweil and Clay Shirky, Jaron Lanier is your man.
Lanier touches on many different topics relating to culture and technology in this series of essays. But if I had to pick the three themes that most captured my imagination, they would be these: the way anonymous social production devalues individual creative work, the way social software flattens human relationships, and the ideology of computationalism.
To understand fully the arguments Lanier is making, you’ll need to read the book itself. But I’ll try to summarize the key points here, partly as a reminder to myself.
One of Lanier’s central themes is that the ethos of anonymous social production devalues the contributions of creative individuals. Contributing ideas and artifacts anonymously often means that the significant contributions to knowledge and culture made by individuals go unacknowledged.
If people aren’t rewarded for their efforts somehow, most will stop participating. Only the tiny minority who find the act of creating new music or books or software intrinsically rewarding will continue to do so. But these can be expensive hobbies. How will those individuals pay their bills?
And if fewer folks make original contributions to culture or knowledge, then online culture might degenerate into nothing more than remixing or repeating previous material.
The praise lavished upon the current crop of social media and social software tools must be galling for a researcher who has spent his entire career inventing technologies to deepen human interactions on the Internet.
Lanier provides many examples of how these “social” systems dumb down human thought and opinion and eliminate any nuance in relationships.
What does it mean to boil down all interpersonal connections to the single act of “friending” or “following”? This reduces human relationships to the equivalent of a second-grader’s hastily scribbled “Do you like me? Check this box!” note.
Other means of social or collaborative filtering are just as primitive. The advantage of simple thumbs-up/thumbs-down rating systems is that the results can be easily aggregated and calculated by computers. The results might help gauge popularity, but lack the depth and context that human reviews provide.
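To make that trade-off concrete, here is a minimal illustrative sketch (my own toy example, not from the book) of why thumbs-based aggregation is so computationally convenient, and what it discards:

```python
# Each vote is just up (True) or down (False) -- hypothetical sample data.
votes = [True, True, False, True, False, True]

def net_score(votes):
    """Collapse all opinions into one number: upvotes minus downvotes."""
    ups = sum(votes)
    downs = len(votes) - ups
    return ups - downs

# Trivial for a machine to compute, compare, and rank by.
print(net_score(votes))  # prints 2

# What the number cannot carry: *why* anyone voted the way they did --
# the context and nuance a written review would have preserved.
```

The design choice is exactly the one Lanier criticizes: the representation is chosen for the convenience of the aggregating machine, not for the richness of the human judgment it encodes.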
Many of these tools, with their profiles and avatars, make self-definition a conscious and deliberate act. In the past, only public figures and celebrities needed to manage their identities carefully. Now we all do.
Worse, these constructed identities tell us much more about our tools than they do about ourselves. We stuff our resumes with keywords so that search engines can find them. We tailor our public profiles, leaving out gender or age or location, so we won’t be deluged with advertising. It’s not about who we are. It’s not even about how we want other people to understand us. It’s about how we want our computers to see us.
Lanier devotes several large sections of You Are Not a Gadget to deconstructing the idea of computationalism. There are several strands to this belief, the primary tenet of which is that the human brain processes information like an ideal Turing Machine would.
If human thought is computable, then it is only a matter of time before computers become humanly intelligent.
This goal has become the holy grail of the Artificial Intelligence community. But Lanier points out that today’s hardware and software have limitations that cause them to deviate from an ideal Turing Machine. (It’s not physically possible to construct one, so it’s less a real “machine” than a thought experiment carried out by Alan Turing, for whom it is named.)
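For readers who haven’t met the idea before, a Turing Machine can be simulated in a few lines of code. This toy sketch (mine, not Lanier’s) makes the idealization visible: the model assumes an infinite tape, which the program fakes with a dictionary that grows on demand, something no physical computer can actually provide:

```python
# A minimal Turing Machine simulator -- an illustrative toy, not a real
# implementation of anything in the book.

def run_turing_machine(program, tape_input, start_state="start"):
    tape = dict(enumerate(tape_input))  # sparse stand-in for an infinite tape
    state, head = start_state, 0
    while state != "halt":
        symbol = tape.get(head, "_")            # "_" means a blank cell
        write, move, state = program[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Example program: flip every bit, halting at the first blank cell.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(flip, "1011"))  # prints 0100
```

The gap between this idealized model and any buildable machine is part of Lanier’s point: arguments that the brain “is” a Turing Machine lean on an abstraction that nothing physical fully realizes.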
But even if you could work around the technical limitations, it’s not clear that human intelligence is computable at all. The brain-as-computer analogy might be completely false. Even though processing power keeps getting cheaper and more plentiful, machine intelligence may simply be a different thing from human intelligence. Computers will never catch up to people; they are running a different race.
Lanier notes that despite a few well-publicized successes, progress in artificial intelligence has been slow, and its definition of success has shifted over time. Yet it’s still an article of faith in Silicon Valley. It is the motivating philosophy behind Google and hundreds of other companies there, and it infuses the culture of the technology industry in general.
He worries that the drive to create artificial intelligence harms our understanding of the human kind. In the tech industry’s rush to build and promote machine intelligence, it could accidentally create solutions that constrain individual creativity, thought and judgment.
Worse, the notion of computationalism contributes to the sense that computers and people are interchangeable. If you only had a big enough server farm in the cloud, the thinking goes, you could replace most of what people do with their brains. And the bits that computers can’t calculate or simulate could be gathered or harvested by aggregating the actions of billions of anonymous users online. Humans become just another input device to the computing cloud.
While the idea that an individual is just one component of the vast collective consciousness of the Internet might be appealing to some, it’s a horrifying proposition for humanists that celebrate individual creativity and initiative.
Lanier reminds us that our technologies have cultural implications. We shape our tools, but they also shape us. We should be mindful of the effects that these tools will have on the way we think and interact.
We are not gadgets, he says, and we should be skeptical of technologies that treat us that way.
Copyright 2013 Infovark, Inc. All rights reserved.