Bill actually humored me. The retort would eventually come, however, intoned in an inimitable Tennessee twang, “Jaron, I tried. But it’s coming out dark.”
Of course it was all in fun. I knew Bill wasn’t about to listen to me about lightening up—what a literary disaster that would have been!
Some decades later there are days when the world does seem to be plowing right into one of Bill’s novels. But the story is not over. It has barely begun.
The ’Net Is Watching
Worries about who can see what on Facebook, or whether it’s safe to enter a password while on the café Wi-Fi, are going to be superseded by serious questions about how well people understand the implications of their most basic activities, such as walking around. Paranoia is only getting started.
Around the turn of the century Google bought a little startup from a few people including me. It turned into the seed of the machine
vision part of the company, including such initiatives as Google Goggles. I mention this to make clear that I am not writing about some remote “them” but about a world I have helped to create.
Among many other tricks, good machine vision can track where people are whenever they’re in view of a ’net-connected device with a camera. Recognizing a face, for instance, or analyzing the gait (the characteristic motion of a person’s walk) can do the trick. It’s getting to be unusual to not be in view of such a device when in public in a city.
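To make the mechanism concrete, here is a minimal sketch of the detection step that such tracking rests on, using OpenCV’s stock Haar-cascade face model. It is only an illustration of how little code the basic building block takes; a real tracking system layers recognition, re-identification across feeds, and gait analysis on top of this, and the camera index and model file here are just the library defaults.

```python
# Minimal face-detection sketch using OpenCV's bundled Haar cascade.
# This only finds faces in a camera feed; a tracking system would also
# match each face against known identities and follow them across feeds.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # 0 = the machine's default camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```

The point is not the twenty lines themselves but what happens when thousands of such feeds are wired into one server.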
Machine vision has massive creepiness potential. Weren’t wars fought and many lives lost precisely to prevent governments from gaining this kind of power, knowing where everyone is all the time? And yet now, because of some cultural trends, we’re suddenly happy to offer exactly the same power to a few companies in California, along with whoever will come along with enough money to piggyback on them.
Long ago, working on the movie Minority Report, I proposed that billboards might grab the hero’s face and implant it in the ads. Then he’d never be able to run away from the police, because they could just watch to see where he popped up in billboards. State surveillance without the state having to lift a finger! That’s a classic Siren Server ploy, keeping at arm’s length and yet enjoying information superiority. I even made a demo at the time.
Sure enough, Facebook is now bringing your friends’ faces into ads, and location-based services are already targeting ads on mobile phones to specific people when they go to specific places.
Where is it all leading? The trend is toward ever more creepiness. Any of the technologies I described earlier that might put masses of people out of work will also have vivid creepiness potential. The more a society bases itself on the wrong model of automated “efficiency,” the more potential there is for sudden outbreaks of evil. There will be more players who are motivated to act outside of a social contract.
A photo that happens to include a view of someone’s key chain resting on a café table might provide enough data to replicate copies of the keys. 3D printers might also be used to create parts for bombs, restraints, devices of torture, or other cruel props. (A pro-gun-rights group is already distributing “open source” gun-printing files online.) Self-driving cabs might someday be hacked to hit pedestrians and run, or deliver car bombs in coordinated attacks, or kidnap someone who was expecting a quick ride.
Despite their real potential for harm, I remain of the opinion that these tools are just tools. There is nothing inherently evil in a machine vision algorithm. However, it is also inadequate to say that the only level on which to address the ethical use of tools is personal responsibility.
No, what we have to look at is economic incentives. There can never be enough police to shut down activities that align with economic motives. This is why prohibitions don’t work. No amount of regulation can keep up with perverse incentives, given the pace of innovation. This is also why almost no one was prosecuted for financial fraud connected with the Great Recession.
The only effective point to intervene, to fight creepiness, is in the fundamental economic model. If the economic model tends to bring out noncreepy developments, then only true creeps will want to be creepy. True creeps will then be rare enough to be treated as a law enforcement problem. There will always be a few sociopaths and more than a few teenagers going through a phase, but society has always had to deal with those challenges. Legit companies and professionals should not be motivated to go creepy.
The long-term goal of a security strategy, for instance, cannot be to outsmart criminals, since that will only breed smart criminals. (In the short term, there are plenty of tactical occasions when one must struggle to outsmart bad guys, of course.)
The strategic goal has to be to change the game theory landscape so that the motivations for creepiness are reduced. That is the very essence of the game of civilization.
Some Good Reasons to Be Tracked by the Cloud
Given the way networks are structured now, one reaction to creepiness might be to pull back from connecting to cloud software. You might be tempted to go off the grid as much as possible to not be
tracked. That would be a shame, because there are real benefits to using cloud computing, and there will be more and more benefits in the future.
People already routinely tap “yes” to allow tracking options in their phones, and then expect the cloud to recommend nearby restaurants, keep track of their jogging, and warn about where the nearby traffic jams have formed. Could there be even more compelling reasons to accept being tracked, and being observed by remote algorithms in computer clouds? Yes, there will be many good reasons. I gave one earlier: knowing your carbon footprint moment to moment.
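As a toy illustration of that last example, a moment-to-moment carbon footprint is little more than a cumulative sum over sensed trips. Here is a minimal sketch; the emission factors and the (mode, km) trip format are made-up placeholders, not a real methodology.

```python
# Toy running carbon-footprint tally from sensed trips.
# The emission factors (kg CO2 per passenger-km) and the (mode, km) trip
# format are illustrative assumptions, not a real methodology.
EMISSION_KG_PER_KM = {"walk": 0.0, "bike": 0.0, "car": 0.17,
                      "bus": 0.10, "train": 0.04}

def trip_kg(mode: str, km: float) -> float:
    """CO2 for one sensed trip; unknown modes fall back to the car factor."""
    return km * EMISSION_KG_PER_KM.get(mode, EMISSION_KG_PER_KM["car"])

def running_footprint(trips):
    """trips: iterable of (mode, km) pairs inferred from a phone's sensors.
    Yields the cumulative kg of CO2 after each trip, moment to moment."""
    total = 0.0
    for mode, km in trips:
        total += trip_kg(mode, km)
        yield round(total, 3)

# Example day: walk to the station, train to work, taxi home.
print(list(running_footprint([("walk", 1.2), ("train", 14.0), ("car", 15.5)])))
```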
Other examples will come about because of Mixed, or Augmented, Reality. This is a technology that brings Virtual Reality into the everyday physical world. A typical way it might work is that your sunglasses would gain the ability to add an illusion of virtual stuff placed in the physical world. The glasses might reveal something about a flower as you walk by a garden in springtime. The compatible pollinating insect could gain an annotated halo.
Seeing the living world annotated with what science has been able to learn about organisms and their interdependencies is going to become a new common joy. I’ve been able to experience it in Mixed Reality research and it’s really wonderful. Augmenting nature might at first seem to miss the point, but it is also a way to see it in a new light without disturbing it. Don’t worry about losing track of the beauty of the real world. Virtuality only makes reality look better in comparison.
Maybe the birds and bees don’t excite you, but something probably would. Another potentially beneficial reason to be tracked will be keeping your life experiences with you.
For instance, suppose you once understood a tricky technical principle when a friend explained it to you, but years later understanding eludes you. Replaying the experience and circumstances of the initial exchange with your friend—perhaps using your Mixed Reality sunglasses—would be the most superb reminder.
There is no access to your memories except through resonance with either immediate experience or internal experience of related
memories. While technology can’t yet record and replay your inner mental state, it can record a lot about what you’ve sensed and done. That record can be made replayable in order to provide a wonderfully rich trove of mnemonics.
Replaying aspects of an older experience is a vivid prod to the mind, awakening dormant thoughts, sensations, emotions, and even talents. A general tool for re-creating old multisensory circumstance would open up memories, skills, and insights that would otherwise be obscure to you, even though you always carry them around hidden somewhere in your mind.
You’d dip back into your own past on those occasions when you get stuck but think there might be a clue hiding in your cortex somewhere. Where did I find that recipe? Haven’t I gotten into a similar fight with my boyfriend before?
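A minimal sketch of how such memory prompts might be indexed and retrieved, assuming a simple log of timestamped, tagged experience records; the record schema and the overlap-based “resonance” score are illustrative assumptions, not any shipping product’s design.

```python
# Toy "externalized memory" index: timestamped experience records retrieved
# by resonance with a present-moment cue. The record schema and the
# overlap-based scoring are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Moment:
    timestamp: float                         # seconds since epoch
    place: str                               # e.g. "kitchen", "cafe on 5th"
    people: set = field(default_factory=set)
    tags: set = field(default_factory=set)   # topics, objects seen, words heard
    media: str = ""                          # path to the recorded clip

def resonance(moment, cue):
    """Score a stored moment by how many cue words it shares."""
    return len(cue & (moment.tags | moment.people | {moment.place}))

def recall(log, cue, k=3):
    """Return the k most resonant stored moments for the current cue."""
    return sorted(log, key=lambda m: resonance(m, cue), reverse=True)[:k]

# Example: "Haven't I gotten into a similar fight before?"
log = [
    Moment(1.6e9, "kitchen", {"boyfriend"}, {"argument", "dishes"}, "clip014.mp4"),
    Moment(1.7e9, "office", {"friend"}, {"fourier", "transform"}, "clip231.mp4"),
]
for m in recall(log, {"argument", "boyfriend"}, k=1):
    print(m.media, m.place)
```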
While I am convinced this type of design would bring wonderful value to many people, it wouldn’t be for everyone. I might not use it. Possibly because my mother died when I was young, I’ve developed a cognitive style in which I forget a lot of what’s happened, and try to only retain what seems to be most important and what works best for me. There’s absolutely no reason to expect every information tech design to be right for everyone. The more powerful that personal information tech becomes, the more variety we should expect to see, unless we hope to dumb down our species to have limited variation in cognitive style.
David Gelernter’s “Lifestreams”[2] was one early stab at thinking about saving what can be collected of life’s memories. My colleague Gordon Bell at Microsoft Research is another pioneer of this approach to personal information systems.[3] David and I even had a consulting company together for a while, trying to get clients to try this approach to supporting human cognition.
In all this foundational work, the emphasis was on personal benefit. Of course, we all understood there would be creepiness potential. Alas, the real world is on the path toward creepiness.
Companies like Facebook organize many people’s digital memories for the benefit of remote clients who want to manipulate what’s put in front of those people. Fortunately, this commercial development has come about before devices that can collect really intimate information are widely available. We still have time to get this right.
The Creepiness Is Not in the Tech, but in the Power We Grant to Siren Servers
Mixed Reality can get as creepy as any other advanced information technology, but the creepy part is how Siren Servers might make use of it. The technology becomes quite creepy indeed when another party is the manager and proprietor of your “externalized memories.” It had never occurred to me back in my twenties that people would someday find it cool and hip to give that power to remote corporations.
To add to the earlier glimpses of what a creeped-out version of Mixed Reality might be like, imagine a situation where a young man returns from college and wants to reexperience his old room as he left it, before his parents turned it into a guest room. He puts his eyewear on, and a message hovers, “To recall your old room, you must check this box accepting Company X’s latest changes to its privacy policy, and agree to use the company’s services for personal navigation for a year, and agree to publish the book you’re working on through the company’s store. Otherwise, good-bye old room.”
The online space feels a little creepier, a little less under individual control, every time a user is asked to acquiesce to a bunch of fine print no one reads. The reason no one reads the fine print is that even if you do take the time, there will soon be a new revision, and you’d have to make reading the stupid EULAs a full-time job. In those cases where the user is given more than an all-or-nothing choice, the options become so complex, and so dynamic, that once again it would be a full-time job just to manage the settings. This is what happens with privacy settings on Facebook. It’s become a smug, geeky achievement—with bragging rights—to be able to manage them well.
The reason people click “yes” is not that they understand what they’re doing, but that it is the only viable option other than boycotting a company in general, which is getting harder to do. It’s yet another example of the way digital modernity resembles soft blackmail.
Maslow’s Pyramid of Blackmail
Information technology changes the baseline of expectations as we go about our lives. It’s not yet possible to use Mixed Reality to reexplore the room you grew up in; one could argue that being denied access to that re-creation would not really be a serious problem. But once people are used to having an information service, their cognitive style and capacity become molded by the availability of the service. To remove it later is a serious matter. So while it might not seem important today, someday it could become deeply disruptive if unseen third parties are able to manipulate virtual environments, like blocking a re-creation of a childhood room, in order to manipulate you.
This is not only a personal problem. What if the real-world signage of a store was obscured when people looked at it through popular eyewear—perhaps as retaliation because the proprietors were not paid up on some future review or check-in service?
It would also be both creepy and sad if the virtual things you saw were not seen by your friends or family because you were all locked into different contracts with opposing business empires.
It’s bad enough that people can’t share apps because of which phone or carrier they got locked into, but it will be worse if people can’t see the same augmentations of the world they otherwise share.
The Weird Logic of Extreme Creepiness
Creepy concerns become weirdly intertwined and transformed when they are extrapolated to extremes. For instance, if we come to be utterly unconcerned about privacy, identity theft risks will be mooted. If everyone were under constant surveillance, each person would present a single, consistent, imperturbable continuity of identity and there would be no possibility of identity theft. A person whose identity was stolen would seem to suddenly split in two, or leap at the speed of light to a different location. Someone somewhere would always be looking, so you couldn’t get away with that sort of thing anymore.
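That “split in two” logic can be made concrete: under saturated surveillance, an identity theft would show up as a physically impossible jump between consecutive sightings of the same identity. A minimal sketch, assuming a made-up plausibility ceiling of roughly airliner speed:

```python
# Flag "identity splits": consecutive sightings of one identity that no
# single person could physically connect. The 1,000 km/h ceiling (roughly
# airliner speed) is an assumed threshold, not a standard.
from math import radians, sin, cos, asin, sqrt

MAX_PLAUSIBLE_KMH = 1000.0

def km_between(a, b):
    """Great-circle (haversine) distance in km between (lat, lon) points."""
    (la1, lo1), (la2, lo2) = a, b
    h = (sin(radians(la2 - la1) / 2) ** 2
         + cos(radians(la1)) * cos(radians(la2))
         * sin(radians(lo2 - lo1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))

def impossible_jumps(sightings):
    """sightings: time-ordered list of (timestamp_s, (lat, lon)).
    Yields consecutive pairs implying impossible travel, i.e. a likely
    stolen or duplicated identity."""
    for (t0, p0), (t1, p1) in zip(sightings, sightings[1:]):
        km = km_between(p0, p1)
        hours = (t1 - t0) / 3600
        if km > 0 and (hours <= 0 or km / hours > MAX_PLAUSIBLE_KMH):
            yield (t0, p0), (t1, p1)

# Example: a sighting in Berlin one minute after one in San Francisco.
sf, berlin = (37.77, -122.42), (52.52, 13.40)
print(list(impossible_jumps([(0, sf), (60, berlin)])))
```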
One aspect of “identity” is to secure unique access to one’s assets. But why worry about whether someone stole your guitar/bicycle/shoes if any 3D printer can just spit out replacements?
What if everyone were really able to spy on everyone equally? Some believe they see this state of affairs emerging in today’s Internet, though it is not so. Players are actually highly segregated by technical ability, and, for big players, by ownership of central, privileged servers with closed internal data, and by control of other people’s connectivity. But if it were true that we could all spy on each other equally, then some utopian thoughts would become possible. Perhaps there would be more of a sense of privacy in big numbers. At some point no one would care anymore if a congressman tweeted a picture of his penis. Yawn. When people don’t care enough to look, then privacy will be restored. This is a common hope in the “transparency” movement.