
The Mental Health of Smart Cities

Not the mental health of the people living in smart cities, but that of the cities themselves. Why not? We are building smart cities to be able to sense, think, and act; their perceptions, thoughts, and actions won't be remotely human, or even biological, but that doesn't make them any less real.

Cities can monitor themselves with an unprecedented level of coverage and detail, from cameras to government records to the wireless information flow permeating the air. But these perceptions will be very weakly integrated, as information flows slowly, if at all, between organizational units and social groups. Will the air quality sensors in a hospital be able to convince most traffic to be rerouted further away until rush hour passes? Will the city be able to cross-reference crime and health records with the distribution of different businesses, and offer tax credits to, say, grocery stores opening in a place that needs them? When a camera sees you having trouble, will the city know who you are, what's happening to you, and who it should call?

This isn't a technological limitation. It comes from the way our institutions and businesses are set up, which is in turn reflected in our processes and infrastructure. The only exception in most parts of the world is security, particularly against terrorists and other rare but high-profile crimes. Organizations like the NSA or the Department of Homeland Security (and its myriad partly overlapping versions both within and outside the United States) cross through institutional barriers, most legal regulations, and even the distinction between the public and the private in a way that nothing else does.

The city has multiple fields of partial awareness, but they are only integrated when it comes to perceiving threats. Extrapolating an overused psychological term, isn't this a heuristic definition of paranoia? The part of the city's mind that deals with traffic and the part that deals with health will speak with each other slowly and seldom, the part that manages taxes with the one that sees the world through the electrical grid. But when scared, and the city is scared very often, close to every day, all of its senses and muscles will snap together in fear. Every scrap of information correlated in central databases, every camera and sensor searching for suspects, all services following a single coordinated plan.

For comparison, shopping malls are built to distract and cocoon us, to put us in the perfect mood to buy. So smart shopping malls see us as customers: they track where we are, where we're going, what we looked at, what we bought. They try to redirect us to places where we'll spend more money, ideally away from the doors. It's a feeling you can notice even in the most primitive "dumb" mall: the very shape of the space is built as a machine to do this. Computers and sensors only heighten this awareness; not your awareness of the space, but the space's awareness of you.

We're building our smart cities in a different direction. We're making them see us as elements needing to get from point A to point B as quickly as possible, taking little or no care of what's going on at either end... except when it sees us as potential threats, and it never sees or thinks as clearly or as quickly as it does then. Much of the mind of the city takes the form of mobile services from large global companies that seldom interact locally with each other, much less with the civic fabric itself. Everything only snaps together when an alert is raised and, for the first time, we see what the city can do when it wakes up and its sensors and algorithms, its departments and infrastructure, are at least attempting to work in coordination toward a single end.

The city as a whole has no separate concept of what a person is, no way of tracing you through its perceptions and memories of your movements, actions, and context except when you're a threat. As a whole, it knows of "persons of interest" and "active situations." It doesn't know about health, quality of life, a sudden change in a neighborhood. It doesn't know itself as anything other than a target.

It doesn't need to be like that. The psychology of a smart city, how it integrates its multiple perceptions, what it can think about, how it chooses what to do and why, all of that is up to us. A smart city is just an incredibly complex machine we live in and to which we give life. We could build it to have a sense of itself and of its inhabitants, to perceive needs and be constantly trying to help. A city whose mind, vaguely and perhaps unconsciously intuited behind its ubiquitous and thus invisible cameras, we find comforting. A sane mind.

Right now we're building cities that see the world mostly in terms of cars and terrorism threats. A mind that sees everything and puts together very little except when something scares it, for which personal emergencies are almost entirely your own affair, but which becomes single-minded when there's a hunt.

That's not a sane mind, and we're planning to live in a physical environment controlled by it.

When the world is the ad

Data-driven algorithms are effective not because of what they know, but as a function of what they don't. From a mathematical point of view, Internet advertising isn't about putting ads on pages or crafting seemingly neutral content. There's just the input — some change to the world you pay somebody or something to make — and the output — a change in somebody's likelihood of purchasing a given product or voting for somebody. The concept of multitouch attribution, the attempt to understand how multiple contacts with different ads influenced some action, is a step in the right direction, but it's still driven by a cosmology that sees ads as little gems of influence embedded in a larger universe that you can't change.
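The difference between last-touch and multitouch attribution mentioned above can be sketched in a few lines. This is a toy illustration, not any real attribution system; the journey, channel names, and crediting rules are invented for the example.

```python
# Toy attribution models over a "journey" of ad contacts ending in a
# conversion. Channel names are hypothetical; real systems fit these
# credit assignments from data rather than fixing them by rule.

def last_touch(journey):
    """Give all conversion credit to the final contact."""
    return {journey[-1]: 1.0}

def linear_multitouch(journey):
    """Split conversion credit evenly across every contact."""
    credit = 1.0 / len(journey)
    out = {}
    for touch in journey:
        out[touch] = out.get(touch, 0.0) + credit
    return out

journey = ["search_ad", "social_ad", "email", "search_ad"]
print(last_touch(journey))        # all credit to the last contact
print(linear_multitouch(journey)) # search_ad 0.5, others 0.25 each
```

Even the multitouch version still treats each contact as a discrete "gem of influence," which is exactly the cosmology the paragraph above says is becoming obsolete.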

That's no longer true. The Internet isn't primarily a medium in the sense of something that is between. It's a medium in that we live inside it. It's the atmosphere through which the sound waves of information, feelings, and money flow. It's the spacetime through which the gravity waves from some piece of code shifting from data center to data center according to some post-geographical search for efficiency reach your car to suggest a route. And, in the opposite direction, it's how physical measurements of your location, activities, even physiological state are captured, shared, and reused in ways that are increasingly difficult to know about, much less be aware of during our daily lives. Transparency of action often equals, and is used to achieve, opacity to oversight.

Everything we experience impacts our behavior, and each day more of what we experience is controlled, optimized, configured, personalized — pick your word — by companies desperately looking for a business model or methodically searching for their next billion dollars or ten.

Consider as a harbinger of the future that most traditional of companies, Facebook, a space so embedded in our culture that people older than credit cards (1950, Diners Club) use it without wonder. Amid the constant experimentation with the willingly shared content of our lives that is the company, it ran an experiment attempting to deliberately influence the mood of its users by changing the order of what they read. The ethics of that experiment are important to discuss now and irrelevant to what will happen next, because the business implications are too obvious not to be exploited: some products and services are acquired preferentially by people in a certain mood, and it might be easier to change the mood of an already promising or tested customer than to find a new one.
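The mechanism behind that kind of experiment is almost trivially simple once each item carries a sentiment score. A minimal sketch, with invented posts and scores (real systems would infer the scores from the content itself):

```python
# Reorder a feed so a target mood dominates the top, assuming each
# item already carries a per-item sentiment score in [-1, 1].
# Posts and scores here are invented for illustration.

feed = [
    {"post": "vacation photos", "sentiment": 0.8},
    {"post": "bad news story", "sentiment": -0.6},
    {"post": "cute pet video", "sentiment": 0.9},
]

def reorder_for_mood(feed, target="positive"):
    """Sort posts so the target mood appears first in the feed."""
    sign = -1 if target == "positive" else 1
    return sorted(feed, key=lambda item: sign * item["sentiment"])

for item in reorder_for_mood(feed):
    print(item["post"])  # most positive posts first
```

Nothing is added or removed from the feed; only the order changes, which is what made the original experiment so hard for users to notice.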

If nostalgia makes you buy music, why wait until you feel nostalgic to show you an ad, when I can make sure you encounter mentions of places and activities from your childhood? A weapons company (or a law-and-order political candidate) will pay to place their ad next to a crime story, but if they pay more they can also make sure the articles you read before that, just their titles as you scroll down, are also scary ones, regardless of topic. Scary, that is, specifically for you. And knowledge can work just as well, and just as subtly: tracking everything you read, and adapting the text here and there, seemingly separate sources of information will give you "A" and "B," close enough for you to remember them when a third one offers to sell you "C." It's not a new trick, but with ubiquitous transparent personalization and a pervasive infrastructure allowing companies to bid for the right to change pretty much all you read and see, it will be even more effective.

It won't be (just) ads, and it won't be (just) content marketing. The main business model of the consumer-facing internet is to change what its users consume, and when it comes down to what can and will be leveraged to do it, the answer is of course all of it.

Along the way, advertising will once again drag into widespread commercial application, as well as public awareness, areas of mathematics and technology currently confined to more specialized fields. Advertisers mostly see us — because their data systems have been built to see us — as black boxes with tagged attributes (age, searches, location). Collect enough black boxes and enough attributes, and blind machine learning can find a lot of patterns. What they have barely begun to do is open up those black boxes to model the underlying process, the illogical logic by which we process our social and physical environment so we can figure out what to do, where to go, what to buy. Complete understanding is something best left to lovers and mystics, but every qualitative change in our scalable, algorithmic understanding of human behavior under complex patterns of stimuli will be worth billions in the next iteration of this arms race.
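The "black boxes with tagged attributes" view can be made concrete with a deliberately naive frequency model: each user is nothing but a bag of tags, and the system learns which tags co-occur with a purchase. The users, tags, and outcomes below are entirely invented.

```python
# Naive "blind pattern finding" over tagged black boxes: for each tag,
# compute the rate at which users carrying it went on to purchase.
# All users, tags, and outcomes are hypothetical toy data.

from collections import Counter

users = [
    ({"age:30s", "searched:guitars", "city:A"}, True),
    ({"age:20s", "searched:guitars", "city:B"}, True),
    ({"age:30s", "searched:news", "city:A"}, False),
    ({"age:40s", "searched:news", "city:B"}, False),
]

def tag_purchase_rate(users):
    """Fraction of users carrying each tag who purchased."""
    seen, bought = Counter(), Counter()
    for tags, purchased in users:
        for tag in tags:
            seen[tag] += 1
            if purchased:
                bought[tag] += 1
    return {tag: bought[tag] / seen[tag] for tag in seen}

rates = tag_purchase_rate(users)
print(rates["searched:guitars"])  # 1.0: every guitar searcher bought
print(rates["searched:news"])     # 0.0
```

The model never asks why a tag predicts a purchase; that "why," the process inside the box, is exactly what the paragraph above says has barely begun to be modeled.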

Business practices will change as well, if only as a deepening of current tendencies. Where advertisers now bid for space on a page or a video slot, they will be bidding for the reader-specific emotional resonance of an article somebody just clicked on, the presence of a given item in a background picture, or the location and value of an item in an Augmented Reality game ("how much to put a difficult-to-catch Pokémon just next to my Starbucks for this person, who I know has been out in the cold today long enough for me to believe they'd like a hot beverage?"). Everything that's controlled by software can be bid upon by other software for a third party's commercial purposes. Not much isn't, and very little won't be.
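The bidding itself is no different whether the slot is a banner, a background item, or a game object: one auction per software-controlled decision. A sketch of the common second-price design, with invented bidders and amounts:

```python
# Toy second-price auction over a software-controlled "slot": the
# highest bidder wins but pays the second-highest bid. Bidder names
# and amounts are invented; real exchanges add floors, fees, and
# quality scores on top of this core rule.

def second_price_auction(bids):
    """Return (winner, price) under second-price rules."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, top_bid = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else top_bid
    return winner, price

bids = {"coffee_chain": 0.42, "game_studio": 0.35, "news_site": 0.20}
winner, price = second_price_auction(bids)
print(winner, price)  # coffee_chain wins, pays 0.35
```

Second-price auctions are popular in ad exchanges because bidders have no incentive to shade their bids below their true value, which keeps the market for each slot liquid.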

The cumulative logic of technological development, one in which printed flyers coexist with personalized online ads, promises the survival of what we might call by then overt algorithmic advertising. It won't be a world with no ads, but one in which a lot of what you perceive is tweaked and optimized so its collective effect, whether perceived or not, is intended to work as one.

We can hypothesize a subliminally but significantly more coherent phenomenological experience of the world — our cities, friendships, jobs, art — a more encompassing and dynamic version of the "opinion bubbles" social networks often build (in their defense, only magnifying algorithmically the bubbles we had already built with our own choices of friends and activities). On the other hand, happy people aren't always the best customers, so transforming the world into a subliminal marketing platform might end up not being very pleasant, even before considering the impact on our societies of leveraging this kind of ubiquitous, personalized, largely subliminal button-pushing for political purposes.

In any case, it's a race in and for the background, and one that has already started.