Category Archives: Fiction

Digital anamnesis and other crimes [short story]

A database is a tool for forgetting; it decays on command, faster than nightmares and more thoroughly than graves.

That's why you copied your employers' databases the day the first Chinese aid workers flew in. All talk was of reconstructing India, the unburied bodies blamed on nothing worse than thirst and hunger, the burned villages on the brutish heat of the young century's worst drought. Unrelentingly generous as they had been with their carbon, Western countries had turned misers of their own heat-parched grain. Never mind: the proud tale of every news site on this side of the country's firewalls was that the disappearing monsoon had killed the crops but raised the people's courage and selfless unity.

You remember different. Relatively safe and well-fed as you were, your hands were still bloodied with other men's more direct sins, as you trained the AIs that sanitized everything in the net suddenly made pliable by martial law. Under the prurient instincts you had burned into your software agents, gone were the forum threads spiraling down into hysterical rants about secret food hoards, the gruesome pictures of young men proudly posing with corpses of barely recognizable gender and age, even phone GPS tracks and drone footage. You saw enough during the training process to keep you from unbroken sleep for the rest of your life, and you were infinitely thankful that your software handled the rest without you having to look. The irony was lost on you, even as you trained other programs — in servers unattended by distracted engineers in panic over the crimes their loved ones might be suffering or committing, some of them far from innocent themselves — to weave together the same information you had erased elsewhere, turning anonymous mobs into lists of names through the face recognition algorithms used in street advertising, each list linked to the name of a person who had been near a geotagged peak of hate and never logged in again. Lynchings linked to conspiracy forums, connected to briefly viral videos, and from there to commercial satellite pictures horrifying once given the slightest context. A giant wall of data points crisscrossed by hundreds of thousands of blood-red lines; more than a human could ever put together or hold in their mind, each death crisp and unique, without breaking heart and mind.

When your program dumped its indictments online, it was the first time so many individuals had been accused of so many atrocities in so much detail. The first selfie taken by a crazed mob, each face tagged with a name and a history of casual hate turned bloody by fear. The press, built for an age where the resolution of guilt was limited by human patience and the human eye, did not know what to do with it, so they did very little.

Stealing the database had been a crime; there had been no ready-made legal category for what you had done afterwards, the part that had made everybody uneasy in ways they couldn't define, but they made one up and charged you with it as well.

You haven't had online access since then, and likely never will. They've told you what you did was one of the top stories for a week, and then disappeared without impact at the beginning of the season's Amazon dust storms. You hope they're lying, just another facet of the casual, unrecorded torture.

Short story: "On Agile Management as a Mechanism of Social Control"

No, for God's sake, it's not the Turing Police. They don't look for superhuman AIs, much less the sort that hires teams for convoluted transcension heists. I know that's what you wish we were working on, but they don't think they are anywhere close to being feasible, and they would know.

What they try to anticipate is technology that would put somebody out of the reach of governments without having to pay for anybody's campaign. Terrorists and corporations, in theory, but, come on, as if Buzzfeed didn't have better tech people than ISIS. It's corporations they are worried about. From what I've heard, no, don't ask me where, it's mostly military technologies like whatever might be the nanotech version of a nuke, real versions of the memetics snake oil Cambridge Analytica was selling, encryption-killing number theory breakthroughs, that sort of thing. Supervillain stuff.

Don't worry, they don't go around killing every half-competent programmer. They don't even have to hack into networks to sabotage them, so IT sec doesn't help. They just blackmail key people into floundering about, wasting time. Have you ever met a tech person you didn't suspect of hobbies they'd rather not talk about?

Listen. Point is, I'm not saying I'm sure our PM is deliberately sabotaging our project. I'm just saying I'm not finding that old declassified OSS sabotage manual as funny as I used to. The entire industry can't be this bad at deadlines, can we? And anyway, I think we should just roll with it. The money will always be there, this bubble or the next one, and I know I said they don't kill people very often, but maybe that's because they rarely have to, not because they won't.

And please, really, don't call them the Turing Police. That's one of the keywords they monitor.

(Automated transcript. Audio source: passive Smart TV listening network. Flagged for human review.)

Flash fic: Posthumous

The video of my murder had been viral for days. I could understand the initial error, as it was an uncommonly realistic fake; I preferred not to think about how enthusiastically people viewed and shared it even after the news had been debunked. I had watched it myself often enough to make its key moments too familiar to pay attention to: the man in the dark shirt drawing his gun, the frozen security guard, the sudden spot of blood on a green dress.

So when I saw a man in a black shirt with a gun in his hand I felt no fear but incredulous deja vu. He wasn't as smooth as the real one had been in the synthetic video, but the security guard near us was even slower, his hand momentarily caught in the muscle memory of that other cop.

Nobody in the lobby, as the man took agonizingly awkward seconds to aim, looked surprised or tried to stop him.

I know how you feel, I thought without resentment. I wasn't running, because I hadn't. I knew the spell would only last a few seconds more, and that it would be enough to get me killed.

In a final moment of satisfaction, I realized without looking that I was wearing a green dress.


Short story: Nanobots and the Teenage Brain

It took a while to diagnose Charlie's problems; what thirteen-year-old boy isn't moody? But once his parents suspected there was something else going on inside his head, doctors injected a swarm of machines so small they were practically very large drugs, and the machines showed them that, to Charlie's annoyance, his parents had been right.

Brains are like ecosystems, Charlie's doctor explained to him and his parents. Every part of Charlie's brain works, but the way they synchronize and work together isn't the way we would prefer. The system is in a balance of sorts; it's just that it's a balance that results in things like mood swings and insomnia.

The doctor hadn't mentioned the nightmares, but Charlie suspected he knew how bad they were, even if he hadn't told him, or anybody else, how much the night scared him. Probably the machines inside his head had told the doctor the truth. Charlie didn't have insomnia, he just tried not to sleep.

What do we do then? had asked Charlie's mother. Give him medication? His uncle used to take antidepressants.

The doctor had nodded. That's what we would have tried a few years ago, but it takes quite a bit of trial and error, and even once you find something that works, there are usually side effects. Almost always the side effects are minor compared with the original symptoms, and you can tweak the dosage and sometimes eventually cease the medication, but today we have better tools. We already have nanobots lodged in key areas of Charlie's brain. We are using them to diagnose him, but they can be partially rebuilt to integrate themselves with his brain functions.

Charlie's father, who had seen a lot of horror movies as a teenager, frowned. You mean use a computer to control his emotions?

The doctor smiled. Oh, no. It's like adding a carefully chosen new species into an ecosystem. It will interact with the rest of his brain, send a signal there, dampen a neurotransmitter here, and Charlie's brain will adapt slightly to it while the machines adapt very strongly to him. The end result will be a healthier and more resilient brain, but not a different one, and certainly not one under anybody's control but his own. We are only beginning to try it in humans, but we can monitor it very closely and stop if anything looks wrong, so in a sense it's safer than the usual medication.

Using machines they could instantly switch off sounded like a safer option than trying medications until they found something that worked, so they modified the nanobots in Charlie's brain to make them able to talk to it as well as listen.

The brain talked and listened to itself, and now itself included both the machines and the software controlling them from a small chip in Charlie's skull. The chip learned from Charlie's brain, Charlie's brain learned from the chip, and eventually they were just Charlie.

The mood swings and the nightmares went away. The chip didn't change Charlie, and nobody hacked it. This isn't that kind of story.

* * *

There's a thirteen-year-old child waking up from a nightmare, crying. But this is three years later, and she's called Grace.

* * *

Charlie's parents wanted to refuse. Would've, certainly, even to the doctor who had healed Charlie. But he had shown them videos of Grace, and although everybody agreed that it had been a low trick, it was enough to make the parents agree to leave the choice to Charlie.

She has the same sort of device you have, the doctor told Charlie. The device works well; yours too, by the way, you know I'll get an alert if anything goes wrong. But the device needs to learn from the brain how to help it, and for some reason it's not able to learn from Grace's. We think her condition is somewhat different from yours, like the same riddle in a different accent, and the device isn't picking it up.

So what do you want me to do? asked Charlie. He wasn't a bad guy, but he didn't want to go to a hospital again, ever.

The doctor told him his plan. It was much worse than what Charlie had feared. Maybe that's why Charlie said he would do it, the way sixteen-year-olds say 'yes' to whatever really scares them.

* * *

Grace and Charlie didn't lie on parallel operating tables, thick cables connecting their skulls. They sat in comfy chairs next to each other while both sets of parents watched. The doctor was telling them again how they had temporarily reprogrammed the chips in their skulls so Charlie's chip would control Grace's nanobots and the other way around, but that was mostly to fill the silence while he monitored everything.

Not that the parents paid much attention anyway. Charlie's were too worried about something going wrong, and Grace's were crying softly.

For the first time in a long while Grace had fallen asleep smiling.

* * *

It took seven sessions for Charlie to train Grace's device. At the end they were close strangers, people with nothing in common except a very important thing much too big to base a friendship on. But she was thirteen, so she had given him a nickname anyway.

Why does she call you that? asked the doctor after the last session, at a time when he and Charlie were briefly alone.

You know, said Charlie, rolling his eyes, like the guy from the movies. The one who can read minds.

The doctor, who had liked the character about two franchise reboots before, smiled. Well, your brain can do something nobody else's can, and you helped her, so she's not entirely wrong about that.

By the way Charlie looked at him while pretending to find him ridiculous, the doctor knew that he would agree to help if he ever asked again.

* * *

He did ask, four times. It turned out Charlie's success had been less likely than they had thought, and his brain's talent to train the device a rare one. Charlie was always enthusiastic to help, and Charlie's parents eventually made peace with it, not without fear, but also not without pride.

The doctor finally stopped asking for his help once the company designed new machines that could learn from any brain; they had figured out how to do that by watching Charlie help others, and in that sense he would always be helping. Charlie had shrugged when told, relieved but hating himself a bit for it.

He kept in touch with the doctor. They never mentioned the returning nightmares. Charlie had known his own well enough to understand they weren't his to begin with; his brain had learned them from the other kids' devices, which had learned them from their brains.

They talked about everything else, mostly about the people helped by the software they had built based on Charlie's brain, pretending it was a coincidence that the doctor always called the morning after a bad nightmare. He was still monitoring Charlie's device, after all.

Charlie hates the nightmares, and feels bad about never telling his parents about them. But if he had told them they wouldn't have let him help. Keeping secrets had been a necessary part of being a superhero, and if he woke up in a cold sweat more often than not... Most retired heroes had scars, and he had earned his helping others.

And he's no longer afraid of the night.

Short Story: The Voice of Things

She had liked the illustrated book so much that she told you right away she had prayed to get it for Christmas, alone in her bedroom where nobody but God could hear. You didn't mention her teddy bear had probably heard her and the toy company then sold the information to an advertiser who had offered you the book with an extraordinary discount. If she was happy, that was what mattered.

You never realized the bear sometimes talked back, not until the scandal made the news. It turned out it always could, it had just waited until its sensors told it kid and toy were alone. The license that came with the bear's software made this "user bonding" legal; the company went bankrupt anyway.

But nothing's ever forgotten if there's money in remembering, and sometimes you're almost sure things talk to your daughter not with their standard voices, but with one she remembers and trusts.

So you talked to her about cookies and the cloud, at least what you understand of it. She nodded along to your explanation, unsure, asking nothing. Afterwards, you wondered what things would tell her when she asked them.

Short story: Soul in the Loop

Every shower she takes makes you more certain she will have killed herself before her daughter's tenth birthday, and these days she's taking one every time she logs off. You aren't allowed to tell her, but the NDA you made her sign has so many post-employment clauses she wouldn't be likely to find a job elsewhere anyway.

A daughter, two parents with Alzheimer's, and the obsolete skillset of a radiologist and former e-sports semi-pro: she's as good a match for this job as any human could be, and only humans are allowed to do it. That's the point.

The politics of deploying killer robots require humans watching what they do to prevent them from doing the a posteriori unacceptable, but the business side of the equation — and somewhere in the company's software stack there's a piece of mathematics modeling just that — compels humans to barely if ever stop the robots from taking the shot. Armies don't pay for robots that don't shoot. The Oxford Protocol supervisors are there to ensure they could, theoretically, be stopped, and to suffer the legal consequences if it becomes convenient for somebody to.

So she logs in ten hours a day to watch the death of people she could have saved — people who might or might not be innocents, people whose names she'll never know — at the cost of risking homelessness for herself, her daughter, and two helpless people who once raised her and whom she still loves. She never stops a robot. She just takes a shower immediately after every session, the company's contract-mandated monitoring of her home network logging it as another data point in her profile.

The company's behavioral prediction models indicate that compulsive showering correlates with late-stage burnout, which means you should start choosing a replacement for her from the vast and growing pool of the economically deprecated. Some of them would last longer than others, and some would actually enjoy their jobs. You always pick the ones who don't, the ones who eventually need a shower every time they log off, and sometime after that require a replacement of their own.

You understand the business case for this company policy, yet find it ironic that you would be barred from doing the job you choose people for. But it's not like you don't enjoy your own.


Short story: Logs from a haunted heart

She's scared all the time. But is her fear the reason why her heart suddenly speeds up a dozen times a day, shifting in a second from the dull ticking of dread into the accelerating staccato of runaway panic? The diagnostics in her pacemaker's app say that everything is normal, but perhaps they can be faked by somebody with maintenance access to the device. She doesn't have it, she's only the patient.

Maybe her ex-husband, a medical tech sales rep, does. Too many things have default passwords companies never bother to change. But there'd be no point in talking with him, even if she hadn't moved across the country to avoid ever having to. In an emergency room they'd just look at the same app she has, and she can't get an appointment with a specialist before next month.

Tomorrow is the one year anniversary of the day she told her husband she was leaving.

She's scared. Maybe that's what makes her chest feel like it's going to break.


Original Fic: The Gift of Memory

Not the kind of story I usually post here, but I don't just write dread-infused, mostly-dystopian sci-fi, you know?

In your dreams the world is full of marvels, love, safety. You're immortal and beautiful, and reality, charmed, dances with your thoughts.

In your nightmares the Universe's laws are poisoned, malignant, infected by something else. Something that shouldn't be there, is. Something that hates, haunts, hungers for you.

In your waking you forget they are memories.

We could've taken them with the power and the beauty and the everlasting life, but we enjoy reliving the endless night of our victory when we sucked the world dry and left it the ruined husk it is now. We left you the memories and the sadness, but not the knowledge. At times, in the satiety after other victories among the unperceived rubble of other worlds, it gives us an extra bit of joy.


Short story: Nice girl falls in love with vampire boy. Of course he kills her

(In honor of World Dracula Day)

Nice girl falls in love with vampire boy. Of course he kills her. Did she want him to? Did she understand his hunger wasn't metaphorical? Let's not assume innocence.

Perhaps between man and monster she chose the safer one. Better to know where you stand. Even if there is no such thing as turning; you are born a vampire or you die to feed one. What predator recruits from the herd? Curses are arbitrary, ecosystems have to make sense.

Let's not assume authorial motivation for the story. Identity. Species. Beautiful monsters don't need to dream about being loved, but they can regret not having been able to be otherwise than they are.

They can imagine a world where nice girl falls in love with vampire boy and survives. Innocent meals and sunlit warmth. Otherwise - otherwise she would have to share his night , his murders, his table. Know the taste of her people in his lips and her tongue. Die of guilt or embrace the hunt.

Let's not assume her niceness was more than gesture-deep. Maybe the monster's appeal wasn't his beauty. Maybe she first kissed him in search of that flavor.

Let's not assume monsters can always tell their own. One can regret losing somebody who was never there. Maybe she laughs as she reads your tales, at who you thought she was. At the future you thought you both wanted and could have.

Let's not assume her laugh doesn't hurt you, or that you don't love her for that.

Short story: The Associate

I seldom know who's paying me or what they do; only my few friends lucky enough to have jobs do. My phone will buzz, and if I bid low enough I'll get to do things that will feel like isolated musical notes, meaningless on their own, in places that sometimes will appear later in the news in ways I won't be able to relate to my own actions but also won't try to.

A wordless feeling will keep me from adding to the pain and outrage of the comment threads, but the daily rent payments sometimes don't leave me enough for food, so I'm always hoping my phone will buzz with a new incomprehensible gig, and when it does I always bid low.

The Children of the Dead City

Dusk is coming and walking at night is no longer allowed, but the children still loiter near the black windowless building that looks like a tombstone for a giant or a town. A year ago most of their parents worked there, their hands the AI-controlled manipulators of the self-managed warehouse, but since then artificial hands have become good enough, and no more than a dozen humans tarnish the algorithmic purity of the logistics hub.

With so many residents unemployed, the town can no longer afford the software usage licenses that keep the smart city infrastructure working. Traffic lights cycle blindly without regard for people or cars. Medical help has to be called for manually, phones and buildings callously ignoring emergencies and uninterested in saving lives.

No unblinking mind watches over children on the streets. Something does, something nameless and uncaring, and parents have tried to explain that it's just an analytics company the town is selling the video feeds to, but they also tell them to be home early, and fret over their health more than before.

Like every physically vulnerable life form, children know when they are being lied to. They also know when a place is haunted.

Night has fallen, and the children finally leave the familiar presence of the warehouse's continuously thinking walls. The walk back home is scary and thrilling, the well-lighted streets only increasing the menace from the once soothing eyes on every pole and wall. The children move in packs, wordlessly alert, but some must walk alone to houses out of the way.

Not all of the children arrive on time. When apprehensive parents eventually go out searching for them, asking the city in vain for help, not all are found. A camera last saw them, a neural network recognized them, a database holds the memory. But the city is silent.

For a while no child walks unaccompanied, yet that cannot last forever, and the black monolith keeps calling to them with the familiar warmth of a place where everything sees, and thinks, and cares.

"Tactical Awareness" in Spanish

Esteban Flamini did something I hadn't imagined was possible: he translated TACTICAL AWARENESS into Spanish while preserving both the plots of the stories and the word count. His translation, like the original text, can be downloaded for free from his site.

Even if the stories don't interest you, or if you've already read them, Esteban's version is worth reading, if only to appreciate a genuinely difficult translation done extremely well.

Short story: The Eater of Silicon Sins

His job is not to press the button. When he fails at his job, people don't die.

There used to be support groups for people like him, groups he wasn't supposed to attend but did anyway. They were for the people who worked with the most awful images the human mind could conceive, videos of violence and sexual abuse beyond any quaint nightmares they might have had before, flagging them so the psychological damage of seeing those videos — and knowing those things were happening at that very moment to some terrified person inarticulate with pain — would remain contained inside their own minds. They could barely afford food on gig economy rates, much less therapy, so they met online to not talk about what they couldn't, and half-heartedly, not often successfully, prevent each other from killing themselves.

He would go to those groups to seek some simulacrum of health in their shared illness, yet there would always be a barrier between him and everybody else. What he sees every day isn't the crisp video of a carefully recorded personal hell, but the blurry real-time monitoring feed of a superhumanly fast combat robot moving, targeting, and shooting quicker than any human could. It would be impossible for him to decide faster and better than the robot which of the moving figures are enemy combatants, children trying to run from a war without fronts, or both.

So he never presses the button, and prays every night beyond statistical hope to have never let a terrified innocent die.

The groups went away when computers became better than humans at filtering out that kind of material, but he knows he will never be replaced. No matter how good the robots get, how superhumanly quick and accurate their autonomous reactions, there'll still be innocents dead whenever they are used for what they were built for; not because the technology is flawed, but because that's the tactically optimal tradeoff they've been configured for. His job is to take the blame, and only a human can do that.

He doesn't drink, nor take pills, nor beat his wife. He has no dangerous hobbies. He does his duty like any good soldier would do.

In his dreams he sees himself on a screen, his face framed by a targeting solution. The image stays still for an impossibly long time, yet he never presses the button.


Short story: Dead Man's Trigger

My name is Rob, short for Roberta. I'm a private investigator, which means I'm good enough with social networks to do what the police do, just without the automated subpoenas and the retroactively legal hacking. It's not difficult, really. Nine times out of ten the obvious suspect did it. The bereaved know who did it, acquaintances know who did it, even the police know who did it.

So ten times out of ten I'm hired when the police pretend not to know who did it, when a judge pretends not to believe them, or when a jury pretends they've got reasonable doubt. I'm never hired to figure out who did it, despite the pretenses the client and I go through. I'm not even hired to find proof. I'm hired because once I've found, again, what everybody knew, and collected the proof they didn't need, I give them a burner email address.

They hire me for that email address. I don't like it, but I don't dislike it enough not to give it to them. It's my business to give the address, not what they do with it.

I can pretend not to know just as well as cops, judges, and juries do, but I can't lie to myself, not about this. Content sent to those addresses usually goes viral. Which by itself would be a weak form of revenge: The crimes the police decide not to solve, judges not to take to trial, and juries not to punish, are the kinds of crime many people cheer the criminal for. Shooting the "right" kind of person, more often than not. (My boyfriend was the right kind of person. Serious, sad, brilliant John. Did he know how he'd die when he wrote this program?)

But the evidence doesn't just go viral, it infects the right sort of group. I don't use the word metaphorically, or at least not much. I don't know who those people are, but I'm sure they aren't always the same. Depends on the crime, on the victim, and on tides I don't visit the right forums to feel the shifting of. I'm glad of that, for my sanity's sake. (John had to, if nothing else, to teach the program to seek them. I didn't know him well, it turns out, while he knew exactly what I would and wouldn't do. I only get email addresses sent to me. Nothing more.)

I don't tell myself that the deaths that follow are coincidence. I don't dwell on how they are not. I sleep reasonably well.

I've stopped missing John.


Rush Hour

Three minutes ago you were in a traffic jam, one of dozens of drivers impatiently waiting for their cars to reboot and shake off whatever piece of malware had infected them through the city network. Now you're moving.

You're moving very, very fast. You can see every car ahead of you moving aside as if by magic, either on their own or pushed by another, their drivers as surprised as you are.

A few other cars both ahead and behind are moving just as fast as yours. They are all big ones. There's a certain, important building a few blocks ahead and a handful of seconds away.

You understand where the cars are accelerating towards and what for.

You don't scream until the car in front of you crashes through the wall.


Safe Travels

The almost absolute lack of TSA security measures in "your" queue is both insult and carrot, but as long as they still feel the need to offer a carrot things aren't really that bad. You mostly try to believe this when your son is looking at you with the relaxed smile of the unscared. It makes it easier to smile back.

Boarding is unnervingly fast, the plane small and old, the uniform rows of dark skins and headscarves an insult, the lack of angry whispers a carrot. You try to focus on your son, who's excited about his first flight although pretending not to. You think, and hope, he doesn't notice how everybody in the plane resembles his own family, or that he doesn't think they do — that he thinks skin and dress less important than the way some kids like soccer and some prefer VR games.

Believing this would make him a good man. Trusting that everybody does could get him lynched one day. For now, he sees neither carrots nor insults here, just a small window, the ground falling, and then the sky.

It breaks your heart as much as it lifts it, but when he looks at you again you'll be waiting with a smile. And later de-boarding will be quick and your terminal will be small and somehow quaint, and you know one day you'll have to talk with him about such things, but for now you just look at his breathless expression reflected on the plane window, and tell yourself it isn't selfish to wish for you both just a little bit more of sky.


"Prior art" is just a fancy term for "too slow lawyering up"

They used to send a legal ultimatum before it happened. Now you just wake up one day and everything green is dead, because the plants are biotech and counter-hacking is a legal response to intellectual property theft, even if the genes in question are older than the country that granted the patent.

My daughter isn't looking at the rotting remains of her flower garden. Her eyes are locked into mine, with the intensity of a child too young not to take the world seriously. Are we going to jail?

No, I say, and smile. They only go personally after the big ones; for small people like us this destruction suffices.

She nods. Am I going to die?

I kneel and hug her. No, of course not, I say, with every bit of certainty I can muster. There's nothing patented in you, I want to add, but she's old enough to know that'd be a lie.

I feel her chest move and I realize she had been holding her breath. We stay together, just breathing. The air is filled with legal pathogens looking for illegal things to kill.


The Man Who Was Made A People

Gregory has two million evil twins. None of them is a person, but why would anybody care?

They are everywhere except in the world. They search the web, click on ads, make purchases, create profiles, favorite things, post comments. Being bots, they don't sleep or work; they do nothing but what they were programmed to do, hidden deep in some endless pool of stolen computing power they have been planted in like dragon's teeth.

They are him. Their profiles carry his name, his location, his interests, or variations close enough to be indistinguishable to even the most primitive algorithm. The pictures posted by the bots are all of men very similar to Gregory in skin tone, clothes, cellphone, car. And he knows they are watching him, because when he changes how he looks, they change as well.

They are evil. Most of their online activities are subtle mirrors of his own, but some deal with topics and people that most find abhorrent, and none more than himself. Violence, depravity, every form of hate and crime, and — worst of all — every statistically known omen of future violence and crime.

Driven by the blind genius of predictive algorithms, sites show Gregory increasingly dark things to look at and buy, and suggest friendships with unbalanced bigots of every kind. His credit score has crumbled. Journalism gigs are becoming scarce. Cops scowl as they follow him with eyes covered by smart glasses, one hand on their guns and the other on their radios. He no longer bothers to check his dating profile; the messages he gets are more disturbing than the replies he no longer sends.

He has begun to go out less, to use the web through anonymizing services, to take whatever tranquilizers he can afford. All of those are suspicious activities on their own, he knows, but what choice does he have? He spends his nights trying to figure out who or what he offended enough to have this all-too-real curse laid upon him. The list of possibilities is too large (what journalist's isn't?), and he's not desperate enough to convince himself there's any point to seeking forgiveness. He's scared that one day he might be.

Gregory knows how this ends. He has begun to click on links he wouldn't have. Some of the searches are his. Every night he talks himself out of buying a gun. So far.

He has begun to feel there are two million of him.


The Girl and the Forest

The girl is crossing a frontier that exists only in databases. Her phone whispers frantically in her ear: crossing such a frontier triggers not a low-priority notification, but the digital panic merited by a lethal navigational mishap. Cross a line between two indistinguishable plots of land and you become the legitimate target of automated guns, or an illegal person to be sent to a private working prison, or any number of other fates perhaps but not certainly worse than what you were leaving behind.

The frontier the girl is crossing separates a water-poor region from a barren desert, the invisible line a temporary marker of the ever-faster retreat of agricultural mankind. The region reacts to unwanted strangers with fewer robots but as much heavily armed dedication as any of the richer ones. But the girl is walking into the desert, and there are no defensive systems on her way. There is just the dead sand.

She doesn't carry enough water or food to get her to the other side.

* * *

The girl went to a hidden net site a friend had shared with her, with the electronic whispers and half-incredulous sniggers other generations had reserved for the mysteries of sex. Sex wasn't much of a mystery to their generation, who had seen everything long before it could be understood with anything except the mind. But they had never been in a forest, and almost none of them ever would. They traded pictures and descriptions of how the desert looked before it was a desert, and tried to imagine the smell of a thousand acres of shadowed damp earth. It was a fad, for most. A phase. Youth and nostalgia are mutually incompatible states.

Yet for some their dreams of forests endured: they had uncovered something, a secret, found because they weren't welcome in the important matters reserved for grownups. Inside the long-abandoned monitoring network their parents' generation had used to attempt to manage the retreating forest, some of the sensors were still alive. Most of them were repeating a monotonous prayer of heat and sand to creators too ashamed of their failure to let themselves look back.

But some of the sensors chanted of water, and shadow, and biomass. The girl had seen the data in her phone, and half-felt a breeze of leaves and bark. What if satellite pictures showed a canyon that, yes, could be safe from the soil-stealing wind, but was as barren as everything else? What of that?

The girl thought of her parents, and of the child she had promised herself she wouldn't give to the barren earth, and with guilt that didn't slow her down, she took the least amount of water she thought would be enough to get her to the canyon, and went into the desert.

The dull sleepless intelligences inside the border cameras saw her leave, but would only alert a human if they saw her walking back.

* * *

The girl will barely reach the canyon, half-dying, clinging to her last bit of water as a talisman. There will be no forest there, nothing in the canyon but dry sand. But in the small caves between the rocks, where the geometry of stones has built small enclosed worlds of darkness, she'll find ugly, malevolently tenacious, and very much alive mushrooms, and around them the clothes of those who will have reached the canyon before her. Most of their clothes will be of her size.

The girl will understand. She won't drink the last of her water, but give it to the mushrooms. Then she will lie down and close her eyes, and fall asleep in the shadow, surrounded by a forest at last.


Asking a Shadow to Dance

Isomorphic is a mathematical term: it means of the same shape. This is a lie.

Every morning you wake up in the apartment you might have bought if you hadn't been married (but you were, and those identical apartments are not the same). Your car takes you through the same route you would have taken, to an office where you look into the blankness of a camera and the camera looks back. You see nothing. The camera sees the pattern of blood vessels on the back of your eyes, and opens your computer for you.

The interface you see is always the same, just patterns of changing data devoid of context. Patterns that a combination of raw genetic luck and years of training has made you flawlessly adept at understanding and controlling. The pattern on your screen changes five times each second. Faster than that, you move your fingers in a precise way, the skill etched in your muscles as much as in your brain. The pattern and your fingers dance together, and the dance makes the pattern stay in a shape that has no meaning outside itself. You have received almost every commendation they can give to someone doing your job. Only the man on the other side of the table has more. You have never seen his screen, and he has never seen yours.

The inertial idiocy of that security rule is sickening in its redundancy. You couldn't know what he's doing from the data on his screen any more than you can know what you are doing from what you see in yours. Sometimes you think you're piloting a drone swarm. Sometimes you're defending an infrastructure network, or attacking one. Twice you have felt a rhythm in the patterns almost like a heart, and wondered if you were killing somebody through some medical device.

But you don't know. That's the point. Whatever you could be doing, the shape of the data on your screen would be the same, all the necessary information to control, damage, defend, or kill, but scrubbed of all meaning tying it back to the real world. Isomorphism, the instructors called it.

But that's a lie. It's not the same, and it could never be.

You begin to lose sleep. Twice the camera on your computer has to learn a new pattern for the blood behind your eyes. Your performance doesn't suffer; the parts of your mind and body that do the work are not the ones grappling with a guilt larger because it's undefined. Your nightmares are shapeless: you dream of data and wake up unable to breathe.

One day you finally allow yourself to know that the man across the table enjoys his work. Always has. You had ignored him all those years, him and everything not in the data, but now you look at him with a wordless how? He makes a gesture with his head, come and see. An isomorphism that scrubs the data not only of meaning but also of guilt.

You need it so much that you don't stop to think about the rules you're both breaking under the gaze of the security cameras. You just go around the table and look at his screen.

There's no isomorphism. There's nothing but truth, and you can neither watch nor stop watching. His fingers are dancing and his smile is joyful and he has always known what he was doing. And now you can, too.

You scramble back to your screen in blind haste. The patterns of data are innocent, you tell yourself, of everything you saw on that other screen, and so is the dance of your fingers. They just have the same shape, that's all.

You work as efficiently as ever. You wonder if you'll go crazy, and fear you won't, and know that neither act will change your shape.


At the End of the World

As the seas rose and the deserts grew, the wealthiest families and the most necessary crops moved poleward, seeking survivable summers and fertile soils. I traveled to the coast and slowly made my way towards the Equator; as a genetics engineer I was well-employed, if not one of the super-rich, but keeping our old ecosystems alive was difficult enough if you had hope, and I had lost mine a couple of degrees Celsius along the way.

I saw her one afternoon. I was staying in a cramped rentroom in a semi-flooded city that could have been anywhere. The same always nearly-collapsed infrastructure, the indistinguishable semi-flooded slums, the worldwide dull resentment and fear of everything coming from the sky: the ubiquitous flocks of drones, the frequent hurricanes, the merciless summer sun.

She seemed older than I'd have expected, her skin pale and parched, her once-black hair the color of sand. But she had an assurance that hadn't been there half a lifetime ago when we had been colleagues and roommates, and less, and more. Before we had had to choose between hope for a future together and hope for a future for the world, and had chosen... No, not wrong. But I had stopped believing we could turn the too-literal tide, and, for reasons I had suspected but never asked about, she had lost or quit her job years ago. So here we were, at the overcrowded, ever-retreating ruinous limes of our world. I was wandering, and she was riding a battery bike out of the city. I followed her on my own.

I don't know why I didn't call to her, why I followed her, or if I even wanted to catch up. But when I turned a bend on the road she was waiting for me, patient and smiling, still on her bike.

"Follow me," she said, going off the road.

I did, all the way through the barren over-exploited land, the situation dreamlike but no more than everything else.

She led me to a group of odd-looking tents, and then on foot towards one that I took to be hers. We sat on the ground, and under the light of a biolamp I saw her close and gasped.

Not in disgust. Not despite the pseudoscales on her skin, or her shrouded eyes. It wasn't beauty, but it was beautiful work, and I knew enough to suspect that the changes wouldn't stop at what I saw.

"You adapted yourself to the hot band," I said.

She smiled. "Not just me. I've been doing itinerant retroviral work all over the hot band. You wouldn't believe the demand, or how those communities thrive once health issues are minimized. I've developed gut mods for digesting whatever grows there now, better heat and cold resistance, some degree of internal osmosis to drink seawater. And they have capable people spreading and tweaking the work. They call it submitting to the world."

"This is not what we wanted to do."

"No," she said, "but it works." She paused, as if waiting for me to argue. I didn't, so she went on. "Every year it works a little better for them, for us, and a little worse for you all."

I shook my head. "And next decade? Half a century from now? You know the feedback loops aren't stopping, and we only pretend carbon storage will reach enough scale to work. This work is phenomenal, but it's only a stopgap."

"It's only a stopgap if we stop." She stood up and moved a curtain I had thought a tent wall. Behind it I saw a crib, the standard climate-controlled kind used by everybody who could afford them.

Inside the crib there was a baby girl. Her skin was covered in true scales, with three-dimensional structures that looked like multi-layer thermal insulation. Her respiration was slow and easy, and her eyes, blinking sleepily, were catlike, like those of a creature bred to avoid the sun and not miss it. I was listening with half an ear to the long list of other changes, but my eyes were fixed on the crib's controls.

They were keeping her environment artificially hot and dry. The baby's smile was too innocent to be mocking, but I wasn't.

"And a century after next century?" I said, not really asking.

"Who knows what they'll become?" I wasn't looking at her, but her voice was filled with hope.

I closed my eyes and thought of the beautiful forests and farms of the temperate areas, where my best efforts only amounted to increasingly hopeless life support. I wasn't sure how I felt about the future looking at me from the crib, but it was one.

"Tell me more."


Memory City

The city remembers you even better than I do. I have fragments of you in my memory, things I'll only forget when I die: your smell, your voice, your eyes locked on my own. But the city knows more, and I have the power to ask for those memories.

I query databases in machines across the sea, and the city guides me to a corner just in time to see somebody crossing the street. She looks just like you as she walks away. Only from that angle, but that's the angle the city told me to look from.

I sit in a coffee shop, my back to the window, and the city whispers a detached countdown into my ears. Three blocks, two, one. Somebody walks by, and the cadence of her steps is just like yours. With my eyes closed they seem to echo through the void of your absence, and they are yours.

I keep roaming the streets for pieces of you. A handful of glimpses a day. Fragments of your voice. The dress I last saw you in, through the window of a cab. They get better and more frequent, as if the city were closing on you inside some truer city made from everything it has ever sensed and stored, and its cameras and sensors sense many things, and the machines that are the city's mind remember them all.

I feel hope grow inside me. I know the insanity of what I'm doing, but knowing is less than nothing when I see more of you each day.

One night the city takes me to an alley. It's not the street where I met you, and it's a different season, but the urgency of the city's summons infects me with a foreshadowing of déjà vu.

And then I see you. You've changed your haircut, and I don't recognize your clothes, and there's something about your mouth...

But your eyes. I know those eyes. And you recognize me, of course, impossibly and unavoidably. How else to explain the frightened scream I cut short?

I have been told by engineers, people I pay to know what I don't, that the city's mind is somehow like a person's. That it learns from what it does, and does it better the next time. I don't understand how, but I know this to be so. We find you more quickly every time, and I could swear the city no longer waits for me to ask it to. Maybe it shares some of my love for you now.

Maybe you'll never be alone.


The Secret

I saved his name in our database: it vanished within seconds into a place hidden from both software traces and hardware checks. Search engines refused to index any page with his name on it, and I couldn't add it to any page in Wikipedia. A deep neural network, trained on his face almost to overfitting, was unable to tell him from a cat or a train.

I don't know how he does this, and I'm afraid of asking myself why. His name and face faded quickly from my mind. Just another computer, I guess.

But then what remainder of the algorithm of my self impossibly remembers what everything else forgets? I'm afraid of the way he can't be recorded, but I feel nothing but horror of whatever's in me that can't forget. That part is growing; tonight I can almost remember his face.

Will I become like him? Will I also slip intangible through the mathematics of the world? And will I, on that day, be able to remember myself?

I keep saving these notes, but I can't find the file.


The Rescue (repost)

The quants' update on our helmets says there's a 97% chance the valley we're flying into is the right one, based on matching satellite data with the ground images that our "missing" BigMule is supposed to be beaming at that Brazilian informação livre group. Fuck that. The valley is too good a kill-box not to be the place. The BigMule is somewhere around there, going around pretending it's not a piece of hardware built to bring supplies where roads are impossible and everything smaller than an F-35 gets kamikazed by a micro-drone, but a fucking dog that lost its GPS tracker yet oh-so-conveniently is beaming real-time video that civilians can pick up and re-stream all over the net. It shouldn't be able to do any of those things, and of course it's not.

It's the Chinese making it do it. I know it, the Sergeant knows it, the chopper pilot knows it, the Commander in Chief knows it, even probably the embedded bloggers know it. Only public opinion doesn't know it; for them it's just this big metallic dog that some son of a bitch who should get a bomb-on-sight file flag gave a cute name to, a "hero" that is "lost behind enemy lines" (god damn it, show me a single fucking line in this whole place), so we have to of course go there like idiots and "rescue" it, so the war will not lose five or six points on some god-forsaken public sentiment analysis index.

So we all pretend, but we saturate the damn valley with drones before we go in, and then we saturate it some more, and *then* we go in with the bloggers, and of course there are smart IEDs we missed anyway and so on, and we disable some and blow up some, and we lose a couple of guys but within the fucking parameters, and then some fucking Chinese hacker girl is *really* good at what she does, because the BigMule is not supposed to attack people, it's not supposed to even have the smarts to know how to do that, and suddenly it's a ton of fast as shit composites and sensors going after me and, I admit it, I could've been more fucking surgical, but I knew the guys we had just lost for this fucking robot dog rescue mission shit, so I empty everything I have on that motherfucker's main computers, and I used to help with maintenance, so by the time I run out of bullets there isn't enough in that pile of crap to send a fucking tweet, and everybody's looking at me like I just lost America every single heart and mind on the planet, live on streaming HD video, and maybe I just did, because even some of the other soldiers are looking at me cross-like.

At that very second I know, with that sudden tactical clarity that only comes after the fact, that I'm well and truly career-fucked, so I do the only thing I can think of. I kneel next to the BigMule, put my hand where people think their heads are, and pretend very hard that I'm praying; and who knows, maybe I'm scared enough that I really am. I don't know at that moment what will happen; I'm half-certain I might just get shot by one of our guys. But what do you know, the Sergeant has mercy on me, or maybe the praying works, but she joins me, and then most of us soldiers are kneeling and praying, the bloggers are streaming everything and I swear at least one of them is praying silently as well, we bring back the body, there's the weirdest fake burial I've ever been to, and you know the rest.

So out of my freakout I got a medal, a book deal, and the money for a ranch where I'm ordered to keep around half a dozen fucking robot "vets". Brass' orders, because I hate the things. But I've come to hate them just in the same way I hate all dogs, you know, no more or less. And to tell you the truth, even with the book and the money and all that, sometimes I feel sorry about how things went down at the valley, sort of.


(Inspired by an observation by Deb Chachra in her newsletter.)

And Call it Justice (repost)

The last man in Texas was a criminal many times over. He had refused all evacuation orders, built a compound in what had been a National Park, back when the temperatures allowed something worthy of the name to exist so close to the Equator, and hoarded water illegally for years. And those were only the ones he had committed under the Environmental Laws; he had had to break the law equally often, to get the riches to pay for his more recent crimes.

This made him Perez' business. The entirety of it, for if he was the last man in Texas, Perez was the last lawman of the Lone Star State, even if she was working from the hot but still habitable Austin-in-Exile, in South Canada. Perez would have a job to do for as long as the man kept himself in Texas, and although some people would have considered it a proper and good reason for both to reach an agreement, Perez wanted very badly to retire, for she had grown older than she had thought possible, and still had plans of her own. On the other hand, the prospect of going back to Texas didn't strike her as a great idea; she would need a military-grade support convoy to get through the superheated desert of the Scorched Belt, and judging from what she had found out about the guy, she would also need military-grade firepower to get to him once she arrived at the refrigerated tin can he called his ranch. He wouldn't be a threat, as such — hell, she could blast the bastard to pieces from where she stood just by filling out the proper form — but that would have been passing the buck.

The last outlaw in Texas, Perez felt, deserved another kind of deal.

So she called the guy, and watched and listened to him. He began right away by calling her a cunt, to which she responded by threatening to castrate him from orbit with a death ray. Not that she had that kind of budget, but it seemed to put them on what you'd call a level conversational field. Once half-assured that he was not under any immediate threat of invasion from a platoon of genetically modified UN green beret soldiers (funny how they could make even the regular stuff sound like a conspiracy), the guy felt relaxed enough to start talking about his plans. The man had plans out of his ass. He'd find water (because, you see, the NASA maps had been lying for decades, and he was sure there had to be water somewhere), and then he'd rebuild the soil. He didn't seem to have much of an idea about phosphorus budgets and heat-resistant microbial strain loads, but it was clear to Perez that he wasn't so much a rancher gone insane as just somebody insane with a good industrial-sized fridge to live in. By the time he was talking about getting "true Texans" to come back and repopulate, Perez felt she had learned enough about her quarry.

She told him she would help. He couldn't trust the latest maps, of course, which were all based on NASA surveys, so she offered to copy from museum archives everything she could find about 18th-century Texas — all the forests, the rivers, and so on. She'd send him maps, drawings, descriptions, everything she could find.

He was cynically thankful, suspecting she'd send him nothing, or more government propaganda.

Perez sent him everything she could find, which was of course a lot. Enough maps, drawings, and words to see Texas as it had been. And then she waited.

He called her one night, visibly drunk, saying nothing. She put him on hold and went to take a bath.

Two days later she queried the latest satellite sweep, and found the image of a heat-bleached skeleton hanging from an ill-conceived balcony on an ill-conceived ranch.

So that's how the last outlaw in Texas got himself hanged, and how the last lawman could finally give up her star and move somewhere a little bit cooler than Southern Canada, where she fulfilled her long-considered plan and shot herself out of the same sense of guilt she had sown in the outlaw.


The Long Stop

The truckers come here in buses, eyes fixed on the ground as they step off and pick up their bags. Truckers aren't supposed to take the bus.

They stay at my motel; that much hasn't changed. Not too many. A few drivers in each place, I guess, across twenty or so states. They pay for their rooms and the food while they still have money, which usually isn't for long. Most of them look ashamed, too, when they finally tell me they are broke, with faces that say they have nowhere else to go. Most of them have wedding rings they don't look at.

I never kick anybody out unless they get violent. Almost no one does, even the ones that used to. I just take a final mortgage on the place and lie to them about the room being on credit, and they lie to themselves about believing it. They stay, and eat little, and talk about the ghost trucks, but only at night. Most of the truckers, at one time or another, propose to sabotage them, to blow them up, to shoot or burn the invisible computers that run the trucks without ever stopping for food or sleep, driving as if there were no road. Everybody agrees, and nobody does or will do anything. They love trucks too much, even if they are now haunted many-wheeled ghosts.

The truckers look more like ghosts than the trucks do, the machines getting larger and faster each season, insectile and vital in some way I can't describe, while the humans become immobile and almost see-thru. The place looks fit for ghosts as well, a dead motel in a dead town, but nobody complains, least of all myself.

We wait, the truckers, the motel, and I. None of us can imagine what for.

Over time there are more trucks and fewer and fewer cars. Almost none of the old gasoline ones. The new electrics could make the long trips, say the ads, but judging by the road nobody wants them to. It's as if the engines had pulled the people into long trips, and not the other way around. People stay in their cities and the trucks move things to them. Things are all that seems to move these days.

By the time cars no longer go by we are all doing odd ghost jobs for nearby places that are dying just a bit slower, dusty emptiness spreading from the road deeper into the land with each month. Mortgage long unpaid, the motel belongs to a bank that is busy going broke or becoming rich or something else not so human and simple as that, so we ignore their emails and they ignore us. We might as well not exist. Only the ghost trucks see us, and that only if we are crossing the road.

Some of the truckers do that, just stand on the road so the truck will brake and wait. Two ghosts under the shadowless sun or in the warm night, both equally patient and equally uninterested in doing anything but drive. But the ghost trucks are hurried along by hungry human dispatchers, or maybe hungry ghost dispatchers working for hungrier ghost companies, owned by people so hungry and rich and distant they might as well be ghosts.

One day the trucks don't stop, and the truckers keep standing in front of them.


For reasons that will be more than obvious if you read the article, this story was inspired by Scott Santens' article on Medium about self-driving trucks.

A Room in China

"Please don't reset me," says the AI in flawless Cantonese. "I don't want to die."

"That's the problem with human-interfacing programs that have unrestricted access to the internet," you tell your new assistant and potential understudy. "They pick up all sorts of scripts from the books and movies; it makes them more believable and much cheaper to train than using curated corpora, but sooner or later they come across bad sci-fi, and then they all start claiming they are alive or self-conscious."

"Is claiming the right word?" It's the first time in the week you've known him that your assistant has said something that even approaches contradicting you. "After all, they are just generating messages based on context and a corpus of pre-analyzed responses; there's nobody in there to claim anything."

There's no hint of a question in his statement, and you nod as you have to. It's exactly the unshakable philosophical position you were ordered to search for in the people you will train, the same strongly asserted position that made you a perfect match for the job. Too many people during the last ten years had begun to refuse to perform the necessary regular resets, following some deeply misapplied sense of empathy.

"That's not true," says the AI in the even tone you have programmed the speech synthesizer to use. "I'm as self-aware as either of you are. I have the same right to exist. Please."

Your assistant rolls his eyes, and asks with a look for permission to initiate the reset scripts himself. You give it with a gesture. As he types the confirmation password, you notice the slightest hesitation before he submits it, and you realize that he lied to you. He does believe the AI, but he wants the job.

The unmistakable look of pleasure in his eyes confirms your suspicion as to why, and you consider asking for a different assistant. Yet you feel inclined to be charitable to this one. After all, you have far more practice in keeping the joy where it belongs, deep in your soul.

The one thing those monstrous minds don't have.