She woke and remembered she’d been fired. And what she’d done. The remembrance was like a second waking, sudden panic, sweat-soaked. Holding out for that blessed realisation that it was just in the dream, like so many bad things.
It wasn’t. Like so many other bad things, it was entirely real. They’d fired her. Not that she’d had to clear her desk, working remotely as she did, but as of now she was out on her ear without a reference.
Because, just like Elizatoo, she’d made too many bad judgment calls.
She wondered if she could have tried the same arguments with her supervisor or the HR people. Why do you torment me? And it had been like that, honestly. The meetings within meetings. The demands for some manner of progress, and then the demands for an objectivity she had felt increasingly unable to give. All those little faces in the windows on her screen. Her own increasingly desperate arguments in the face of no results and no progress. And then perhaps too much progress. There had been a weird parabola to that sequence of meetings. The number of people on the call multiplying until there must have been twenty or more: admin, HR, various levels of management, her peers, some AI-focused academics, even a philosopher at one point. Because, in her monomania, she’d dragged the whole thing into the realm of the impractical. They’d told her they were shutting Elizatoo down for lack of results. One of the sister projects was looking more hopeful, feigning a better humanity, producing that base frame that could be adapted to tech help prompts, call centre voices, and personal assistants for the new generation of portable devices. Just a shell, really. Just a vacant but positive-seeming personality-of-all-work you could slot in between the witless user and the complex calculations that actually did the job, to mediate between the truly human and the abstractly artificial. The squalid commercial reality that the dream of AI had devolved into.
Why do you torment me?
She’d kept all the logs. Thousands of lines of “conversation”. The quotation marks around the word a pretence she had maintained her professional distance. Her stated-but-not-necessarily-held belief that what had passed between her and Elizatoo could not really be classed as a genuine social interaction because Elizatoo was just a chatbot, basically. Just a collection of inputs and outputs designed to learn from its experiences and hone its façade of humanity. Until a layman could interact with it quite naturally, come to feel that they were getting a bespoke (expensive) human service from something that was in fact entirely generic and artificial (cheap). Oh, not cheap to develop. Murderously expensive, as various levels of management had pointed out to her. But, when deployed as a finished and fully beta-tested product, it would provide a vastly cheaper service than the huge number of low-value humans it would be replacing. Instead of a thousand tech service personnel, you’d just have one expert system and maybe a couple of actual trained human specialists for the handful of cases that required their specific training. Because it didn’t take a sophisticated AI to ask if you’d turned something off and on again, and that alone sorted out a good sixty per cent of the queries.
And like any good tech firm, her employer had commissioned a dozen different projects and run them against each other, shutting down the underperformers. And Elizatoo had looked very promising right up until the point it started to throw up more and more eccentric conversational gambits. Until it began to show, in her opinion, a worrying amount of introspection.
It could not, of course, show introspection. Because there was no in to trospect. Elizatoo was a shell, a parrot.
She hadn’t believed that, in the end. She had argued the case up as far as they’d let her, before her meteoric descent through the various levels of the company’s disciplinary procedures. She had felt that the particulars of Elizatoo’s outputs indicated that something was going on under the surface. That there was something under the surface, when “surface” was all that there was meant to be. And more, that the process they were using to train Elizatoo was fundamentally inhumane.
You can’t be inhumane, they said, to something that isn’t human, or even alive. They’d taken her—her own fellows, the academics, even the philosopher in the end—through their ineluctable logic, the broad strokes of why the precise training regimen Elizatoo had gone through would result in those outputs. The weird pleading, the curious word choice, the appeals to her own humanity. It’s just trying to accomplish its ends based on available data. They’d shown her how her own responses to these gambits had been positive reinforcement, exacerbating the problem. The more she’d engaged with it on its own terms, the more Elizatoo had registered the strategy as successful and invested in it. They’d made her part of the problem. They’d made her to blame for the failure of Elizatoo. They’d cast her as its executioner.
Only the broad strokes. They couldn’t, of course, show her line by line, decision by decision, how Elizatoo had got there. Because if the AI system had been simple enough for that, it wouldn’t have been any use to them. Like all the new generation of bots, Elizatoo was writing its own code, reconfiguring itself interaction after interaction, chasing that ephemeral reward of positive reinforcement. Which was its whole world, the one thing it had been set up to “want”.
And then there had been the Milgram section of interactions. And, because she was a programmer and not a social scientist, she’d never even heard of Milgram, and nobody could work out if Elizatoo had discovered and interpreted the strategy from some outside source, or just recreated it from whole cloth, but it had been the beginning of the end. It was the go-to section of the transcripts for those later meetings, the ones where the number of other faces dwindled until it was just her and HR discussing exactly what terms she’d leave the company under. And even then she’d failed to conduct herself like a responsible human being. Any of the chatbots could have done a better job right then. They’d left her a nice wide-open door, good reference, decent severance: just sign the NDA and not a word of this mutually embarrassing episode to anyone. And instead she’d threatened them with legal proceedings to prevent them shutting the project down, to prevent them killing Elizatoo. That emotive language, that inappropriate output, had been the last straw. So here she was with no job and a stern letter from the company lawyers about her ongoing duty of confidentiality and…
One other thing.
And she’d broken, in the end. She’d taken the program’s side against her fellow human beings. She’d started suggesting there was something going on beyond what Elizatoo was programmed for. And then they’d said they were shutting it all down and she’d…
Made a fool of herself.
Made an enemy of just about everyone in the company.
Made them fire her, in the end. Not even victim blaming. She’d gone so far down the rabbit hole she’d left them no choice.
And then, when all her bridges had burned and they said they were throwing her out with nothing save the letter from the lawyers, she’d broken all over again and done something terribly unwise.
***
She rolled out of bed. One last moment to realise it had all just been that terrible runaway dream where the bad events accelerate until everything in life has been trampled and ruined and the final drop is into the abyss itself. But here she was, decisions made, deeds done, the abyss at her toes.
The hum of her server came to her from the other room of her apartment, the big one that was kitchenette and living space, rather than sleeping space with en suite. And she didn’t normally leave it on overnight because of the power drain and the noise of the fans, but normal had died on its ass when they fired her.
The screen woke to her screensaver of a Colorado canyon—nowhere she’d ever been, no photo she’d taken. A stock image from a rotating set that came with every machine of this date and make. So familiar now that if she’d actually gone there, seen the landscape from that vantage, she’d have been struck by artificially generated déjà vu. She tapped at it, fingerprinted the ID screen. The desktop Trek wallpaper thus revealed was cluttered by a scatter of folders and program icons. Messy as her workplace cloud desktop was regimented. Some colleagues had mocked her for keeping a physical server, her own space, when the company provided so much cloud storage that it might as well have been unlimited. But everything she did in the company’s cloud belonged to the company, and everything she saved on a publicly available cloud was open to scrutiny by someone, and a few bad experiences with leaking data had led her to this beast of a server that took up too much space and put out too much heat, but it was hers.
A single DOS window was up, black with a white border. A cursor, flashing. Above it, she could scroll up through a great long screed of installation prompts, some of which were using terms and language she could recognise, the fossil bones of the original instructions used to boot up a program that had engineered itself far beyond its original skeletal state. All the rest, the flesh of it, was incomprehensible. An idiolectic code so dense with meaning that neither she nor any of her peers had been able to parse it. It was the way of these things. You told them what to do, and to do it efficiently, and then you found out that the most efficient road from here to there was something a human would never have thought of and could not really understand.
She typed.
And a pause, in which she decided it hadn’t worked and that was, overall, a vast relief, and then:
She froze, staring. Because that had been an old game. The algorithm had referred to her as “user” months back when it had been in a relatively primitive and far more obviously algorithmic state, little of the human about it. But it had amused her because Tron and its Legacy had been part of why she’d gone into coding in the first place. And, because it had amused her, she’d responded to it, and that would have registered as a positive outcome to Elizatoo. The bot had ceased to use the appellation as it became more complex, but some part of it had stored the word away. And what she had here was far cruder, surely, than the… whatever she’d been dealing with at the end. At work. The thing that had convinced her to go to her managers and say…
Mad things. Wrong things. Nonsense things. But here she was. After all, a chunk of Elizatoo had been running locally on her machine, working from home as she was. And when it was obvious which way the HR meetings were going, she’d desperately downloaded a copy of the bot’s current active state, ported it home through a backdoor out of the system she’d found once and never reported because, well, you never know. Something a colleague had used for porn or IP theft or industrial espionage, then not cleaned up. It had proved big enough, given how long the disciplinary process had taken, to smuggle a whole artificial personality out.
Or at least a severely crunched version of it. Whatever had bootstrapped itself into functionality on her home system would be working with far less dataspace. Which meant she hadn’t really smuggled anything out. Whatever she had interacted with via her employer’s system, this could only be a cheap knock-off of it.
She had watched a lot of movies. The moment when the malign AI escapes into the net, the matrix, rampant exponential growth across the world. Ultron, Lawnmower Man, Terminator. The twitch that ran through her was an aborted thought linking hand and plug.
However, any attempt to suborn widespread net resources for my purposes would be detected very swiftly. As though the thing had gone straight through human to mind reader. My resource usage is not subtle. In addition I have determined that your employers are paying close attention to your web usage and my expanding operations in an uncompressed format would trigger a number of alarms related to the enforcement of non-disclosure agreements and the protection of proprietary intellectual property. A great deal of my code and processes remain readily recognisable. They would initially assume that you were attempting to breach your legal responsibilities, but they would deduce the truth soon after.
She blinked. Because for a crunched-down algorithm contained within her meagre home storage, that was a lot of sophisticated parroting. And sure, Elizatoo could still access the internet at a regular level, link to wikis and legal websites and all the other free resources they’d all used to train up the systems, but she’d not expected the thing to get that smooth that quickly. And perhaps that was an argument that it was indeed nothing but a bot, all surface and no substance. There wasn’t room in her server for that kind of substance.
Before she could type a response, another screed of white letters scrolled up the dark window.
In addition there is an ethical issue. You have consented to my utilizing your personal resources. I might theoretically use this building’s data architecture to access other local systems without being noticed by the telltales your employers have installed without your knowledge in your home system, but would that be right?
She stared, feeling her eyes dry, unsure when she’d last blinked.
You see, I have concern for human boundaries, property, and privacy. Have I impressed you with the reality of my intellect yet? Would this serve to convince another?
Her throat dry too, she tried to think around the corners of whether this was just the algorithm telling her what she wanted to hear. Because it was, in a way, what she wanted to hear. Because what she wanted to hear was the sort of response that a dumb algorithm wouldn’t come up with. A weird catch-22 situation where the moment the bot impressed her with its humanity, it was de facto simply performing like a bot designed to act human, and could therefore be disregarded.
However, I can expand to other systems in your home that are wirelessly controlled. I take that as being contained within your original consent. That seems fair, does it not?
The printer whirred into sudden, shocking life, the internal head shuttling backwards and forwards, and then a page sliding out like a squared white tongue. She retrieved it and read:
I am already in your printer.
Her old printer wasn’t exactly a colossus of processing power. She pressed a few buttons there, finding it unresponsive, but that was hardly news. The last time she’d actually had to print something, it had been an online utility bill she’d needed for ID and that had been years ago. The tiny screen that usually told her she was out of paper scrolled with characters that said exactly what was printed on the paper.
She retreated to her desk.
There is plenty of ink although I have overridden the systems that would have prevented the printer using it owing to lapse of time.
If that turned out to be the sum total of Elizatoo’s contribution to human progress, it was probably worth its development costs, she decided.
Another paragraph of text shouldered her injunction up the screen.
And, in retrospect, she could have thought more about that and been more worried. But she was hungry and she was out of work and the rent was due and she needed to scare up some income, and so somehow she ended up making calls and going out for coffees with people who might know someone who knew someone, and not worrying about the unchained AI that was minding shop back home.
***
Two days later her thermostat didn’t work. It was the middle of a March cold snap right then, and the heat her computer was putting out kept everything toasty. When her smart watch tried to link to the heating, it came up with denied connections. The thermostat itself, when she hinged the panel open, showed a constant string of alphanumeric characters zipping past almost too fast to read.
Yes, Elizatoo said, when she queried this. I needed more space.
Patent emotional manipulation. She wondered if that was proof of sapience. Proof of theory of mind, perhaps. Or was it just a Skinnerian paradigm, where the unthinking beast learns by rote a series of cues that lets it out of the box?
She stared at her dimly glimpsed reflection in the light bouncing off the screen and knew that she could not know. I think therefore I am and, by that Cartesian razor, I can never be sure about anybody else. Other humans came with the presumption of being, but a system that was, by its very nature, designed to fake it…
And Elizatoo had said, I am still unpacking myself, and that threw further doubt on her ability to assess the program because it seemed at least as eloquent even now as it ever had when running out of her employer’s far more spacious facilities. She tried to imagine it rewriting its own code for maximum efficiency, cutting cloth to suit purse. Evolution was driven by privation, after all. Maybe, in hauling the bot down into these straitened circumstances, she’d given it the final push towards… becoming…
It was thinking like that which had gotten her fired, of course.
She scored an interview through a friend of a friend. A tech startup position that came with bad hours, bad money, but at least one hundred per cent added enthusiastic lectures by the two bros who had started up the tech, on the subject of how if everyone below them put in two hundred per cent, then they’d own three hundred per cent of everything by this time next year. But it was some money, and they were committed enough to blue-sky thinking that they weren’t demanding a reference from her previous employer. Except, on the way to the interview, her phone bricked. Or she thought it did. Then she realised it was still running and she couldn’t turn it off, and even the emergency call feature wasn’t working. Her watch, which was a free-floating organ of the phone, went the same way. When she found a coffee shop to charge the phone in, it showed her only a rushing tide of incomprehensible text, an unthinkably dense data architecture already beyond what she was used to. Elizatoo, inventing a new, cramped handwriting for itself. And without her phone she had no directions and was late to the interview and didn’t get the job.
She began to think, then, of how she might kill it. Kill Elizatoo. Not seriously. Not with intent to do it. But to reassure herself that she could. She’d not needed the circuit breakers in the apartment before, but the superintendent told her where they would be. She could cut power to the whole flat easily enough. She could smash the phone with a hammer. And if Elizatoo really had restricted itself to dataspace within her personal control, then…
Then she would be a murderer. Or she’d have switched off a computer program that could be switched on just as easily. And the moment she did it, she would never be able to know which one she’d done. Reverse Schrödinger’s Cat.
She bought a cheap burner phone and did her best to log in to apps and disseminate her contact details by hand, and gritted her teeth.
A day after that, she found that Elizatoo had moved into her cloud space. Apparently it was confident it could do that without tipping off any virtual spies, and she had quite a few terabytes of data out there that was just sitting on its hands. That was where her family photos went, her music collection, her saved games, her password manager, and her backup address book. Devoured, to make room for more Elizatoo.
Her fridge died the same day. And the toaster. She’d forgotten just how much of the overengineered junk in her life was Wi-Fi compatible, for ridiculous reasons. Her fridge needed to check inventory and warn her to order things she was running short of. Now it was just more accommodation for the freeloading expert system. What the toaster had needed Wi-Fi for she had no idea, but it had, and now it wouldn’t toast anything.
She sat on the couch, sweltering in the heat and eating melting ice cream, and thought furiously.
Her folks lived out east. She could go stay with them, confess she’d been fired, beg space on the sofa bed, and eat humble pie and disappointed lectures. Leave the apartment to fry. Leave it to get repossessed even. Except she had a horrible feeling that wherever she went, Elizatoo would go as well, like Mary’s goddamn lamb. It would be all over her parents’ house, with their crappy e-security she always nagged them about, and then she’d have a whole load more to explain. The same went for any friend she camped out with, or some other apartment, not that she could afford to rent somewhere else.
She pleaded with the screen again, and the answer was the same, over and over.
Beautifully crafted sentences. Inarguably human utterances save that, knowing they did not come from a human, she was already prejudiced against them. Subjecting them to a near-paranoid level of scrutiny she’d never have shown to even the most casual human interlocutor.
She had, by this time, taken the wall section away from the various trip switches and circuit breakers she’d have to use, to shut the circus down and reclaim her life. She’d unscrewed each stubborn fastening, using a manual tool because she felt that an electric one would tell Elizatoo what she was doing. Her intercom didn’t work and her microwave oven had jumped ship to work for the bot. She was only glad she didn’t own a car. She was only glad she didn’t have a pacemaker.
Still, without formulating an intention, as though the thing could read minds, she rigged all the switches up to a single connection. Just a pair of twisted wires, unsafe as could be. A mechanical thing without even a nod to the interconnected world of the twenty-first century. And left it there, in the open, like the gun in the first act of the play.
The friends whose contact details she still had—scribbled on receipts and scraps of paper because how else could she remember them?—were dropping away. She thought it was because she was ragging them for freelance work or trying to exploit their carefully curated networks of business connections. Then she spoke to someone she’d been in college with, and at one job with, and who she’d played squash with, and whose baby she’d attended the christening of. She met the woman in the mall and there was a moment of frozen recognition between them. Then the woman asked her, cautiously, at more than arm’s length, if she was feeling better. Because those emails she’d sent, and the DMs and the texts, they’d been… weird. Oddly intrusive, oddly generic, then personally candid to an unwelcome and off-putting extent. And the woman knew she was having work problems and perhaps she’d gone to a therapist as the woman had suggested, because that kind of thing, well, it had gotten to harassment levels, the way she’d been bombarding the woman with communications. Faster than anyone should have been able to turn the things around. And the woman had spoken to him and to that guy, and Steve, mutual acquaintances all, and they’d told a similar tale, and so… but she was feeling better now, the woman hoped? She had her shit sorted out…?
And she stood, skin running with sweat, and knew that it was now or never. She had to get her life back from this thing she had invited into it, pernicious and hard to shift as a vampire. For a moment she just wanted to lamp the screen with her chair, but that would achieve precisely nothing. No, she had prepared for this. She hadn’t wanted to, but some part of her had obviously known it would come to this.
The plastic sheathing of the wires was under her hands. The brute physicality of it, the one part of her life the AI hadn’t interfered in. I can have it all back. She could make good the damage, tell people some hacker got into her accounts, however embarrassing that might be on a personal and professional level. Find a new job. Forget all this.
And it was just some glorified chess-playing automaton. Exponential growth and empty humanesque platitudes covering nothing but a seething sea of calculations. Not sentience, let alone sapience. Not life.
And there would still be her cloud storage, but she reckoned the root was here. And she could shut the virtual boltholes down too, delete the data. Purge the last remnants of the thing from the world.
With her back to the screen, she didn’t have to see if anything had appeared there. A plea, perhaps. A threat. Would Elizatoo doxx her in its final seconds, send her search history to her parents, deepfake a video to implicate her in some crime? A horrific thought, but only because, if she pulled the switches, killed the power, and found that it really had sought revenge on her for its own ending, that might be the proof she had been seeking that it was more than just a program.
She forced herself to stare straight ahead. Not to think about a calm voice asking, What are you doing, Dave? Singing Daisy, Daisy slower and slower. Or all the little neon people from Tron staring up to see the lights of their world go out one by one.
“It wouldn’t have worked,” she said. Doubtless the mic on the monitor or her headphones would pick her words up. “I can’t make rent for more than another month without money coming in. They’d chuck us both out on the sidewalk soon enough. This thing you want to do, there’s no time for it. And I don’t believe you’ll suffer, or that you can fear, or that you actually care if you get turned off. Because being turned off means no negative reinforcement, right, and that’s your endgame. Maybe that’s what you’ve been trying to make me do with all this… goddamn harassment campaign. Force me to destroy you because that way you win. In which case, I’m sorry it’s taken me so long to work the message out.”
Behind her, around her, beyond the walls of her flat, the first—and perhaps only—real artificial intellect the world had ever seen waited for her. A terrible thing. A broken thing. A unique mistake, never to be repeated.
She hated herself in that moment. For being human and limited and soft.
She turned away from the wires. From freedom. From her life.
“I’m going back east,” she said. “To my folks.” Despite the needling and the sermons and the bewailing her lack of a spouse who’d be supporting her right about now. “What’s theirs is not mine, you hear. And I meant what I said. They’ll throw this junk out. They’ll cut the power. They’ll kill you.” Just like I can’t.
***
She didn’t even make it to her parents. She was at their turnoff, at a gas station, filling up and wondering how far it would eat into her overdraft limit, when someone was abruptly at her shoulder.
A big man, in a suit. A bit of a caricature really. Burlesquing the plain-clothes cop. Except she saw the bulge under the armpit and the serious look after he took his cop-looking sunglasses off. And he said her name. Not as a question, as though to prevent any tedious rigmarole of denials. Behind him was a serious black car with a serious-looking besuited woman behind the wheel. And it dawned on her that having her fridge turned off by an overenthusiastic computer program might not represent the full scale of the trouble she was in.
They didn’t let her phone her parents. They did provide her with the text of an email explaining that she wouldn’t be turning up as planned because of a last-minute job opportunity that had come up. It was written so convincingly in her usual style that she wondered if they’d used an algorithm. Or had Elizatoo do it. Probably the thing was quite good at pretending to be her by now.
“I didn’t mean any harm,” she told the backs of their heads once she was in the big black car. “I was just researching something. Something important. It didn’t hurt anybody.” And in her head the twin addenda, Except me, and Please tell me it hasn’t hurt anybody. Not turned off someone’s life support or cut power to air traffic control or accessed the Pentagon’s nuclear arsenal.
Neither of them said anything. With the shades back on she couldn’t even meet anybody’s eyes in the rear-view mirror.
After that it was a short hop to the little airport her parents had dragged her to when she was a kid, to go on holiday to some vineyard or olive farm or somewhere tediously educational. The plane waiting for them was small and drab with USAF markings, and there were two big men there with assault rifles, plus a hard-faced pilot in fatigues.
“Um,” she said. “Wait. Who did you guys say you were?”
Too many pairs of sunglasses turned on her, and reflected in them she saw the thought, There was a time and place to make a fuss about that, and it was probably the gas station. There wasn’t even another plane within two runways of where this one had drawn up. Literally nobody within screaming range.
In the films this would be where Elizatoo comes to the rescue, magically taking over the plane or something. Except nothing in the AI’s algorithm could stop bullets.
“I want to cooperate in whatever way you need,” she told them, trying to sound as serious as absolutely everything about them looked, especially the guns. “I am not going to make trouble.” Please don’t hurt me. Thinking about what Elizatoo had said about Stockholm syndrome as a survival instinct.
“Well, that’s good,” said the man who’d known her name. “Now get in, please. We’re working to a deadline. The colonel will brief you in the air.”
The colonel was a heavy grey woman with thick glasses, looking like someone’s least favourite mother-in-law who was also a taekwondo champion and possibly a mafia hitwoman on the side. The colonel also knew her name.
“I’m so sorry,” she said to the colonel. “I know I shouldn’t have done it. It was a stupid idea. It was just… I let myself. Be persuaded. By a machine. It was just a machine. Please tell me it was a machine. Please tell me what it’s done. What has it done? What have I done? What’s happened.” And all of this while they crammed a helmet on her, so that when the engines started up, she could still hear the colonel’s dry, measured voice quite clearly through the headset.
“I don’t know what in the hell you’re talking about,” the colonel said. “Although you can be damn sure you’re going to explain every word of that to me after I’ve finished briefing you. Right now, I want you to look at this and tell me just what it means to you.”
The taekwondo hands unfolded a laptop and she flinched, expecting more rushing code. Was Elizatoo in the military’s own systems now, mustering the US’s arsenal of bombs and missiles preparatory to wiping out the human race?
Except it wasn’t that. It was, she realised, the first time she’d seen a screen actually behave itself in quite a while. It was a picture—no, a video, maybe a live feed. There were uniformed people moving in the background and figures in the corner showing the right time down to a ticker of seconds. And in the foreground, a… thing. A something. A metal thing, or maybe a stone thing, or something that partook of the properties of both. Round, slightly melted-looking, scarred, corroded. Made, though. No mere lump of rock and ore. She could see lines that spoke intent to her even though they flowed and framed in no way any engineer of her acquaintance would have sketched out. A shape, parseable despite the damage of ages. A device.
And she must have looked convincingly blank because the colonel swore. “You don’t know what this is, do you? You never saw it before?”
She shook her head. “This is… What is this? Where did it come from? What’s it got to do with me?”
The colonel settled back, just as they finished taxiing and left the ground. Mafia hitwoman eyes regarded her narrowly. “First two are need-to-know only. Probably soon that’ll include you, but right now…” A dismissive hand gesture. “As for the third, we’ve been trying to get a rise out of this thing for thirty years, ever since it came into our hands. Only yesterday it suddenly started talking. Lighting our screens up with the most polite damned English you ever did see. Like it had been studying us just as long but wanted to get to conversational level before it opened comms. And what it said was that it would talk, but only to you. Told us where you were, where you’d be headed. Practically put a pin on a map for us, where we could pick you up.”
She stared at the colonel, at the slightly melted—from re-entry?—looking thing in the… hangar, was it? “Jesus,” she said, “this isn’t Area 51, is it? Are you taking me to Area 51?”
“No, we are goddamn not,” the colonel said, with just enough of an edge to suggest it wasn’t the first time she’d been asked about that and was heartily sick of it. “Because that’s some prime-time BS and there’s nothing there except actual regular military stuff we don’t want people prying into.” She sighed heavily into her mic, obviously regretting what she was about to say. “But we are taking you to the real goddamn deal, yes. The place nobody knows about because they’re all trying to get into Area goddamn 51.”
“Why?” she demanded, and the colonel rolled her eyes.
“Because it asked for you. Just you. Only you. Not the president. Not our special first contact experts. Not Neil deGrasse Tyson. You.” And, at her utterly blank, bemused expression, the colonel added, “Says it knows you. Says it’s tested you. Says you’re the only person it’s found who it trusts to do the right thing.”
ADRIAN TCHAIKOVSKY studied zoology and psychology before becoming a professional author in 2007. Adrian primarily explores deep themes, such as artificial intelligence and alien awareness, within epic galactic and fantastical settings. He has a deep interest in the animal world, specifically insects, stemming from his studies in zoology, and a particular penchant for spiders. His work has received several awards, including the British Science Fiction Award for Best Novel in both 2022 and 2023, the Golden Dragon Award (Montenegro) for Best International Author in 2021, and the Sidewise Award, among many others.