“There are three things that robots cannot do,” wrote Maxon. Then beneath that on the page he wrote three dots, indented. Beside the first dot he wrote “Show preference without reason (LOVE)” and then “Doubt rational decisions (REGRET)” and finally “Trust data from a previously unreliable source (FORGIVE).”
Love, regret, forgive. He underscored each word with three dark lines and tapped his pen on each eyebrow three times. He hadn’t noticed that his mouth was sagging open. He was not quite thirty, the youngest astronaut at NASA by a mile.
I do what robots can’t do, he thought. But why do I do these things?
The spaceship traveled toward the moon. Maxon wrote with his astronaut pen. In his notebook there were hundreds of lists, thousands of bulleted points, miles of underscoring. It was a manner of thinking. He was standing in his sleeping closet, upright and belted into his bunk. The other four astronauts were in the command pod, running procedures. No one liked spending time in the sleeping closets except Maxon. He kind of enjoyed it. It was not time for the lights to go out, but the rocket to the moon was nearing the end of its first day in space.
Maxon’s list of things a robot can’t do was a short one now, pared down from a much longer list that included tough nuts like “manifest meaningful but irrational color preference” and “grieve the death of a coworker.” Maxon made his robots work better and last longer, by making them as similar to humans as possible. Humans are, after all, the product of a lot of evolution. Logically and biologically, nothing works better than a human. Maxon’s premise had been that every seeming flaw, every eccentricity must express some necessary function. Maxon’s rapid blinking. Sunny’s catlike yawn. Even the sensation of freezing to death. It all matters, and makes the body work, both in singularity and in collusion with other bodies, all working together.
Why does a man, clapping in a theater, need the woman next to him to also be clapping? Why does a woman, rising from her seat at a baseball game, expect the man on her left to jump to his feet? Why do they do things all at once, every person in every seat, rising, clapping, cheering? Maxon had no idea. But he knew that it didn’t matter why. They do it, and there must be a reason. A failure to clap in a theater can result in odd looks, furrowed foreheads, nudged elbows. So Maxon would write:
Let anyone in any theater contradict it.
“Whatcha doin’, Genius?” asked Fred Phillips. He stuck his head into Maxon’s sleeping closet, gripping both sides of the doors as his body floated out behind.
“I’m working, Phillips,” Maxon returned.
“You’re not working. You’re dreaming.” Phillips smiled cheerfully, glancing at Maxon’s paper. “Dreaming of making sweet sweet love to your robots. But you just can’t make them love you back.”
“First of all,” said Maxon, “I’ve seen your medical. Your IQ is in the genius range. So your nickname for me, ‘Genius,’ is not sensible. Secondly, I am not dreaming of a robot who can love. Anyone could program a robot to do that. All you’d have to do is arrange an illogical preference. Making a robot love you over anyone else would be like making a robot love the color orange over any other color. I could have done it years ago. But it’s a pointless behavior. And I won’t.” How was loving Sunny different from loving orange? Phillips would not understand.
“Whatever, Genius,” said Phillips. “Houston wants us to run a sim of the docking procedure. You want to watch? Or, are you too busy? We all know you have nothing to do until we hook up with your girlfriends in orbit.”
Phillips swung free of Maxon’s closet, brought his foot up and wedged it into a handle, and propelled himself back through the tube into the command module. Their sleeping closets were arranged around the wall of the rocket, with an empty cylinder in the center where they could get in and out, one at a time. Maxon was not claustrophobic. He was suited for space travel, and he was wearing his space suit for astronauting.
“Robots can’t cry, Genius!” said Phillips, retreating. “Ito’s Laws of Robotics: Robots can’t cry, robots can’t laugh, robots can’t dream.”
Maxon sighed. He knew this was bait. But he was already unbuckling his straps. The hook was in his brain. Maxon had made robots that did all three of these things. James Ito was a hack, some AI putz working for a car company. His book was a farce. Pop culture, not science. When Maxon met Ito he hadn’t liked the guy’s face. A humanist. The kind of guy who would paint the future bright by predicting that the transformation offered by robots was really recidivism to a world gone by. A robot wife would be a pre-feminist wife. A robot worker would be a pre-socialism worker. The guy had no idea what was actually just around the corner. A different world, not better, not worse, but full of change.
Robots could laugh, and cry, and dream, and everything else. For example, there was a robot named Hera. Six iterations of it waited for him now, in an orbit around the moon, in the cargo bay of the rocket that had been fired last week, which they would soon be docking up with. Hera laughed at nonsensical juxtapositions, like a fat man in a little coat or a wheelbarrow full of whipped cream. Its laughter was not a sound delivered to human ears through a speaker, meant for human appreciation and approval. The laughter was an internal, systemic reaction, a clenching of joints, a shaking of components, a temporary loss of function. It could be shared with other Hera models, could spread like a contagion throughout a group of them.
“Incorrect,” said Maxon, following him. “Hera laughs. It’s what makes Hera so reliable.”
“I don’t believe in it,” said Phillips. “It’s pointless. A robot that laughs. What the hell?”
When he was strapped into his seat, Phillips said, “Go ahead, Houston. Aeneid rocket is ready to run the sim. All crew present.”
Maxon was familiar with the language of naysayers. They were afraid. Sometimes their faces showed that, the same thing as confusion, with the eyebrows down and chin raised. When Hera’s software was first coded, some people said it was a kind of abomination. Other people said it was a gimmick. They were interested in torque and tensile strength, in the size of robots and what they were composed of. An article in the International Journal of Robotics Research called him “a gearshrinker,” with scorn. He didn’t read the article, because he had determined from the title that he wouldn’t like it. For Maxon it was not a question of good or bad, or even why, but just a question of what’s next, and then ultimately, not even a question, but just a history. A history of humanity, in all the ways they were alive.
Then there was the Juno model, who experienced a similar jostling of gears and clenching of hydraulics when she was left alone, away from other Juno models, for a specified amount of time. Juno’s crying was a lot like Hera’s laughing, except there was no viral spread. Her visual sensors became impaired and had to be cleared, by her or another Juno who was moved to participate, or not, by her own if/then clauses. An article in Wired magazine called “The Lonely Robot” had described one Juno meeting another, and how they shook when they were separated. This was before the Juno code was wired into a construction frame, made so rectangular. Magazines are only interested in the humanoid functions of humanoid robots. Make them look like bulldozers and you can get away with anything.
What didn’t matter much to Maxon was the shape the robots took externally. How to put a microscope in them. How to make them smaller, bigger, work in the human bloodstream, simplify bipedal mobility. He had an abundance of research assistants to task with these technical details. His job was coding, thinking, more coding, and the completion of lists. He moved through his labs back at Langley like a wraith, stained hair falling down around jagged cheekbones, hands dangling at the end of his long arms, spine convex. He rode his bicycle for hours, working out command sequences on the pavement in front of him, every square meter like an open stretch of whiteboard, there and then erased.
“Houston we are go for this procedure,” said George Gompers, mission commander. “Standing by.”
Their screens wavered, and instead of the clear view of space they all saw a holographic projection, where the moon loomed large and they could see the cargo module, containing all the robots they would be taking down to the lunar surface. Their job, in orbit, was to dock with this cargo, extract the three containers, and then convert the command module into the lunar lander. While the pilot, the engineer, and the commander repeated orders, fired small rockets, repositioned, and aligned the rocket for the simulated docking, Maxon looked at his cargo module full of robots.
He wondered what they were doing in there, what they were dreaming.
All of Maxon’s robots, like Maxon, could dream. A randomly generated string of code gently stimulated the processors during their mandatory off modes, testing the chemobionic reactions while the official electronic pathways were shut down. It hadn’t even been hard, shattering this particular old saw. It had come apart like a clay pot. The robots remembered the events of their lives, the data they had recorded. In dreams, they transposed numbers, brought sets adjacent that were never meant to be interpreted together, and when they “woke up” they often had new “ideas” in the form of patterns and connections read in the chaos of their jumbled sleep.
The more like a human the better, whether the bot was as small as a fragment of nanotech cleaving the valves of the heart or as big as a sentient harbor crane. Humans work. They are an evolutionary success. The more they evolve, the more successful they become. Maxon had once thought that at this moment, when he was ready to land on the moon, his list of things that robots couldn’t do would have had every entry crossed out in a dark line. He had planned that the phrase “quintessentially human” would have been obviated by now. Indifferent to all protest, he had relentlessly made dreaming, faceless, laughing robots that were inexorably closing in on humanity.
The AI was startling; people had to admit it. Maxon’s robots did what other robots could not do, thought what other robots could not think. That was the reason he held so many patents, and had such an astonishing bank account at such a young age. But the most important thing, the reason he was employed by NASA and on his way to the moon: Maxon’s robots could make other robots. Not just construct them, but actually conceive of them, and make them.
To create a moon colony, a lot of robots are needed. Robots to build the station, robots to run it, robots who don’t mind breathing moon atmosphere, who don’t mind moon temperatures, robots to take care of human visitors. The moon colony proposed would belong to the robots for many years to come; this was understood. Humans would be their guests. The problem was that no one could shoot a robot big enough to construct a moon colony up to the moon. There just wasn’t enough room in a rocket for diggers, cranes, stamping presses.
So the answer was to shoot up a robot that could make another robot big enough. Juno and Hera were the robot mothers: steely, gangly, whirring, spinning mothers, built to mine the materials and fabricate the real robots, the real builders, who would re-create the world on the moon. Only a laughing, crying, dreaming robot could be a mother. An awful thought, for some. A perversity—but this was the reason for everyone else’s failure. All this business of a human purview. As if it weren’t all electricity, in the end. Maxon couldn’t remember ever thinking that something a robot did was awful.
Maxon watched the simulated docking procedure, watched the holographic cargo module getting closer, the engineer and pilot arguing over angles and coefficients. He uncapped his pen and wrote in his notebook: “You are a weak, sick man, and your frailty in the darkness of space is a vile embarrassment to your species.” Remember this, he thought. But did he really believe it? He tried to stretch his long legs into the cramped tube between the sleeping quarters and the command space, but his knees brushed the wall. He couldn’t get symmetrical, one angular shoulder jutting out into the back of Phillips’s seat. Inside his white jumpsuit, his bones were a cage for his live beating heart.
He looked at the men and the way they talked to each other, the way Gompers preferred Tom Conrad, the pilot, over Phillips, the engineer. He saw the way they papered their personal areas with photographs, the way they listened to podcasts from their wives on their laptops, the way they prayed.
You are a man just like them, he thought. You love, you regret, you forgive. Your eyesight blurs. You even forget things, sometimes. Love, regret, forgive. They were three bloody, muddy stains left on the snowy white tablecloth of his research. Three items left to be dealt with: love, regret, forgive.
“Genius, we just love your robots so much. When are you going to make us a robot that will love us back, you know what I mean?” Phillips had said to him once, teasing him during training, while they sat waiting for the pod to start spinning them again, testing their reactions to g-forces. In a round room, the pod sat at the end of one of two arms on a central axle. Like a giant spinner in a game of Twister.
“It’s not impossible, Phillips,” Maxon answered. “The world is only electrical and magnetic.”
“Okay,” said Phillips. “So why not?”
“You don’t understand,” said Maxon. “It is all electricity. So the question is really: Why?”
“I am not following you, Genius,” said Phillips. “You’re making it sound easy, and then acting like it’s hard.”
The machine began to spin them. At first, it was slow.
“Can it, Lieutenant. Shut up, Dr. Mann,” said Gompers, always quick to remind him that he did not have a military title. But Maxon was already talking.
“Listen. From the smallest, deepest synapses in the human brain to the interactions of galaxies with the universe, it is all electricity. If you can shape the force of electricity, you can duplicate any other impulse in the world. A robot can yawn, it can desire, it can climax. It can do exactly what a human does, in exactly the same way. You really want a robot to love you? You want it to fuck you back, when you fuck it? Just like a woman? Let me tell you: There is no difference between carbon and steel, between water and ooze. With a number of conditional statements nearing infinity, any choice can be replicated, however random. The only hard thing about creating more sophisticated AI was acquiring the space needed to hold such a myriad of possibilities. There is nothing different in a human’s brain from a robot’s brain. Not one single thing.”
By this time the machine was spinning so fast, his cheeks were flapping. The other men in the module were quiet, intense. Their eyes were all open. Their faces looked skeletal, all the skin pulled back.
“GET IT?” Maxon screeched.
And even in the pressure of all that simulated gravity, Fred Phillips found it possible to roll his eyes.
When the machine stopped, Phillips said, “Mann, dude, I feel for your wife.”
“What do you feel for her?” said Maxon.
Why did the robots not love? Why not feel good about themselves, just for once? Why not prefer one entity, one electrical epicenter, over all the others, for no other reason than that it felt good to do so? Maxon knew why. They could not love because he had not made them love. He had not made them love because he didn’t understand why they should love. He didn’t understand why he should love, why anyone should love. It wasn’t logical. It wasn’t rational, because it wasn’t beneficial. That was the truth of the matter. He chose for them not to, because loving defied his central principle: If humans do it, it must be right.
To show preference only for a good reason, to accept any choice made with the best use of available information, to suspect a source of giving incorrect data when incorrect data had been received from it in the past: these responses were beneficial to the robot, to the human. To love for no reason, to grieve over a choice that had been made rationally, to forgive, to show mercy, to trust a poison well: these were potentially damaging. If humans do it, why do they do it?
He understood the value of a mother’s love for her child. That had a use. He understood the value of a soldier’s love for his brother-in-arms. That had a use. But the family structure was so integral to the foundation of a civilization, and the solidity of the family was so important to the civilization’s survival, that choosing a mate based on some ridiculous whim seemed insane. It seemed destructive. How could it be so? Yet he, Maxon Mann, gearshrinker, droidmaster, having decided that all romantic love is at odds with the survival of the species, had fallen, himself, in love. He had fallen deeply, hopelessly, inexorably in love with Sunny, and it had happened almost before he got started in life. Over seven thousand rotations of the Earth ago. Certainly before he understood the ramifications of his electrobiological behavior.
That night, his second night in space, the feeling of breathing in was almost crushing him, the quarters so close that taking a deep breath almost had his bony chest brushing up against the shelf that held his laptop, his mission log, stuck down with Velcro. He let his head roll back against the wall, his crisp curls brushing the back of his neck. One hand went up to cover his eyes, the other hand still held the pen, poised over those three words: love, regret, forgive. When he finally slept, lulled by a cyclical computation worked out on the back of his eyelids, the pen went scratching across the paper, one final subconscious underscore. First there was Asimov, and his fictional laws of robotics, all written to protect humanity from the AI they’d created. Then Ito’s laws, excusing the failure of programmers who wouldn’t dare to try to re-create a human mind. Now Maxon’s laws, because he was the only one left with the stones to know when to stop pushing the buttons that he himself had wired. Maxon Mann’s Three Laws of Robotics: A robot cannot love. A robot cannot regret. A robot cannot forgive.
Lydia Netzer was born in Detroit and educated in the Midwest. She lives in Virginia with her two home-schooled children and math-making husband. When she isn’t teaching, blogging, or drafting her second novel, she writes songs and plays guitar in a rock band. Find her on Facebook, Twitter (@lostcheerio) and at http://www.lydianetzer.com.
Adapted from Shine Shine Shine by Lydia Netzer. Copyright © 2012 by Lydia Netzer. With the permission of the publisher, St. Martin’s Press.