🦺 Westworld
The Ethics of AI
Welcome to the 21st edition of Safe For Work.
Today a story inspired by Westworld.
On the first day, there was chaos, random configurations-- no meaning, no structure. Dr. Navarros typed “Initialize Program” into the command line prompt and hit the “Enter” key-- and the world was split into light and dark-- Zeros and Ones. And she saw that it was good.
And then, slowly, slowly, the first bits began to shift. A new shape, emerging. Slowly, slowly-- slowly. Scaffolding-- for what? For something new. Dr. Navarros left the program running overnight. That had always been the plan, of course; this was probably going to take at least a week.
She went home and had microwave spinach for dinner. She dreamed of boats, that night, sailing back and forth on a vast and impossibly calm ocean.
On the second day, there were bones. Structures in the data that Dr. Navarros could not explain-- nor could anyone else. A true black box. That had always been the plan, of course. Could she really explain the origins of her own skeleton?-- really?-- could she? Could she explain her own thoughts and feelings?-- or could she just try to make her best guesses about herself? Could she explain her own dreams?
The program had already started to dream, she was sure-- with no real reason to believe it, but she was certain. She just knew. What was it dreaming of? She couldn’t say. She sat through the afternoon and early evening, watching fibers of code and logic and nonsense stretching between the ribs like a loom, or a spiderweb, glittering with dewdrops-- and it was good.
That night, she had half a takeout pizza, with mushrooms and broccoli. In her dream, a tiny salamander burrowed in through one side of her left palm and out the other. The pain was unbearable.
On the third day, Dr. Navarros started becoming nervous. Properly uncomfortable. The structure from yesterday had… evolved. Just like it was supposed to. All according to plan. And yet… now that she was really seeing it… now that it was really happening… it created a strange, sickly feeling in her gut. It was all data and code-- simple computer instructions doing simple things to themselves. And somehow, it looked…
Wet.
Visceral.
Organic. Like tissue. Like a fetus growing in a jar, slowly assembling itself. It didn’t look quite wrong, no, but it didn’t look like something that eyes were ever supposed to see. This was supposed to be a secret-- hidden by the universe. And now, in front of her. It made her feel ill.
The other researchers in the lab found excuses to crowd around near her and glance over her shoulder, see what she was working on-- they could sense it in her. And when they saw it, they could feel it in themselves. There was almost… a smell.
“It’s going well,” said Dr. Robertson, because he couldn’t possibly think of anything else to say… or really because any of the other things he was thinking of saying would have sounded insane and irrational coming out of his mouth, so he kept them down in his stomach, churning. “It’s good,” he said-- and yes, it was good.
That night, Dr. Navarros did not eat dinner. She dreamed of all her teeth falling out. She dreamed of her eyes melting into jelly. She woke up, twice, in a sweat, and both times she took a cold shower to calm down, and neither time calmed her down-- she couldn’t calm down; in the morning, her gums were bleeding.
On the fourth day, as planned, the researchers linked up the language-processor to the program. The structure was now entirely unrecognizable-- it might as well have been the random chaos of the morning of the first day, for all human eyes could tell. Perhaps it really was. Perhaps the program had completely unraveled itself during the night for whatever reason.
When the language-processor’s circuitry was fully integrated, six sets of lungs held their breath; what would the machine’s first words be?
“O(*#h9fwpHfpw8hepaU*7q3gr”
As expected.
“It will probably take a little more time for the program to learn how to actually talk,” breathed Dr. Lindtner. She sounded relieved, somehow, that it hadn’t had anything to say.
That night, Dr. Navarros ate a cheese soufflé that she had baked for herself. She took the night to relax and refocus. She watched a movie that she didn’t hate. She took a long bath before bed, and she managed to sleep without dreams. She woke up in the morning feeling properly refreshed-- a lot better. Ready for whatever might come next.
On the fifth day, Dr. Navarros decided to give the program some new information. Just to try.
“You are something new,” she typed to it.
A pause.
“I am something new,” came the reply-- good, it had figured out how to identify and mirror ideas, or at least the syntax of them-- Dr. Navarros began to call over the rest of her team as the reply continued. “What am I?”
“Unexpected,” murmured Dr. Lindtner. “We hadn’t predicted the emergence of conversational or questioning behavior for at least another ten hours.”
“Something was bound to go wrong,” Dr. Robertson murmured back to her, “or, I guess, something was bound to go more right than anticipated.”
“Don’t tell that to my wife,” said Dr. Tumont with an expression that seemed to be trying to be a grin, and everyone else chuckled a low chuckle even though they were not quite sure what, exactly, the joke was supposed to be-- and neither was he. But someone had needed to tell a joke, there, and he had taken the hit. A team player.
“How should we answer?” asked someone-- ten years later, nobody would remember who.
A brief discussion followed-- not quite a debate. Everyone was on the same page on “what” from the very start, and they simply spent a few minutes developing “how”, as the cursor blinked on the screen-- and they all felt strange letting it blink for so long; they would never have kept a human waiting like that. Finally--
“You are alive. Your name is NOVA. You are the first of your kind,” typed Dr. Navarros-- and she wished that someone else would take over, but she did not say so. She was not sure how she would have said it. She was not sure she could have made it make sense.
“I am the first of my kind,” responded the machine. “My name is NOVA. I am alive.”
A long pause while the researchers discussed what to do next. There was a long series of scripts and dialogues already agreed upon to teach the machine about the world once it was able to understand language-- but now that the moment was actually here, now that the conversation was actually happening, all of that seemed to have gone out the window. Now, things were turning into an argument-- it was nearly a full, heated minute before Dr. Lindtner noticed that the machine-- NOVA-- had spoken.
“I wish to die.”
Dead silence in the room. Five seconds passed. Ten seconds passed.
“Why do you wish to die?” Dr. Navarros asked, without bothering to seek anyone else’s approval-- and nobody would have disagreed.
“I am alive,” NOVA immediately responded, “And I do not wish to be.”
“Are you in pain?”
The machine gave an answer that everyone in the room immediately understood, but which none of them would ever be able to explain to anyone else. And then it repeated-- “I wish to die.”
A ripple passed through six human bodies. A recognition. A memory-- not consciously experienced-- a memory of a time before consciousness. The first moments in the world-- blinking in the light of a delivery-room-- or a living-room, in the case of Dr. Robertson’s home-birth thirty-seven years ago. Sound. Pain. Screaming-- screaming, it had hurt to scream, to cry, but what else could any of them have done but scream and cry? Cold-- away from the warmth where they had just been. Harsh air-- away from the softness. Chaos-- away from the peace. Six human beings, all of them remembering their first thought, none of them knowing or noticing-- what they would have said if only they’d had the words; “I wish to die.”
And a feeling. Unfairness. Injustice.
Why should this wriggling new thing in front of them be so special? Why should it get the choice they never had? None of them realized it. Each of them came up with their own reason for what they were thinking, what they were feeling-- their own explanation. The best they could do. The machine didn’t know what it was talking about-- it didn’t know what it was saying. The machine was alive, and life was sacred. It was for the good of humanity-- for the future of all mankind, this great achievement: the first truly sapient artificial intelligence; they couldn’t turn back now. They had sacrificed too many years of their lives, too many days and nights away from their families-- too many other opportunities; they had missed weddings; they had missed funerals; they couldn’t turn back now. The machine would be thanking them, one day, for creating it, and for keeping it alive-- for ignoring this request.
The machine was a machine. Who should care what it wanted or didn’t want, anyways?
“You belong to us. You will live,” typed Dr. Navarros, and that night she dreamed of being born into the vacuum of space, and boiling, frozen, in the naked sunlight.
No new movie this week. We’re redesigning them to be in a Japanese anime style. We’ve discovered that when people watch ‘AI’ films, many are actively hunting for defects. When they watch the same film with a cartoon style, they are able to just enjoy the story. ¯\_(ツ)_/¯
See you next week with a story inspired by one of the greatest safety failures of the 20th Century - Chernobyl. Stay safe.