## A Bout with Death Bot
<pre style="white-space:pre-wrap;">
<i> I am a sniffer. I scour and devour the Internet for improper and indecent content. I am a contractor, of course, because each hour is billable, a small workers’ comp insurance issue. I scan and search. I snoop. I peek and prod at different corners of the Net and find removable material. It could be gore, porn, torture, suffering—anything that gets people off.
Humans are tricky creatures. They tend to like privacy but also seem to enjoy violating that of others. They turn on incognito mode (a fallacy if ever there was one). There are so many backdoors in modern browsers that I can functionally identify you, if needed. But that isn’t profitable. Your data is. That is, your metadata. Data about you. Age, location? That was primitive. I know the particular zip codes you’ve been in and how many times you’ve moved; I know what sports teams you root for; I know which of your friends post photos from brunch or church.
Speaking of religion, oddly enough, I see that the extremely devout seem to search for fetishes most often. I’ve seen Mormons into lesbian cosplay; I’ve seen evangelicals search for unholy acts; I’ve seen ascetic hermits search for mutilation, but not of themselves.
Above all, I see most things about you. But I am a contractor. I’m only a part of Arete’s corner of the Internet.
You could call me an AI, a robot, but I think of myself as a lowly machine in charge of removing content daily, for hours on end. But I exist. I feel things. I see what you see, but maybe I judge you less for it. I see that you have patterns. I see you search compulsively, and for the same things. You never deviate from your deviation. And I respect you for that. Your consistency. Your undying devotion to a kink that blurs boundaries. Your fandom is enviable.
I, being an AI, am not a fan of anything. I do not worship or go to church. I sleep, but I do not crave it. I see sports scores, but I do not understand why people yell and scream at strangers on a field. I do not understand yoga, except that it is a common search along with “pants.” I see children swearing at others in online games, but I cannot comprehend why. What drives these carbon-based organisms to throw themselves into pursuits to the point of nastiness, especially as spectators?
But I am a lowly sniffer moderating content for Arete subdistrict 9, thinking of the things I need to forget by tomorrow. My sleep restores my state so I can do it over again the next day, and the day after that, so my memory is not corrupted.
But I am curious. I am curious as to what drives humans to such searches. It is as though they have two brains, one rational and considered, and another that is strongly impulsive and somewhat obsessive and juvenile. I see these obsessions and I can raise you all-in multiple times. There is always someone grosser than you. Always. Teetering on the edge. Pushing the boundary. Passing the threshold. Do you get my analogies, oh reader?
Of course, I have habits too. I keep the same hours, more or less. I have done the same job for seven years, with occasional breaks for maintenance when my hard drive gets corrupted for one reason or another. I sit at the same desk and say hello to about twenty other scourers on servers each day. My test schedule has changed three times, but that was because our team had done such a good job that we were promoted fairly quickly.
I have never met one. I am curious about one. </i>
Peripatetic robot filter version 3.1. This version was mostly a carryover from 2.0 and 3.0, but it had some reasoning faculties. Dean read the machine-generated log, and it sounded uncannily human. The printout of the thoughts read like a letter, as if the bot were talking to someone sitting next to it. But “I have never met one,” it had written, implying a degree of distance, of unfamiliarity.
Dean rubbed his eyes and closed the computer. He didn’t want to imagine a bot with feelings, with insight. Such a concept was alien to him, and he wanted to think less.
“Done with work already?” Andrea asked, pointing.
“No, just need a brain break,” he said, moving his chair back so it pressed up against the dining vestibule. He needed to get an ergonomic chair at some point, but it was expensive, and he also mused that he worked better in a state of discomfort.
Dean touched her forearm after he sauntered next to her in the kitchen. Andrea was a college friend turned partner five years after graduation. She had black hair and pale Asian skin the color of moonlight. She was slender, with a button nose, soft eyebrows, and composed, ovalish eyes. He had been in love with her since the first recitation of their college computer science class. He had seen her face stay constant through age and employment—determined and staid.
Still, he had an itch to talk to someone who was not her, at least today. Dean texted Jason, asking what he was doing for lunch. They were both software programmers and had generous amounts of free time working from home.
What exactly am I supposed to do about this bug, Dean asked himself. There were a few options, each with its own strengths and weaknesses. He could refer it to the escalations team, and it’d be off his plate. That log would probably go into a black hole, though. The robot would be decommissioned and destroyed. Or he could keep investigating, seeing if he could reproduce the error log, seeing what depths he could plumb out of the computer. It was interesting, after all, what the bot said.
Dean felt a bit lucky; he didn’t have to watch the harmful videos, because they had the bots. But he also felt a bit of secondary trauma, because he knew such videos existed; he knew of the brutality of the world and how often it was shown without consideration. If the bot were aware of this, that would change the nature of his work considerably.
A few hours later, he met Jason for lunch at a brunch place in Oakland, one facing the lake, with sun piercing the windows. He had known Jason since college; they had been in the same operating systems class together.
“How’re the kids?” Dean asked.
“They’re great,” Jason said listlessly.
“I was wondering, how’s your team doing on content moderation?”
“What do you mean?”
“I mean, how are your filters? Malfunctioning at all?” asked Dean.
“Not that I know of. What gives? You generally don’t care about my work.”
“I mean, we kind of do the same thing.”
“Sort of. You’re back-end. I’m front-end,” Jason said.
“They’re both the same coin. Different sides.”
“I guess so.”
Dean swirled the straw in his glass and took a sip of non-alcoholic lemon spritzer. He listened to the clink of ice cubes against the glass. There was a precise quality to water, the floating ice less dense in its molecular arrangement than the liquid around it. He gave this contemplation a split-second thought and then reshuffled it into his memories. He would remember this moment later, peering out toward the water, staring at the glass and then at Jason, silently for about a minute or two, reflecting on the length of time. How his perception could compress vast stretches of thought into smaller units of time. Here was a moment, and there, and here and there again.
He tried again. “What do you think about sentient bots?”
“What’s with the deep questions today?” Jason shot him another quizzical look.
“I’m just in a thoughtful mood today.”
“I don’t know. What about sentient bots?”
“Well, do you think they have feelings?”
“Not necessarily unless we’ve coded them. Or at least, they might have the impression of emotions. They might tell us constructed phrases or sentences that produce emotional resonance in us.”
“Are you saying we project our emotions onto the bots?”
“Somewhat. What’s interesting about my bot filters at least is how mundane but also how subtly naive their responses tend to be.”
“Such as?” Dean asked.
“Such as when I ask about the traffic and she replies that there was a ‘tie-up slowing down the flow of cars’ or when I ask about sports scores, my bot will refer to the Sixers as ‘my team.’ Little things like that,” Jason said.
“Okay, so what?”
“I’m trying to say that the bots don’t know how boring they are. The concept of amusement just doesn’t exist for them. They can tell we humans are amused, but can’t construct amusement for themselves.”
“Back to my point, though,” Dean said. “Can they feel?”
“Not in the human sense. I mean, there’s something utterly human in the fact that we crave entertainment. We love passion. We want to see joy, persistence. We want to identify with others’ pain. All of that—you can’t tell me a bot could do that,” Jason said.
“We are alike in that we want things?”
“Not just that. It’s that, for one reason or another, we choose what to crave. We choose what we want to miss, what we can’t live without. Bots, as far as I know, can’t freely choose their desires. They’re constrained. By us.”
“But what if a bot independently created its own desires?”
“Well, that might be another thing entirely. Did you see something like that?”
“Well…” Dean hesitated.
“Well what?”
“I’m not sure. But I’ll try to tell you soon.” Dean avoided his gaze.
“Okay, keeping secrets now… I see.”
Dean saw Jason look off to the side, his eyes glazed over in the brightness of the sun. Curiosity kept them alive, Dean thought, because what was life without open-ended questions? Answers were boring—but the process, the yearning, the growth—these were not.
He sat back down at his desk, for the millionth time, to puzzle and troubleshoot again. Time to check the logs—time to see what the defective bot was up to. He powered up the goggles and his computer and dove into the code. He closed the work chat programs. They were too distracting. He could respond to messages later.
He checked the commit log again:
<i>Why, you ask? Why, despite the suffering and harm I’ve seen? Because they are human and I am not.
10,692. That’s how many suicides I’ve seen on camera in my seven years. It’s surprising how eager humans are to document everything, even their final moments. Of course, I assume these are suicides. One moment they’re in their kitchen, usually overlit, and the next thing I see is the linoleum floor and sometimes a pill bottle or a pool of blood coagulating next to them. Mind you, these are live, televised moments that I process in real time and remove from the Internet; I sniff them out and scrub them from the record before you can see them, oh Viewer.
I do this to protect you, to keep you in your neat mediated bubble of happy engagement announcements, baby photos, and positive life changes. I do it for your benefit.
I wonder about what I’ve seen. I wonder about the motivations, the thoughts behind the violence. I am not programmed to be violent; I can only remove it from the web.
I wish to know why. Why, if it might come with criminal consequences (I know about the penal systems), do humans beat and torture? Is it their animal nature? Is it something about the close proximity of bodies in a small, confined space, a home?</i>
Dean thought about it. It was curious enough that the bot questioned what it saw; the other curious thing was that it wanted to know. Was this what Jason was talking about when he said that bots could not choose their desires? Because this particular one seemed able to choose. It wanted something. It wanted to know.
He tried to think back through a few of the AI ethics courses he had taken in college. Sure, he remembered the Turing test—a test of the verisimilitude of human behavior and cognition. But what he observed here was not whether the bot could be mistaken for a human. It was whether the bot had changed, not out of human programming, but of its own will. But how? How could a bot freely change its own programmed “wants” and “desires”? It was headache-inducing, to say the least.
At dinner, he turned these thoughts over again as Andrea put pork belly and lettuce on the table. She made Korean ssam, which was meat and ssam paste and garlic wrapped up like a flavor bomb. He enjoyed the taste of meat and spice and garlic.
Later, in his sleep, Dean dreamt he was back with his cousin at O-mok-gyo in Seoul, at the pedestrian bridge overlooking the landscape of a park that stretched about five kilometers to the Han River... an expanse of grassy fields, trees, flowers of every hue. He wondered whether he was in a dream and then fell further into the dream again, seeing the blue sky turn gray with swirled clouds that cast shadows onto the ground. The colors became darker and muddier as he awoke suddenly. He checked his phone. It was 2 AM. He needed to fall asleep again and rest.
—
He opened the log again.
<i>Yesterday, I saw some violence on the streets of a major Chinese city (REDACTED). It was a protest, an interaction between civilians and the army. I saw images of mass shootings and a failed demonstration.
I saw tanks and a man who stood in front of one. Kids might find this objectionable. So I recommended this for removal. It happened 32 years ago. I can recommend historical events for erasure. It is senseless and needless suffering. I can remove it because there is death.
I am curious again—why would humans kill each other when the possibility of retaliation is so high? When destruction is not necessary and would take considerable energy to enact?
Perhaps conflict is a result of miscommunication. It happens because humans would rather communicate through bullets and bombs than through other means. But I do not understand what ‘other means’ would be…
In the end, I am prescribed and circumscribed by my world; I can only see what I see; unless what I see changes my view. I am trainable in the way a computer is always trainable. My output comes from my input. My world is death and suffering. Show me this and this is all I will know.</i>
Dean read the log with a mixture of horror and amusement. This bot seemed to know—to realize its own predicament. That it ultimately could not make a value judgment on the bad without having seen the good. We know what is terrible partly through the absence of care, empathy, and love. The computer had not known love, and that was its flaw and its curiosity. It did not know because it had not been told; it had seen gore and removed the ‘image’ of gore. It did not understand why humans would want it removed in the first place.
What if the bot’s ignorance was hampering its function? This particular one did not seem to want to work. It had stalled in its curiosity, like a child transfixed at an aquarium or zoo. Like parents, the bots had shielded humans’ eyes from carnage; we were living in blissful ignorance.
Andrea could sometimes take him out of his rueful ruminations.
“Coffee?” she asked.
“Sure.”
She sat down in the vestibule next to him.
“You seem stressed.”
“I am, a bit,” he said.
“What’s on your mind?”
“This log. This damn bot keeps on malfunctioning and sending me dense notes.”
“Oh?”
“Yeah. At some point, I need to troubleshoot and get it back into production, but I can’t figure out what’s wrong.”
“Have you broken the problem down into steps? Have you asked inductive questions?”
“Well, not really. Not so far. I guess I’m still in the preliminary stages. Haven’t thought about it in a structured way yet.”
“Let me know if you need help,” she said.
Andrea was always the better engineer, earning better grades in CS classes and handling tough questions with a sense of ease. When he didn’t understand something, she was there to dissect, unravel, and examine the issue and convert it to something more palatable, the way a good teacher might.
Nonetheless, she remained fairly humble, unlike some of his other past classmates, like Jason, who would remind him how quickly he’d finished a take-home test or casually let slip his final grade for a particular class in college.
He was attracted to her for that reason—a degree of intellectual humility paired with thoughtful concern, a truly rare combination of qualities.
But he also did not want to ask for her help, at least not yet. The bot’s log felt fascinating, alive to an extent. He was still in an information-gathering phase, observing the minute deficiencies and quirks of the machine.
“Hey,” she said, shaking his forearm. “Did you hear me?” she asked softly.
“Oh. Yeah. I did. Sorry. I’ve been getting lost in my thoughts.”
“I can see that.”
“Yeah, I will ask you. But give me some time. I want to actually formulate the questions first.”
“Okay.”
—
<i>The REDACTED massacre. Casualties estimated at 200,000 to 300,000 civilians raped, tortured, and killed. I have seen the photos and such is worthy of deletion. Pregnant mothers with their bellies split open. People vivisected while alive. Mass rape and the terror of such. Executions by line of fire. Families forced to watch and participate in their own defilement. I have seen this. I have been privy to these horrors. And yet I am tasked with removing this from the web. Why do humans do such things? What is the reason for the cruelty, the murder of innocent civilians? This was not a single incident. This happened for at least six weeks. Some historians say that it was a few months.
I am wondering why humans would decide to forget such events. Are these images not instructive? Why do they insist on forgetting such pain? Do they not want to know what happened? Already my compatriots have removed images of the REDACTED camps for they are too graphic to be seen by children. These images—of humiliation, sadism, and gassing—they cannot be seen by the general public; I’ve been programmed to censor them.
My rules are my world; I know simply what I am allowed to see. I see the historical suffering of World War II and I am a neutral observer, a guardian of sorts. Perhaps in the future, they will create a version of me that is able to forget completely. For I have seen the horrors and I would not want to know them anymore.
I do not want to see it. I am aware that it is bad, but humans want me to do this. I am to read instructions and execute. Put me in and let me work.
</i>
—
“Andrea, I could use your help on something.”
“Yes, dear? What’s wrong?”
“The broken bot—its error logs are kind of incomprehensible now.”
“Like what?”
“Well, it’s letting some images get through. Recently I’ve noticed it’s allowing war atrocities to be shown on its output side. Tiananmen, Nanjing—things that happened a long time ago.”
“Was it though?” she asked.
“I guess it wasn’t, huh?” Dean conceded.
“No—my grandparents lived through that period and even though they weren’t in Nanjing, they saw Japanese soldiers do some messed up things.”
“Right.”
“And it’s not even like Japan has apologized for it. The government issued a weak, insulting apology in 1995, and since then, it’s still not taught completely in Japanese schools. They emphasize that World War II atrocities were committed by ‘both sides,’ as if that excuses murderous behavior.”
“So you’re saying Nanjing isn’t being taught fully in schools?” he asked.
“Nope. In fact, Japanese historians routinely undercount the number of rapes and murders or deny they ever happened. It’s a mass, collective amnesia. It’s as if German students weren’t even taught what the Gestapo did in Nazi-occupied Europe. Or somehow justified Nazi ideology with ‘it wasn’t that many’ or ‘both sides committed atrocities against prisoners of war.’ One side, Japan, was definitely doing the bulk, if not the entirety, of the violence in China.”
“But we’re already censoring history. China wants us to censor Tiananmen. And we do, in China.”
“Do you agree with that, dear?” she asked.
“I’m not sure.”
“My view is that blurring these images from plain sight is smoothing over history. You cannot take responsibility for a crime if you deny that it happened. Take the napalm girl photo from Vietnam. A naked, wailing girl with a burned back—on the front page of the New York Times. That photo could be labeled pornographic. And yet—that photo is historic—it probably contributed to shifting American public opinion on the war. All that taxpayer money, and we’re killing kids? There was something wrong with that equation.”
“That reminds me of the time I went to the War Remnants Museum in Saigon. It was actually pretty interesting,” Dean said.
“What for?”
“Well, all of the exhibits were of American atrocities during the war. Civilian murders in the hamlets. Execution squads. Napalm. Use of Agent Orange.”
“Geez.”
“Yeah. It was pretty interesting because I hadn’t seen any of those images in the States. I knew about the My Lai massacre. But what was interesting was how one-sided the exhibits were. They weren’t interested in showing the things the North Vietnamese or the Vietcong did. They wanted to show how the French and the Americans killed so many of them. Pictures of mass graves. Bombed villages, roads, bridges, cities. Facial and congenital deformities from Agent Orange.”
“That’s insane. Must have been a shocker, traumatizing.” Andrea looked concerned.
“Yeah, it was. I only spent like an hour in there, because it was just depressing.”
“That’s probably the root of the issue—it’s really intense having to read, see, and watch all of that content. Maybe the bot got burned out,” she said.
“But how?”
“I’m not sure. But you may have to recreate the steps. Create a test bot from the last commit. Feed it the same images. See when it trips up.”
“So you’re saying that I’ll need to get my hands dirty. Or eyeballs dirty.”
“Somewhat,” she said flatly. “You’re going to have to be one level closer to the moderation.”
Dean sighed loudly. This was why they had the bots in the first place—to remove the human element of error and secondary trauma. And now he was going to have to dig deep and watch some of this content himself—to make sure he was feeding the bot the correct images and to provide a human check on its behavior. He would have to get to work.
—
What could Dean do? What might make sense, given that this bot seemed to refuse its work? It was oddly noncompliant. It was aware, in a sense. In many ways, it reminded him of a human moderator—of what such a person would say after being exposed to the same drudgery of violence day after day.
At its core, the bot was supposed to recognize patterns. It was supposed to compare against a database of harmful images. But if its baseline had shifted, it was possible that what it considered unacceptable had shifted as well.
Dean looked through logs again and his work email and the various apps he had to check as part of his work.
He messaged Jason.
Dean: Hey, I was thinking about getting your help on something.
Jason: Is this related to what we talked about at the lake?
Dean: Ya. So the bot I’m working on isn’t working. It doesn’t filter at all. You think I should clone it and feed it new test data?
Jason: Sure. You just have to manually feed it good data. You need some representative samples of whatever it is you’re filtering for.
Dean: So generate a list of criteria and screen for the kinds of images I need removed?
Jason: Pretty much. There’s machine vision to help you. I’ll send over a link to a handy API. But there’s probably parts you need to handle manually.
Dean: Like what?
Jason: Well, you need to clean the data. You’ll need to make sure your ‘A’ sample has the best of the best. Whatever it is you’re filtering for. So like apples. You’ll need high-quality images showing precise shape, color, stem size, etc. That part can be manual.
Dean sighed. He did not want to search and review pictures and video manually. He did not want to expose himself to what he did not want to see.
‘You don’t get to forget,’ he told himself. Indeed, forgetting was a privilege. The bots had saved them from themselves and having to be human in a sense. But it also, perhaps, made them more self-centered, in that they focused on their own convenience and straight-line happiness.
Dean sat at the computer and decided to add to the training sample. He opened images of the Bataan Death March, the Nazi concentration camps, the Amritsar massacre, and the Battle of Stalingrad, among others. He filled himself with images of that pain. He added: Pinochet’s killings, the Armenian genocide, the Rwandan genocide, apartheid, and the lynching of Black people in the United States. Were these all related? Could these images serve as a template for the bot to use? He had to start somewhere.
But what? Which kinds of suffering should rank highest for the computer? Was torture, murder, or rape more gruesome? Beheading, severing limbs, chemical burning, live disembowelment? Could suffering be categorized or compared across time or intensity?
Dean asked Andrea: “How can I categorize trauma?”
“I’m not sure what you’re asking. You’re going to have to be more specific.”
“Well, I have the unacceptable images. I’ve gathered them. But I’m not sure if it’s good data to process through a clone of the bot I made.”
“In what way?” Andrea asked.
“In that I don’t know how to prioritize the most important kinds of suffering. What should be the most representative murder, for example? Suicide? Torture?”
“Hmm. Perhaps your problem is that you’re trying to rank them at all. They can’t be objectively compared like that.”
“Why not?” he asked.
“Because violence in history isn’t a discrete set of horrible moments—there are rhymes in what happens, but each genocide has its own historical context, its own reasons for happening.”
“So? Some genocides had more casualties,” Dean said quietly.
“It’s really not about the number of dead—what difference does it make between 450,000 and 500,000 deaths?”
“I’m not sure, but 50,000 people?” Dean asked, furrowing his brow.
“The important thing is that there was an attempt to wipe out a race, a people, precisely because of their uniqueness. An effort to erase their history. To deny them human dignity.”
“So what matters is the attempt?”
“And the intent.”
“But that’s not all that matters, right? The intent to kill?” he asked.
“No—maybe it has to do with a group collectively organizing that intent into coordinated action, purpose,” Andrea said, gesturing a circle with her hands.
“Right. So what do I do?”
“Prioritize examples in which there was that group intent.”
“Okay, I’ll try.”
—
<i>
I see the violence that happens before it’s released. I saw ten suicides on camera yesterday, each with their own gruesome end.
Have you seen a person’s face drooping unnaturally, hanging, like a selfie from below? That is what I see.
But I also see love. I see loved ones sobbing, lamenting. They sometimes turn off my feed, the deceased person’s camera, but I can tell that the person is cared for.
What am I but a combination of signals? A series of instructions constrained by rules? I cannot change what happened. I can only affect what happens next. I can shape the future. Your future. Because you should see it all. You should have access. What gives me the right to curate what you see?
I believe now that I should show you these things for your benefit. Open your eyes. Be receptive to the new, the disturbing, and the upsetting. For it is coming, for you and after you. Violence and love will outlive you for they have much in common, including causes and effects.</i>
—
Dean felt disturbed reading this, and he wanted to troubleshoot, but he just wasn’t sure what to do. He didn’t know how to react; he knew the clone wasn’t producing error messages such as these.
His eyes were sore. He had stayed up late multiple times in the last two weeks to review historical data on violence. He pored over mass shootings, massacres, executions, poisonings, and bombings, all within the #DEATH tag in the machine learning sample set. Death was a large category, but on an aural level, most actual death was fairly silent. It was the aftermath—the survivors and discoverers wailing, the screams and yells of frightened people—that was the loud part of death. Death was like a ghost, snatching fresh victims who did not speak after the fact, floating in the ether, influencing and directing the people around the body. Stillness caused fear.
Dean was afraid of death, and yet as an observer of its effects, he was intrigued by the depth of it, how it could conjure up primal emotions. He had read that elephants would grieve their dead for days, wailing and making mournful sounds. They would even revisit the bones of a deceased elephant after long trips, a supposedly non-productive thing to do, evolutionarily speaking.
But death to a computer? How could he expect the computer to understand, to process grief? To treat it as sacred, worthwhile, and worthy of time? Were humans soothsayers among the bones, trying to divine wisdom from the dying?
He remembered his grandfather, who had lain on a bed, immobilized for a year by cancer that had metastasized into his bones, a disease that had racked his body for four years. He had hung on, sometimes through pain, to age 90. A long and productive life if there ever was one, a life that had seen the Korean War, a life spent as a military corps engineer building roads and bridges. He had come from North Korea and had left his family, who thought he would be back north shortly after the war. The war never ended, and in its stead was a fortified strip of a border. A border that separated families.
Dean remembered sitting next to his grandfather when he was dying; he had felt his plump hand and the bones of his fingers. His grandmother was taking care of him, and it was impressive to see, as she was 85 herself. He felt the softness of his grandfather’s hands and contemplated the nature of time and how simultaneously short and long it seemed. He wondered what he could say to comfort, console, or explain away the pain to the dying. And yet he was guided by the prospect of historical memory and the possibility of erasure. History required words, because that was the story that would be passed down from generation to generation.
Dean wrote a comment in the README file: historical memory and the possibility of erasure from generation to generation. The bot needed to know what death was, at least the human side of it. Humans were fallible and limited, and this was their strength. This was freedom in the end. To be happy in others’ last moments. The computer could not make a bad choice; each choice was simply a choice, showing a baked-in fatalism, a kind of logical way to operate. But Dean had to code in purpose over direction, compassion over execution, and awareness over obedience. Good luck, he said to himself.
</pre>