3 — "Including small children."

Strike. — Ed.

STET — Anna

8 — "See On Machinist Identity Policy Ethics, Arnulfsson Foote, 2041. An analysis of how artificial intelligence decides who has an identity and who doesn’t. Who has consciousness and who doesn’t."

We should only include peer-reviewed references per new APA guidelines. — Ed.

This was peer-reviewed prior to publication. It was peer-reviewed and then published with more than enough time for the producers of the self-driving Toyota Sylph to be aware of its content and conclusions, and for their programmers to adjust the AI’s directives accordingly. Enough time to develop in-code directives to preserve human life. STET — Anna

11 — "The decision matrix programming is described in Driven: A Memoir (Musk, 2029) as follows: “One human vs five humans, one old human vs one young human, one white human vs one brown human.” Nowhere in the programming is there “One three-year-old girl vs one endangered Carter’s Woodpecker.”"

Citation? — Ed.

I read the weighted decision matrix they used to seed the Sylph AI. I learned to read it. Do you know how long it took me to learn to read it? Nine and a half months, which is some kind of joke I don’t get. The exact duration of bereavement leave, which is another kind of joke that I don’t think is very funny at all, Nanette in HR. I learned to read the weighted decision matrix and then I filed a Freedom of Information Act request and got my hands on the documentation, and I read it and there’s nothing in there about a kingsnake, or a brown bear, or a bald eagle, or a fucking woodpecker. STET — Anna

14 — "It’s only a distinct species because of the white band on its tail."

Is this relevant? — Ed.

Yes. Other than that white band, it’s exactly like any other woodpecker, but because of that white fucking band it has four Wildlife Preservations Acts. Four, which is four more than the number of acts dedicated to regulating weighted risk matrices in autonomous vehicles.

This passage seems to wander a bit far afield. Perhaps you could tighten it to reflect the brief? — Ed.

STET. — Anna

STET

Edited by Julia Rios

Copyedited by Chelle Parker  | Selected by Pablo Defendini

Content Note:

This story contains references to the death of a child.

Section 5.4 — Autonomous Conscience and Automotive Casualty

While Sheenan’s Theory of Autonomous Conscience[1] was readily adopted by both scholars and engineers in the early days[2] of Artificial Intelligence programming in passenger and commercial vehicles[3], contemporary analysis[4] reinterprets Sheenan’s perspective to reveal a nuanced understanding[5] of sentience[6] and consciousness[7]. Meanwhile, Foote’s On Machinist Identity Policy Ethics[8] produces an analysis of data[9] pertaining to autonomous vehicular manslaughter[10] and AI assessments of the value of various life forms[11] based on programmer input only in the tertiary. Per Foote’s assessment of over eighteen years of collected data, autonomous vehicle identity analyses[12] are based primarily on a collected cultural understanding of identity[13] and secondarily on information gathered from scientific databases[14], to which the AI form unforeseeable connections during the training process[15]. For the full table of Foote’s data, see Appendix D[16].


1 — See A Unified Theory of Autonomous Conscience and Vehicular Awareness of Humanity as Compiled from Observations of Artificial Intelligence Behavior in Decision Matrices, Magda Sheenan et al., 2023.

2 — 2015–2032, after the development of fully recognizable artificial intelligence for purposes of transportation vehicles, but prior to the legal recognition of and infrastructural accommodations for fully autonomous vehicles. For additional timeline references, see Appendix N, ‘A Timeline of Autonomous Intelligence Development and Implementation’.

3 — Wherein ‘commercial vehicles’ are defined as vehicles transporting commercial or consumer or agricultural goods, and ‘passenger vehicles’ are defined as vehicles that individuals or families use to transport humans, including children. Including small children.

4 — See “Why Autonomous Cars Have No Conscience,” Royena McElvoy, Buzzfeed Quarterly Review, Spring 2042 edition.

5 — See “Autonomous Vehicular Sociopathy,” Kamala Singh, American Psychological Association Journal of Threat Assessment and Management, Spring 2042 edition.

6 — See “Local Child Killed by Self-Driving Car,” Tranh O’Connor, Boston Globe, May 14, 2042 edition.

7 — Consciousness, here, used to denote awareness of self. Most children develop observable self-awareness by the age of 18 months.

8 — See On Machinist Identity Policy Ethics, Arnulfsson Foote, 2041. An analysis of how artificial intelligence decides who has an identity and who doesn’t. Who has consciousness and who doesn’t.

9 — The data analyzed in On Machinist Identity Policy Ethics was collected from coroners and medical examiners worldwide. With over three million incidences to work from, Foote’s conclusion re: the inability of AI to correctly assess the relative value of the life of a human is concrete and damning. Over three million incidences, and Ursula wasn’t even one of them yet.

10 — Read: ‘Murder’. It was murder, the car had a choice, you can’t choose to kill someone and call it manslaughter.

11 — The decision matrix programming is described in Driven: A Memoir (Musk, 2029) as follows: “One human vs five humans, one old human vs one young human, one white human vs one brown human.” Nowhere in the programming is there “One three-year-old girl vs one endangered Carter’s Woodpecker.”

12 — Read: ‘how they decide who to murder,’ when the decision to swerve in any direction will cause a death and they decide that one death is better than another.

13 — Per Foote, the neural network training for cultural understanding of identity is collected via social media, keystroke analysis, and pupillary response to images. They’re watching to see what’s important to you. You are responsible.

14 — Like the World Wildlife Fund’s endangered species list, and the American Department of the Interior’s list of Wildlife Preservation Acts, four of which were dedicated to the preservation of Carter’s Woodpecker. It’s only a distinct species because of the white band on its tail. Other databases they have access to: the birth and death certificates of every child born and recorded. Probably kindergarten class rosters, and attendance rates, and iCalendars, too. It’s all data. All of these are data, so don’t tell me they don’t know.

15 — They’re smart enough to read your email and measure your pupils and listen to your phone calls, they have access to all of the data on who we are and what we love. They’re smart enough to understand how much a mother loves her baby girl. They’re smart enough to understand the emotional impact of killing a woodpecker. They’re smart enough to know what they did and they’re smart enough to keep doing it, right? Do you think it’s going to end with Ursula? Just because she was on the news, do you think it’s going to stop? You’re not stupid, if you’re reading this. You’re smart enough to need to spend hundreds of dollars on a textbook that’s drier than a Toyota executive’s apology. You want to do this shit for a living, probably. You don’t care about Ursula or me or telescopes or any of it, and you don’t care about a woodpecker, you just want to see what you can make go and how fast you can do it. She just wanted to look at the fucking sky. Can a woodpecker look at the sky and wonder what’s past the clouds? That’s what you need a textbook about, you idiot, that’s what you need to be learning about. None of the rest of it matters. None of it matters at all if you don’t know that Carter’s Woodpecker doesn’t matter. It doesn’t matter. It never mattered.

16 — Foote on Autonomous Vehicular Casualties, Human and Animal, 2024–2042.


Anna, I’m concerned about subjectivity intruding into some of the analysis in this section of the text. I think the body text is fine, but I have concerns about the references. Are you alright? Maybe it’s a bit premature for you to be back at work. Should we schedule a call soon? — Ed.

STET — Anna

3 — "Including small children."

Strike. — Ed.

STET — Anna

6 — "See “Local Child Killed by Self-Driving Car,” Tranh O’Connor, Boston Globe, May 14 2042 edition."

Is this a relevant reference? It seems out of place in this passage. — Ed.

STET — Anna

8 — "See On Machinist Identity Policy Ethics, Arnulfsson Foote, 2041. An analysis of how artificial intelligence decides who has an identity and who doesn’t. Who has consciousness and who doesn’t."

We should only include peer-reviewed references per new APA guidelines. — Ed.

This was peer-reviewed prior to publication. It was peer-reviewed and then published with more than enough time for the producers of the self-driving Toyota Sylph to be aware of its content and conclusions, and for their programmers to adjust the AI’s directives accordingly. Enough time to develop in-code directives to preserve human life. STET — Anna

9 — "Over three million incidences, and Ursula wasn’t even one of them yet."

Strike. — Ed.

Why? Is it hard for you to read her name? STET — Anna

10 — "Read: ‘Murder’. It was murder, the car had a choice, you can’t choose to kill someone and call it manslaughter."

Anna. — Ed.

STET — Anna

11 — "The decision matrix programming is described in Driven: A Memoir (Musk, 2029) as follows: “One human vs five humans, one old human vs one young human, one white human vs one brown human.” Nowhere in the programming is there “One three-year-old girl vs one endangered Carter’s Woodpecker.”"

Citation? — Ed.

I read the weighted decision matrix they used to seed the Sylph AI. I learned to read it. Do you know how long it took me to learn to read it? Nine and a half months, which is some kind of joke I don’t get. The exact duration of bereavement leave, which is another kind of joke that I don’t think is very funny at all, Nanette in HR. I learned to read the weighted decision matrix and then I filed a Freedom of Information Act request and got my hands on the documentation, and I read it and there’s nothing in there about a kingsnake, or a brown bear, or a bald eagle, or a fucking woodpecker. STET — Anna

12 — "Read: ‘how they decide who to murder,’ when the decision to swerve in any direction will cause a death and they decide that one death is better than another."

Can we include this as a vocabulary note in the glossary? — Ed.

It’s relevant specifically to this passage. They decide who gets to live. They decide who gets to wake up tomorrow and put on a new dress and go to her friend’s birthday party. Her best friend, whose mother didn’t even attend her funeral. Don’t think I didn’t notice. Even if you don’t care, the people learning about programming these things need to understand. That’s what they decide. STET — Anna

13 — "You are responsible."

Strike — Ed.

How long did you stare at a picture of an endangered woodpecker vs how long did you stare at a picture of a little girl who wanted a telescope for her birthday? She was clumsy enough to fall into the street because she was looking up at the sky instead of watching for a car with the ability to decide the value of her life. Was that enough to make you stare at her picture when it was on the news? How long did you look at the woodpecker? Ten seconds? Twelve? How long? STET — Anna

14 — "It’s only a distinct species because of the white band on its tail."

Is this relevant? — Ed.

Yes. Other than that white band, it’s exactly like any other woodpecker, but because of that white fucking band it has four Wildlife Preservation Acts. Four, which is four more than the number of acts dedicated to regulating weighted risk matrices in autonomous vehicles. — Anna

This passage seems to wander a bit far afield. Perhaps you could tighten it to reflect the brief? — Ed.

STET. — Anna

15 — "It never mattered."

See my initial note. I want to discuss this more on a phone call with you, or have you come into the office? Just to talk about this last passage, and how you’re doing. Or if you don’t want to do that, Brian and I would love to have you over for dinner. Nathan misses his playdates with Ursula, but he’s also been asking why you don’t come over to visit anymore. He misses you. We all miss you.
We haven’t seen you in months, Anna. Everyone here cares about you. Please let us help? — Ed.

STET. — Anna

© 2018 Sarah Gailey