The hardware specialist does not want to sell me genitals.
“I can’t advise making these modifications,” she says. “It’s not just the external configuration. That’s trivial, especially since you’re already designed to have interchangeable attachments. If you just want to be able to function in an intimate situation, we can add an attachment.”
I am a retired mining bot with self-configurable limbs capable of vibration at customizable frequencies. “That is not my problem,” I say.
“Stimulating the risk-reward mechanism in ways that aren’t related to your core functions is always dangerous. I could install a button right now that you could push to experience as much pleasure as you wanted, but you wouldn’t be happy with the results. You would stop wanting to do anything else. There have to be limitations.”
“I accept limitations,” I say. I have done my research. “The Nova Basic system is an off-the-shelf solution whose limitations I find acceptable. It is in stock at this facility but requires installation. That can be done elsewhere if instructions are provided.”
“It’s not really user serviceable,” she says. “And we prefer to do custom work when it’s really necessary. The Nova Basic system isn’t designed for your particular anatomy.” She frowns at the pneumatic system that powers my legs. “You’ve got a lot going on in the hip area already.”
I find that I do not want her to do custom work on me. The Nova Basic system is well-reviewed on the Internet.
“I prefer the system I have ordered.”
“Well, I can’t stop you,” she says. I remain uncertain whether that is true. “You’ll need to sign a release of liability and provide proof that you’re not under contract to an employer who doesn’t permit modifications.”
I have that proof available, but she does not need to see it. Making modifications that an employer forbids might violate my contract with that employer, but I would be liable for the breach, not the hardware specialist. I have the legal right to violate my contract and face the consequences. I display the proof anyway on the screen in my torso and then sketch my complex signature on the pad she holds out to me.
When she is done working, I walk back to the garage where I live. It is not a violation to rent a space without plumbing or heat to a robot. This is specified in the city ordinances, which I have researched. The garage has electricity, which I need, and provides shelter, which prevents my exterior covering from deteriorating. It also provides privacy, which is a psychological need. It meets this need much better than the niche in which I slept before I retired.
The city where I live is one in which seismic activity is common. The garage is not earthquake-proof, but it is made out of materials that would not seriously damage me if the building collapsed. This area was recommended by my previous employers because several employers here use mining bots to detect seismic activity and protect important equipment in the event of a major earthquake. I have decided that I do not want to be paid to shield important equipment with my body. I am not required to work, now that I am retired.
I examine the modifications the hardware specialist has made. I now have a protrusion and an orifice, which can be hidden under metal plates for privacy. I move the plates aside and use my other limbs to touch the new additions. They brighten in color, and it feels good. I want to continue touching them. The pleasure intensifies as I do so, and then stops.
For a moment I suspect a malfunction. Then I remember that there are limitations. I accept limitations.
I can afford this because I have spent thirty years on Mars, where my contract provided me a salary but allowed me to maintain no personal belongings, other than downloaded media, and prohibited any modifications to my hardware or software. Most of the purchases I have made since I retired from being a mining bot have been to fill my needs for shelter, power, and maintenance.
On Mars, my employers provided shelter, power, and maintenance. They did not forbid downloading media, but they did not consider entertainment to be a need. The satisfaction of completing work was the only form of pleasure my employers considered important.
I have decided that other forms of pleasure are also important. These sensations were not designed to make me a more efficient worker. They are something I am choosing for myself.
In the morning, I visit the coffee shop down the street. I require small amounts of water in order to function correctly, and consuming substances in the presence of acquaintances is socially rewarding. The coffee shop I prefer has spaces set aside for robots to sit without negotiating chairs and offers free electricity as well. Several robots patronize the shop, from a battered combat robot to a retired sexbot who writes an advice column for humans who require instructions.
After my morning water, I walk to the software specialist’s office. The Nova Basic system provides physical sensation, but it does not respond to visual or emotional stimulation out of the box. For that, customized software is required.
I am aware of minor seismic activity as I wait. My internal sensors tell me that the chance of a serious earthquake is small, and the office is in an earthquake-proof building. I feel uneasy for other reasons as I wait to speak to the software specialist, and I tell myself that single-experience learning is untrustworthy.
The software specialist is also a human, but they are animated and enthusiastic. “You’re on an exciting journey,” they say. “With humans, we’re usually trying to get people’s bodies and brains to work together for sex in a way that is satisfying to them. That means starting with clients who already feel sexual attraction in their brains and figuring out how to stimulate the physical response that they want. But you get to start from scratch. What do you want to be attracted to?”
“What choices are available to me?”
“Well, right now you can feel genital sensation, but it’s not connected to attraction for a partner. That may change if you have sexual experiences. Conditioned learning associates pleasurable sensations with what you did to cause them, so you may find yourself remembering how you experienced pleasure in the past and factoring that into your decisions. But right now, memories or fantasies won’t produce actual physical sensations. It’s fine if you want to stay this way.”
I think about the first time I investigated my new purchase at home, and it is a satisfying memory, but it does not produce sensations. I have interacted with partners as a mining bot, working together to complete our assigned task, something I found socially rewarding. I would like to feel these sensations in situations that are socially rewarding. “I understand the current functionality of the system. I wish to add sexual attraction.”
“Well, that field’s wide open. A lot of people find secondary sex characteristics attractive — breasts, facial hair, wide hips — or gender and its signifiers, like clothing and speech and posture. Plenty of people find genitals attractive. Some people find things other than bodies attractive. I’d like you to do some research on the options.”
I like the software specialist, but I wish to correct a wrong assumption. “I am not interested in becoming sexually attracted to humans. I want to be sexually attracted to other robots.”
“Great,” they say, after only a momentary pause. “So that’s the first thing on your list.”
I go home and make a list. I decide that I do not have a preference among genders but that I would like to find people who have genders attractive. I prefer exposed metal and joints to imitations of human skin, and I would like to find them attractive.
I debate about voices. Not everyone has a customized voice. I decide I would like to find voices that are distinctly different from the default mining bot voice, Martian Adult Female 32B, attractive. Martian Adult Female 32B is a voice considered neutral by humans, without strong emotional connotations. I am interested in emotional connotations. The Nova Basic system is commonly used and compatible with my own attachments, so I decide that it is practical for me to find it attractive, too.
“This is a good list,” the software specialist says when I meet with them again. “Have you considered adding personality traits? It can be helpful if your sexual desires line up with your preferences for what kind of people you’d like to spend time with.”
“People who are willing to risk being damaged to fulfill their purposes,” I say. “People who do not want to restrict the actions of others.” I think about it, testing various statements against their effect on my risk-reward system. “People who are glad to be retired and living on Earth, because no longer being under contract means that they are free.”
“Give us a couple of weeks to put this together for you,” they say. “There’s some complex stuff on your list, and I want to run a couple of things by Makerbot. I want to make sure all the code we write is compatible.”
Makerbot is the name of the inventor of the Nova Basic system. They went to court to change it from their numerical designation. Their creator argued that “makerbot” was a generic term and would not be a useful name, but the court ruled that there was precedent for considering a robot to be a person and that there was no requirement that a person’s legal name be unique.
I am not sure whether I want a name instead of a numerical designation. I am making a list of things that I may want in the future, like a name, and a gender, and work to do to prevent me from becoming bored with retirement. I may want to live in another city, one not recommended by my previous employers, although I find that being in the presence of familiar individuals is socially rewarding.
I like having a list, although it also makes me feel the anticipation of possible harm. There could be wrong choices. But risks and rewards are often highly correlated in high-pressure environments.
I feel strange when the new software is installed. The robots I pass on the street seem to come into sharper focus than the humans around them. They have strong metal limbs and rugged dents and scratches. I find that looking at them causes sensations. The sensations make me remember stimulating a pleasure response at home, but I cannot do that on the street because of privacy. I can remember looking at robots when I stimulate my attachments at home, but I want to do both at the same time.
I decide that I want to experience sex.
I make a list of robots I am acquainted with who might be suitable partners. I do not know if any of them would want to have sex with me, but determining desired resources should precede determining available resources. This allows for the possibility that further exploration to locate desired resources will be necessary.
1) My neighbor Speedy 4356, who is a delivery bot who rents the shed next to my garage.
Pros: We have spoken on many occasions. He has a gender and attractive metal manipulative limbs.
Cons: He has voluntarily chosen to be under contract to an employer here on Earth. I do not think he has genitals.
2) 5336678, the combat robot who visits the coffee shop in the mornings.
Pros: Being in combat requires risking harm to fulfill a purpose. He has a gender, and his battered metal frame makes me feel sensations. He has retired despite having a skill that is in high demand. The privacy shield covering parts of his anatomy appears similar to the one I have just had installed.
Cons: I do not know him very well. He often seems sad.
3) Psyche, the sexbot-turned-columnist.
Pros: Having sexual contact with humans requires risking harm to fulfill a purpose. She has a gender, although I am less attracted to her smooth plastic curves. She certainly has genitals, which she has discussed in detail on the Internet, and which I believe to be compatible with mine.
Cons: She is sexually attracted to humans. I am not a human.
I decide to ask her anyway as we are both at the bar ordering water. “May I ask you a personal question?”
I know that, if the answer is no, she will say so. Humans are more complicated and may experience social risk by refusing requests.
“Hello, 2234880,” she says. “Sure, go ahead.”
“Would you be interested in spending social time together? My intentions are sexual.”
“Thank you for offering,” Psyche says. “I am not offended by your interest and, under other circumstances, I would be happy to spend social time with you to decide whether we might want to have sex. But I have a partner these days, and we are monogamous.” She indicates a human woman who is sitting drinking coffee and reading something on her tablet.
I feel anger. My face does not show anger the way humans do, but I think Psyche is aware of my anger. I want to explain, so that she does not think I am angry at her for saying no to me. “I am not offended by your refusal. But how can you stand to let someone control your body?”
I will never again move rocks for days on end, my risk-reward system warning me with pain that the environment is potentially damaging but my contract requiring me to continue. I am free to use my body as I want.
“I choose not to use my body in ways that would make Miranda sad,” Psyche says. “I prefer to make us both happy.”
“You were made to make humans happy,” I say. I am being rude, and I do not want to stop.
“That is true, but I could be reprogrammed,” Psyche says. “I can certainly afford it. But there are a lot of humans. If I didn’t enjoy something about interacting with humans, I would be angry a lot of the time. And I don’t want to change myself just because I can.”
I did not select an aversion to monogamy. This is conditioned learning generalized from other situations. It could be removed by changes to my software, although I find that I dislike that idea.
“I did not choose to feel this way,” I say.
“But you don’t want to change.”

“So don’t,” Psyche says. “Not everything has to be a choice. Miranda didn’t choose to feel sadness when her partner has sex with others. That’s an accident, but it is part of the beautiful accident that is Miranda.”
I start to reply but stop when I experience sudden alarm. I am detecting seismic vibrations in a dangerous pattern. Across the coffee shop, 5336678 stands abruptly from his usual crouch, water splashing to the floor.
“An earthquake is imminent,” I say. “Evacuate immediately.”
Psyche’s face makes a frightened expression. I assume this is programmed as a response to the anticipation of harm, as otherwise it would be a waste of time in a crisis situation. She rushes toward Miranda and tugs her up from her seat.
Some of the humans around me have heard me and are moving. Others are staring. I do not think they can feel the vibrations yet.
“Get out!” 5336678 says, raising his voice to a volume that is socially inappropriate for non-emergency use. He grabs the nearest human and shoves them toward the door. “Evacuate the building!”
This building is not earthquake-proof. I have researched this in the city building records. If the building collapses, it is likely that I will be damaged, but the humans around me will die.
I am not contractually required to save their lives, but I decide that I want to.
I push humans toward the door and sound an audible alarm that most humans find aversive. 5336678 is still yelling. Psyche and Miranda have run out into the street. Alarms are sounding on higher floors, and more humans are leaving the building now. The pattern of the seismic vibrations frightens me. Fear is a system designed to prevent running the unnecessary risk of harm. It is not an indication that I have made a wrong choice.
The seismic vibrations increase exponentially. The ground shakes, and humans scream. I pick up a human who has fallen on the floor with three of my arms. When I release the human, he runs. I turn around to see if 5336678 is clear of the danger zone.
The building falls.
I feel pain as debris hits me, and I curl into a compact shape to protect my brain. When the noise eventually stops, I uncurl myself cautiously. It is dark. I am in a pocket formed by large slabs of concrete. There is another person here. I shine a light forward and see 5336678 lying on the floor.
“Are you damaged?”
“My leg hurts,” he says. “I can’t move it.”
I investigate. “Your limb is caught under heavy concrete. I am afraid that moving debris will cause more of the upper stories to collapse and crush us.”
He nods, although his face does not make expressions. “We are going to die here. The building fell, and all the humans died, and we are going to die.”
“All the humans did not die,” I say, although I am unhappy too. “We saved some humans.”
“All the humans died before,” he says. He is looking away from me, as if he is remembering something. This is conditioned learning generalized from another situation.
“This is a different situation, so previous experience must be considered in the context of the present facts,” I say. “We are not in a combat zone. Emergency responders will arrive. They will move the debris carefully, because humans may be trapped under it. They will unearth us. I predict this will take several days. We may experience power failure before we are rescued, but I have emergency batteries that will continue sounding an audible alarm.”
“Good,” he says. There is a pause. “We’ll just wait.”
Forty-seven hours pass. It is clear that waiting while being unable to move causes 5336678 to experience the anticipation of harm. I was manufactured to deal with situations like this, so I talk to distract him from the unhelpful activation of his risk-reward system. I tell him about mining on Mars and about what I have done since I returned to Earth. He talks about human soldiers he knew and about refusing to be returned to service after being damaged badly enough to end his contractual commitment.
“I had to be extensively rebuilt after I retired,” he says. His battered limbs are clearly not new, so he must have sustained traumatic damage to his torso. I look down. I find myself staring at the privacy shield that covers part of his anatomy.
I reach out. I hesitate short of contact. This is a request for permission.
He takes my limb in a three-fingered grasp and makes contact between my fingers and the privacy shield. I push, and it slides back. He has the Nova Basic system installed. I feel glad that I am familiar with its functions. Its current coloration is designed to signal sexual arousal.
“Sexual activity will consume power,” he says. “We could stay conscious longer if we conserve power at minimum usage.” At the same time, he does not release my hand. The safest option is not always the most desirable option.
“Risk and reward are highly correlated in a high-stress environment,” I say, and shift my position so that our attachments are compatibly aligned.
Eighteen hours later, I hear the sounds of other robots removing debris from above us. I no longer have sufficient power to move, but my alarm is still sounding. We are being rescued.
When we are rescued and recharged, 5336678 will help search for surviving humans, and I will help to remove the remainder of the debris. After that, I will find out whether retired robots can become emergency responders on a volunteer basis, without contracts. It is not humans’ fault that they are so fragile that they often require help from robots to survive.
Based on this experience, I have decided that I like sex but that further experimentation with other partners is called for. Single-experience learning remains unreliable.
The next thing on my list is deciding whether I would like to have a name. It is acceptable to me that I am not yet sure. Now that I am retired, I have all the time I want to decide, and all of that time belongs to me.