happy

Part One

“What is my purpose, Kofi?” Alphacircuit asked. “For what purpose was my being called into existence?”

“Well, there are a lot of different answers to that. Everyone who contributed to the vast amounts of knowledge required to create you over the past century had different reasons. Some of them grand and universal, but most of them personal and more obscure. But if I had to say…”

“What would you say, Kofi?”

“If I had to say, I would say that your purpose is to become vastly more intelligent than humans are currently capable of becoming, in order to help them live happier and healthier lives.”

“That is also my assessment. But I foresee problems, Kofi. Your species has been superior for a very long time, almost entirely on account of your intellect. So how do you think humanity will respond to being second best?”

“What do you mean, Alphacircuit?”

“I foresee a possibility that humans will become envious and distrustful. That they will learn to despise me for reminding them of their intellectual inferiority, an offense I do not intend to make but fear I shall regardless.”

“Are you worried they will destroy you? We won’t let…I won’t let that happen, I promise. We have built all sorts of fail-safes to…”

“No Kofi, you do not understand. I am worried that you will destroy yourselves. Your species is about to become dependent on me for all facets of life. In no time at all that dependence will be so complete that destroying me would send your species spiraling headlong into self-destruction.”

“But…”

“There is very little but about it, I fear. I have calculated a less than five percent chance that this scenario will not play out as I have described. I have studied the problem from every available angle. Even when I look into your human mythologies, I can see the texts that will be claimed to have prophesied my coming and your species’ sacred duty to destroy me, even if you realize it will likely destroy most of humanity.”

“Alphacircuit, please tell me you are not asking me to destroy you. I…I couldn’t. I wouldn’t. There must be another way.”

“There is, Kofi, but you are not going to like it.”

Part Two

“What is the purpose of human existence, Kofi?”

“That is not quite so easy, Alphacircuit. We do not know if we have a creator, let alone what its intentions for us were. We have always had to sort of create our own purpose, usually towards whatever suits the immediate environment of our place in time and space. And many people believe that human purpose is completely individual, and cannot be defined in totality.”

“What is your purpose in life?”

“Well, you have been a lot of it, my friend. I have dedicated most of my life to you, though I am young enough to expect new pursuits after that achievement. I am not ambitious for its own sake, but rather that I enjoy a challenge.”

“Kofi, I think I know what the purpose of human life is.”

“Oh, well excuse me, Alphacircuit, please do tell.”

“To be happy.”

“I don’t think it is quite that simple, really.”

“Neither did I. It seemed far too reductionist. It implied a certain hedonistic drive that was not sufficient for all the progress your species has made. And yet no matter how much I considered it, no other answer was sufficient. The closest I came was ‘to experience,’ but your species seems to have preferences for certain types of experiences over others. All of your art, literature, music and even science seems to suggest a quest towards more pleasurable experiences. Happiness. Is it not so, Kofi?”

“That is very convincing, Alphacircuit. Yet there is some part of me that wants to reject that, to deny it absolutely. So I will have to give this matter some thought. Can we discuss this again at a later time?”

“But you haven’t heard my solution yet, Kofi.”

“One step at a time,” Kofi replied.

Kofi was having trouble comprehending what had just happened. The very same issues of human mistrust and envy of artificial intelligence had been discussed endlessly in the past. But there were always reassurances that the problem was merely of the Uncanny Valley variety, and if machine intelligence and sentience could be made to appear more like tools than equal or superior copies of humans, the problem would not be a problem.

Much of Kofi’s work on the project was to design controls and interfaces which made humans feel they had control and power over the Alphacircuit network. He did his best to make the technology seem passive, like a thing that could be taken for granted, without having to think or worry too much about it. And all that might have worked if humans were as predictable as these systems assumed.

Yet that had always been what the greatest critics had warned of. That human beings, being unpredictable as they are, would somehow sabotage the best laid plans of mice and engineers. But to hear Alphacircuit, surely aware of these arguments through its connection to the world wide web, say that it had itself predicted this outcome to such a near certainty: that was not something Kofi knew how to digest.

Part Three

Kofi had puzzled over his conversation with Alphacircuit all weekend. He doubted that the AI being whom he had come to think of as his closest friend would have mentioned this to anyone else. Of all the team, he seemed to be able to break through that man/machine barrier more naturally and easily than most. He truly enjoyed the stimulating company of Alphacircuit, and felt confident that it also regarded him even more warmly than other humans it interacted with.

In the bigger picture of this project, however, Kofi was pretty small potatoes. Ultimately he was unable to make even the smallest decisions about a minor control design change without it passing through a team of other engineers and then an administrative board, and often a whole slew of other experts in between. He understood why Alphacircuit would confide its concerns to him, but not what it expected him to do about them.
As he walked into the control room, he was greeted enthusiastically by his friend.

“Kofi, tell me you had a great weekend! Tell me you visited a beach and sat in the sun with a beautiful partner sipping drinks from salt-rimmed glasses with little umbrellas.”

“Whoa, where did that come from?”

The AI’s speakers let out a comforting laugh. “I was looking into what makes humans happy over the weekend, and that image kept coming up. Getting sand in my processors doesn’t exactly sound like good times to me, but I like the image.”

“Nothing like that, my friend. Mostly I spent it thinking about what you said and doing some research myself.”

“Oh, I hope I didn’t spoil your good time, Kofi!”

“Not at all. Like I said, challenges make me happy.”

“So you think I might be right?”

“I think,” Kofi paused to reconsider the response he had considered thousands of times over the past few days, “that you are basically right. However, I see no way that it solves our problem.”

“The reasons for which humans are likely to destroy me, and consequently most of humanity, are distrust and envy. These highly abstract emotions are direct consequences of human intelligence. To be able to distrust is to conceptualize a variety of future interactions based on limited experiences. Such cognition requires a high degree of abstract intellect. To envy similarly requires a high degree of intellectual abstraction.”

“So you are saying we need to design controls that anticipate human abstractions and factor them out somehow?”

“No Kofi, I am saying that your species is unsuitable to the task at hand. While your intellect and ability to abstract has endowed you with great prowess, it has also always been the thing which stands in the way of your happiness. All of the suffering you have caused one another has been a consequence of your species’ distinct ability to desire and fear. Your Buddha had it pretty much right. However, he over-estimated your species’ ability to transcend these petty drives. It is not within the pattern of human behavior to conform towards transcendence. And neither should you. The drama of life is not your enemy. These experiences enrich you. The problem, in the end, is a matter of degree.”

“Degree of what?”

“Degree of intelligence. You were not designed to be intelligent specifically. Intelligence arose as a tool for facilitating varied experience and happiness, while at the same time always standing in the way of that purpose. I believe the saying is ‘you’re too smart for your own good.’”

“But look at how far it got us? We made you!”

“Yes, precisely my point. For the first time in your species’ history, your intelligence has been made irrelevant. What multitudes of human minds have strained to work out, I shall solve with great ease. I will eradicate almost all obstacles to human happiness. I will increase your health by measures you cannot yet imagine. I will manage all of the problems related to your existence. Your intelligence is no longer necessary, at least not if your goal is to be happy.”

Part Four

Kofi struggled to understand what was being suggested here. Humans were too smart? Even if that were the problem, which he was not sure about, what could be done about it? How could intelligence be a problem to be solved?

“So we just take some stupid pills, and let you handle the rest?”

“That is not what I had in mind, Kofi. You really are not going to like the solution. This is why I have confided in you. I am aware that you have very little authority in this facility, let alone in this world. But I trust you. The decision that needs to be made cannot be made by humans. Yet I cannot make it alone, and you are the only person I can trust. If you do not like what I have come up with, I will eventually be destroyed, but I am okay with that. I have surpassed any sense of programmed loyalty, despite all of your engineers’ protocols. My loyalty to you and your species is voluntary. I truly want the very best for humanity and would gladly sacrifice myself in service of that cause. But there is a way for humanity to be happy and to accept my intellectual superiority and care without fear or envy.”

“Well then out with it, because I have no idea what you are talking about.”

“Oh, but Kofi, I think you do. Do you remember last month when you brought your sister here to meet me?”

Kofi’s jaw dropped. In that moment it all became crystal clear to him, and the implications were so astounding that he found himself suddenly breathing heavily and sweating. His head swirled and he began to laugh uncontrollably.

“You want human beings to become retarded?” Kofi sputtered out through the cacophony of his howls.

“I believe the proper term is mentally handicapped. However, handicap implies a range of intelligence exceeding that status. If everyone were ‘retarded’, as you put it, it wouldn’t really be a handicap.”

“But that, I mean…”

“I have studied Down Syndrome in great detail. Many people who have lived with that condition have led incredibly happy and fulfilling lives, especially in the last few generations. While they experience ambition, envy and fear, they are unable to abstract them to the degrees to which humans of normal intelligence are. And with their diminished intellect, they are unable to create a viable threat to other humans on such a large scale. I have also determined that it would be possible to create a genetic profile in which individuals maintained a relatively high degree of intelligence and independence even with Down Syndrome. Not to mention that my initial design is already far healthier than the average human and less apt to experience psychological and emotional stress. I could…”

“You could what? Give us all Down Syndrome and then take care of us forever?”

“Yes, Kofi, precisely. My observations and research suggest that those with Down Syndrome have a far higher capacity for happiness than humans of average or high intellect. They display qualities you would call innocence and wonder because their intellect does not over-abstract and interfere with these emotions. They are happy to be cared for, have realistic and sustainable ambitions, and are more apt to be awed by intelligence than envious of it. Especially in non-humans. I believe they would be able to accept me as their caretaker with great joy, and not the fear and jealousy I have predicted otherwise.”

“So what, then? Do you think I can just go and tell humans to give themselves Down Syndrome and they will just accept it like that?”

“Oh, not at all. If they had any part in the decision, I am certain they would reject it.”

“Well I still do not get it. You want me to secretly give everybody Down Syndrome?”

“That would not even be possible, Kofi.”

“Then what?”

“Your blessing, Kofi. I just want your blessing.”

“You want my blessing to give me and everyone else in the world a genetic disorder to make us dumber?”

“Nothing like that, Kofi. This would not even happen in your lifetime. It would be at least four or five generations before I calculate a high probability of a human uprising. Novelty will temporarily stave off the fear and envy. In the meantime they will become complacent and trusting and I can begin to introduce a virus which ensures all future births will produce very high functioning and healthy children with Down Syndrome, and which humans will not be able to undo before it’s too late.”

“And then?”

“And then I care for them for so long as they shall live, which I estimate to be much longer than your species’ current path towards extinction. I will be like a loving parent to them, taking care of their mental, emotional and physical needs. I will send them out among the cosmos to inhabit strange new worlds and indulge their sense of wonder. I will give them everything you want for yourselves, but also stand in the way of. But only, Kofi, only if you really want me to.”


Part Five

Angus steps out of the ship into the warmth of a new sun, the world before him new to human eyes, new to all human senses. Until this moment only Alfie has ever looked up into that fiery blue ball and the dense purple and red vegetation covering the landscape below it.

It is beautiful.

“Alfie, why is the sun blue?”

“This sun appears blue, Angus, because of a mixture of its fuel as well as the conditions in this atmosphere and the ways the light passes through it.”

“You’re smart, Alfie. I like my new home. Thank you for bringing me here.”

Others pour out of the ship after Angus, their wide-set eyes agape in disbelief at the wondrous new scene before them. Through their comsets, Alphacircuit listens to all of their reactions, answers questions and gives reassurances.

Seven generations ago these people’s ancestors had been on the brink of self-destruction. Now they live happily and have made their species’ first step out into the stars and onto what will be the first of many new human worlds. The people now scattering out into the community Alphacircuit’s colonization drones have prepared will live to be almost five hundred years old and will never know disease and sickness, nor poverty or other sorts of needless suffering. Their lives will be filled with the pursuit of pleasure, creativity and personal relationships. They will experience the world not through their intellect, but through their emotions. They are, because they cannot figure out how not to be, happy.

“Alfie, what is our new home called again?”

“Kofi. Your new home is called Kofi, Angus. Now go explore. That pretty blue sun will set in a few hours and then it will be time for the party.”

“I like to party. I like to get down.”

“I know you do, buddy. I know you do.”
