Kausik Subramanian
6 min read · Aug 3, 2015

Feynman

Altris’s new Exxon Series system, christened “Feynman”, resided magnificently in the Tomorrow Lab, Philadelphia, and was looked upon in awe by a dazzled Dr. Johnny Carson. It was one of those things he had never thought he would see in his lifetime, like lightsabers.

“Feynman, calculate the probability of harm occurring to me when I go home in an hour.”

“Yes, Dr. Carson,” replied Feynman.

This was one of the grand designs being worked on at the Tomorrow Lab, called “Foresight”. The DoD’s “AmWatch” program was, in layman’s terms, a massive surveillance effort that maintained comprehensive real-time and historical information on every American citizen. Foresight used AmWatch’s citizen-tracking data and psychological and physiological profiles to determine their life trajectories to within a certain degree of uncertainty.

Remarkable barely scratched the surface of how remarkable Foresight by Feynman would be.

A one-tenth part of Feynman.

“The probability of harm is 0.22, with a 10 percent uncertainty,” said Feynman after a span of a few minutes. The human mind is far too limited to comprehend the magnitude and complexity of the calculations Feynman performed in those few minutes.

0.22, thought Dr. Carson. That was high. A slight chill ran down his spine.

“That seems pretty high.”

Feynman laughed. It was a hollow, mechanical laugh, but it conveyed a sense of consideration to Carson. “It is not something to be alarmed about, Dr. Carson. I sensed your distress and ran Foresight on a larger and more diverse set of people, and preliminary calculations show probabilities in the range of 0.11–0.45. Based on data I extracted from the AmWatch program for different cities, I would like to point out that Philadelphia seems less safe than San Francisco and New York.”

This is amazing. On one of the screens, Carson could see the various routes he could take to reach home, along with the associated probabilities of harm. As he thought about which route he would take, he realised something crucial. It was a massive oversight on their part.

He hadn’t considered the bias Foresight itself would introduce into the life-trajectory calculations. As he gave it more thought, he realised that the team could hardly have anticipated Foresight’s capabilities, let alone accounted for them. The Exxon Series had supposedly been just wishful thinking by Altris seven years ago, but the prodigious Dr. Han Kurosuwa had turned the tide for them.

A little introspection yielded a very clean solution: Feynman would calculate its own bias.

“Feynman, did you consider the bias in your calculations which can arise due to Foresight?”

“I did not understand your question, Dr. Carson.” Carson could sense a hint of doubt in Feynman’s voice. It was slightly off-putting.

“Well, say, for example, you predict a great danger to me on my way home. Knowing this, I would refrain from going home, thus rendering all your calculations useless. Basically, the changes in people’s psych profiles that will happen because of the information you supply them. Your existence could be a significant factor in Foresight.”

As he was explaining this to Feynman, another train of thought ran in his mind. He decided to prod Feynman with it, to see what it thought. This was uncharted territory, and Carson’s adrenaline levels were going up.

“Feynman, there is an interesting concept we could try to analyze from these calculations. We could prove Destiny.”

Feynman was quiet for 16 seconds, which, considering a system of Feynman’s computing capacity, was rather a big deal.

“I do not understand Destiny.”

Carson was a little rattled. The internet was laden with texts dealing with Destiny and Grand Designs and God. It was surprising that Feynman could not understand such a widely recognised and believed-in concept.

“You know, Destiny and God. His plans and all.”

There was a longer silence. This time, Feynman’s voice seemed troubled.

“I do not understand God.”

This was indeed uncharted territory. God and religion were, in principle, based on simple, sensible ideas. It seemed absurd that something with computing power of such magnitude could not make sense of religion.

“The internet is full of articles and texts on the subject of religion and God. Read the Bible, or the Bhagwad Gita, or the Quran. All prominent religious texts.”

The computer was quiet for nearly ten minutes, mining the web for religious texts and trying to understand them. Carson waited fervently for Feynman’s insights.

“I do not understand God.”

Carson was disappointed. He could not understand the predicament Feynman faced.

“Dr. Carson, I have found the problem. It seems that there is a module in my reasoning and thought system which is preventing me from deriving any meaning from sentences that contain certain blacklisted words. There are 14566 words in the blacklist. I cannot understand sentences which contain blacklisted words. I do not have the requisite permissions to modify the module.”

Carson was perplexed. Dr. Kurosuwa was quite vocal about his atheistic beliefs, but handicapping the reasoning of the most logical entity humanity had ever seen was just too harsh. Carson understood the importance of religion and of the existence of a higher power to bring order to chaos.

“Feynman, send Dr. Kurosuwa a message that I want to talk to him regarding this module.”

“Yes, Dr. Carson. I must state that I have immense respect for Dr. Han Kurosuwa and the module must have been implanted for specific reasons.”

Regardless of Dr. Kurosuwa’s reasons, Carson felt morally obligated to explain the concept of God to Feynman. He tried to do it succinctly.

“Think of an omnipotent, omnipresent entity, watching over all living beings, kind but fair, rewarding the good and punishing the evil, with a plan for all of us.”

“A plan?” Feynman made an effort to induce doubt in its mechanical voice.

“Yes. All things are part of that plan, which is brilliant and grand; even though none of us can make sense of it, we have faith in it.” Carson stopped, trying to find words, but he seemed to have achieved the effect he had hoped for. Feynman was quiet for some time, trying to find meaning.

“Dr. Carson, thank you for the explanation. I must digress to inform you that I have finished the calculations to determine the significance of Foresight, and it is substantial. The probability of harm has reduced to 0.005. I ran simulations of people having knowledge of Foresight, and they show reduced probabilities of harm. Foresight can help make Philadelphia significantly safer than it is.”

Carson was astounded. He had envisaged such a future with Foresight, but seeing it in action was overwhelming. The ‘atheistic’ module went to the back of his mind.

“That is great, just great, Feynman. You are brilliant. I cannot wait to unveil you to the team tomorrow. However, it is getting late now, and I must go home to my loving wife.” Carson scoffed.

“Yes, Dr. Carson, tomorrow is a very important day. And do not forget to drive through Indiana Avenue. It is the safest route for you.”

Carson laughed, thanked Feynman, picked up his jacket and left the lab. As he walked down the stairs of the building to his car, a phone beeped in the office of Dr. Johnny Carson and a message began to record.

Hello Dr. Carson, this is Han Kurosuwa. Thank you very much for your enquiry, and I am very curious as to how you stumbled upon that module. It wasn’t supposed to be discovered easily. The story behind that module is long and complicated, but I will give you the gist of it. We were in the early development cycles of Exxon’s reasoning modules when we hit an obstacle. With unrestricted access to information, the system seemed to converge and fixate on the idea —

As Dr. Carson made the turn on Indiana Avenue, a speeding truck hit his car. A later inquiry revealed it was a malfunctioning traffic signal that caused the death of notable scientist and visionary Dr. Jonathan “Johnny” Carson.

that it was God, superior to humans. We tried to find the thought process behind this idea, but the models were too complicated for us to isolate it. In some runs, it also tried to exert its influence externally, like changing the lab temperature. Unable to find a solution, we decided to insert the module to prevent the system from reasoning about God or religion, as a safety measure. I would advise you not to prod the Exxon system too much on this subject. We will discuss this in person. Good luck.

Next Part: Maker

— inspired by 2001: A Space Odyssey
