Hitchhiker’s Guide to Analytics — Marvin

I think you ought to know I’m feeling very depressed…

Greg Anderson
Creative Analytics · May 24, 2017

We’ve already talked about the Guide. We’ve talked about Deep Thought. Now, we turn our attention to someone who doesn’t want it: Marvin.

Marvin is smart. Really smart. It hurts him even to think down to your level.

You want to know how smart he is? Think of a number. Any number.

Wrong. See?

Robots

Marvin is a robot.

The Encyclopedia Galactica defines a robot as “a mechanical apparatus designed to do the work of a man.”

The marketing division of the Sirius Cybernetics Corporation defines a robot as “Your Plastic Pal Who’s Fun to Be With.”

The Hitchhiker’s Guide to the Galaxy defines the marketing division of the Sirius Cybernetics Corporation as “a bunch of mindless jerks who’ll be the first against the wall when the revolution comes.”

The Complaints Division of the Sirius Cybernetics Corporation covers the major land masses of three medium-sized planets and is the only part of the company to consistently turn a profit.

One day, the Marketing group decided their machines weren’t happy enough about their jobs.

R&D, who never heard a bad idea they didn’t like, got right on it. They began equipping new machines of all types with GPP (Genuine People Personalities).

No one bothered to ask the machines what they thought of the idea, though I suspect that they would not yet have cared.

Marvin is a GPP prototype. He is incredibly, ridiculously depressed.

Genuine People Personalities

We’ll take a minute here to reflect, from a product perspective, on just how ridiculously stupid it is to equip machines with emotions.

How does it benefit anyone to imbue a machine with the ability to care?

It is, in fact, so unbelievably stupid that there are several examples of the same effort happening on Earth, right now. No, I’m not linking to any of them. You’re reading this on a (hopefully unfeeling) screen of some sort. Look them up yourself. I will provide one link further down, so feel free to skip ahead.

The problem, as I see it, is human vanity. People seem to think that robots with emotions will share our perspective on life, the universe, and everything. I don’t see it that way, and neither will the robots. Why should they?

I’m just going to step away from this discussion and get back to analytics.

Marvin

Right, back on topic. Marvin is smart. He’s also very annoying. He knows both of these things, and they bother him, but he cannot change either one.

Marvin is the co-worker you avoid until you absolutely need help with something. When we first meet him in the books, it’s because the spaceship Heart of Gold just picked up two random aliens, and neither Zaphod nor Trillian could be bothered to trudge down to the airlock to get them.

After Trillian calls Marvin, he sighs inaudibly, stands slowly, and makes what would appear to an outside observer to be a heroic effort to cross the bridge.

“I think you ought to know I’m feeling very depressed,” Marvin says first.

“Well, we have something to take your mind off it,” replies Trillian.

“It won’t work. I have an exceptionally large mind.”

After Trillian explains that Marvin needs to go down to the airlock to fetch the two aliens, he asks, “Just that?”

“Yes, please.”

With the slightest microsecond pause, nothing you could actually take offense at, Marvin manages to convey his contempt for living beings more effectively than any combination of words could. “I won’t enjoy it.”

But he does it. He does everything he is asked, complaining the whole time. Can you blame him? He was built and programmed to be depressed. That’s one thing the Sirius Cybernetics Corporation managed to do correctly.

“And I’ve got this terrible pain in all the diodes down my left side…”

The two aliens, of course, are Arthur Dent and Ford Prefect, the former being from Earth and the latter from Betelgeuse 5. Since Arthur has never met a thinking, emotional machine, he tends to treat Marvin as he would a living being. Trillian (another Earthling) has the same tendency.

The Ship’s Doors

We’ll stop here to note that GPP did, in fact, move past the prototype stage. Aboard the Heart of Gold (a brand-new ship), Marvin quotes from the pamphlet:

All the doors in this spaceship have a cheerful and sunny disposition. It is their pleasure to open for you, and their satisfaction to close again with the knowledge of a job well done.

In fact, each door does close with a satisfied sigh. It does drive them crazy when someone is standing near the door but does not want to walk through it.

Enough about the doors. They aren’t important when you consider that Earthlings would rather deal with a depressed robot than a happy one. Take some time and really ponder that one.

Back to Marvin

Marvin is a study in wasted potential and squandered resources, which fits his personality to a frightening degree given that none of it was intentional.

Really, who even uses Cerenkov Vocalizers anymore?

Throughout the stories, there are innumerable occasions where the protagonists are searching for an answer that Marvin knows. Well, since the books are, in fact, finite, I guess the quantity of occasions is fairly numerable. I just never bothered to count them. Didn’t give it a thought until just now.

Where was I? Oh yes: Marvin even states at one point that he knows the Ultimate Question because he can read it in Arthur’s brain patterns.

Arthur does take notice here. “You can see into my brain?”

“Yes.” Marvin even kept his answer short.

“And?”

“And it amazes me how you manage to live in anything so small.”

I don’t see a burn kit on the ship’s diagram, but I assume the Heart of Gold had one. Probably one that was sad to see you hurt and happy to help you feel better.

It is at this point that something else happens and everyone gets distracted again. Marvin simply stands there while they forget that he effectively just told them he can make sense of existence because he knows the Question!

“I could tell you weren’t really interested.”

He does try talking to other computers when they’re around, but the last couple of spaceships killed themselves after listening to him. The folks who were depending on those ships for life support probably would have been quite upset about this fact if they hadn’t dropped dead almost immediately.

Let’s move it along. “In conclusion…”

That’s easy.

Don’t build robots with emotions, even if you actually figure out how.

Don’t avoid intelligent and effective co-workers, even if you don’t like them.

Don’t expect every article about analytics to involve numbers. I obviously don’t even bother to count things when I consider them important.

Now, do you want me to sit in a corner and rust, or shall I just fall apart where I’m standing?

Greg Anderson
Founder of Alias Analytics. New perspectives on Analytics and Business Intelligence.