Grumpy Troll
Sep 4, 2018 · 2 min read

Thanks,

So, there are three problems that I can see with this.

#1. I don’t think Nagel said it depends on our ability to imagine ourselves as that thing. At least, that is not my interpretation. In fact, I cannot find any such statement in Nagel’s paper.

A clear counterargument would be that I can, for example, imagine being a chair, an airplane, a photon, or a quark. But that only speaks to my imagination. When writers write fiction, they create characters, and one common practice is to imagine and describe what it feels like to be those characters. That doesn’t make it real.

Here is what I could find on why Nagel thinks it’s reasonable to assume bats have experience:

“I assume we all believe that bats have experience. After all, they are mammals, and there is no more doubt that they have experience than that mice or pigeons or whales have experience.”

#2. While there has been some recent, overhyped news of chatbots passing Turing tests, I don’t think those results are widely accepted in the AI research community. I don’t think we are there yet. The (real) Turing test is pretty damn hard to pass. Here is a good blog post from Scott Aaronson about it.

#3. Even if a chatbot passed the Turing test, and the interrogator imagined he was talking to a human, it’s not clear to me how the interrogator’s imagination has anything to do with there being something it is like to be the (real) hypothetical silicon computer (or datacenter?) running the software that passes the Turing test.
