NebuPookins.net - NP-Complete - A short essay on pain
 

A short essay on pain

CNN has an article on a recent "scientific study" arguing that lobsters do not feel pain when they are boiled; in fact, it argues that lobsters cannot feel pain at all. I'm not interested in this particular study, and I placed quotation marks around the term "scientific study" not because I think there is anything inherently flawed with it, but merely because it sounds like the scientists involved had already formulated an answer to the question of whether or not lobsters feel pain when boiled, and did not actually conduct any experiments.

No, the real focus of this post is to talk about what it means to feel pain (or to feel anything at all, for that matter). Physical pain, as a human might experience it, is a pattern of signals that your nerves send to your brain. From this point of view, it isn't significantly different from your nerves "feeling" temperature or pressure. There are two systems participating in this act of feeling: the nerves and the brain. I argue that the nerves are just one particular trigger for creating feelings in the brain, and that they are not necessary. One could imagine a brain in a jar with wires connected to it in such a way that when an electric pulse is sent along a wire, the brain interprets it the same way it would a signal from a nerve ending, and thus "feels". Only the brain is needed to "feel" pain.

What is the nature of pain as it is experienced in the brain, however? When we look at what is happening when someone claims to be feeling pain, what we see is that the neurons that make up the brain are firing electrical signals at each other. As I mentioned in a previous post, the wiring of the neurons is random (or might as well be), so what we observe is a pattern of electrical signals travelling through a network of neurons which seems random to us (the outside observers), but which has some sort of "meaning" to the brain itself; in this case, the meaning is something like "I am feeling pain".

There are two schools of thought when it comes to the mind/body problem. The mind/body problem, posed as a question, is: "What is the relationship between the ethereal mind and the physical brain?" In the context of AI, the two camps are called "Strong AI" and "Weak AI". People who believe in Strong AI believe that the mind is nothing more than a network of neurons firing electricity at each other, and that consciousness emerges from the complexity and interaction of these neurons and the electricity. People who believe in Weak AI believe that there is something else (often called the "soul"), but that we humans don't yet know what that something else is. In particular, this means that if we artificially built components that send electrical pulses like a neuron does, and wired them together, the Strong AI people would believe that, given enough artificial neurons (we're talking billions here), we would have built an intelligent, and in fact very possibly self-conscious, entity. The Weak AI people would believe that this is merely a very fast computer, and not an intelligent or self-conscious entity, since it doesn't have a "soul". I believe in Strong AI.

Let's say a particular human brain is composed of exactly X neurons, and let's say that we accept that this particular human brain is indeed capable of experiencing pain. Now suppose we randomly remove a neuron, so we have X - 1 neurons. Is this modified brain still capable of feeling pain? Most people would say yes, I think, and there are many reasons to believe so. Many people believe we use only a tiny fraction of our brain, so 1 neuron out of some billions probably would not be enough to disable our ability to feel pain. In fact, even if we take out a large number of neurons (say 800 million), causing severe mental disability, most people believe that such people are still capable of feeling pain. Even if we somehow magically (or just by luck) removed some sort of "pain reception center" in the brain, given that the brain can dynamically rewire itself, it wouldn't be surprising to find that the brain re-taught itself how to experience pain. After all, pain seems like one of the most basic things that even the "dumbest" brains are capable of experiencing. So how many neurons does a brain need to be able to experience pain? 100 million? 1 million? 100? 1?

The other issue to consider is: how can I know if someone (another human) is feeling pain? When I punch Kilree and he recoils and shouts, how do I know whether he's feeling pain or faking it? Perhaps all of society is part of some concerted conspiracy to trick me into believing that it's possible for people other than myself to feel pain, when in fact I am the only one in the world capable of feeling it. Improbable, perhaps, but not impossible. How about when I hit a dog and it recoils and yelps? How do I know whether it's feeling pain, or whether recoiling and yelping is just some sort of sensation-less reaction to being hit? Is there really a difference between these two possibilities? What if I poke at a single-celled life form with some sort of nano-tool? If it reacts, perhaps by suddenly and vigorously wagging its flagella, is that an indication that it's feeling pain, or just a mechanical reaction? What if I use a stick to poke at a mouse trap, and it suddenly jerks and makes a loud snapping noise? Did that mouse trap feel pain, or was that just a mechanical reaction? The answer is less clear-cut when you realize that your body is merely a machine (albeit a very complicated one). Is there any difference, in principle, between "pushing" a button with a sharp knife that sends an electrical signal through a wire hooked up to a computer, which instructs its soundcard to play a recording of something saying "Ouch" when it receives such a signal, and "pushing" the nerve endings of someone's stomach with a sharp knife, which sends an electrical signal through a "wire" hooked up to a brain, which tells the mouth to say "Ouch" when it receives such a signal?
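The button-and-soundcard comparison can be made concrete. As a minimal sketch (the function name and wiring here are invented for illustration), both systems reduce to the same stimulus-to-response mapping:

```python
# A hypothetical sketch of the "Ouch machine" described above: a system
# that receives a signal on a "wire" and responds with "Ouch". Both the
# computer-with-soundcard and the body-with-brain reduce, structurally,
# to the same stimulus -> response mapping.

def ouch_machine(signal_received: bool) -> str:
    """React exactly as wired; no feeling is obviously required."""
    return "Ouch" if signal_received else ""

# "Pushing" the button (or the knife reaching the nerve ending) sends
# the signal; the observable response is the same in either case.
print(ouch_machine(True))   # prints "Ouch"
```

From the outside, nothing distinguishes this table lookup from "genuine" pain behaviour, which is exactly the point of the paragraph above.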

Quite frankly, being able to "be sure" that anyone other than yourself is capable of experiencing pain is impossible (just as it's impossible to know that anything other than yourself exists). All one can do is perform a "Turing test" for pain. Alan Turing came up with the "Turing test" as a pseudo-concrete way of measuring intelligence. His idea is relatively straightforward: the reason we might think certain animals are intelligent (dogs, dolphins, monkeys, etc.) and others are not (insects, amoebas, etc.) is that they appear intelligent. As such, if the behaviour of something is sufficiently complex that it appears intelligent to us, then it has passed the "Turing test" (Turing actually phrased what it means to "appear intelligent to us" much more formally, but the details aren't important for this discussion). Some people argue that "passing the Turing test" and "being intelligent" are synonymous, and others feel that they are different concepts. Almost everyone, however, agrees that this is so far the only practical test for intelligence (or something like intelligence) that anyone has come up with. IQ tests are not appropriate in this context, because the kind of intelligence we're testing for here is supposed to be as unbiased as possible. For example, knowledge of how to read or write should not be a factor in determining whether something is intelligent.

Given how the original Turing test works, it shouldn't be hard to imagine what the rules for a "Turing test for pain" might be: If something behaves in such a way so as to lead us to believe that it is capable of feeling pain, then it has passed the "Turing test for pain". Again, whether or not "passing the Turing test for pain" and "able to feel pain" are synonymous is debatable; however, the Turing test for pain is probably the only practical test we can employ regarding the capability to feel pain.

The major drawback of Turing tests is that they are highly dependent on the person performing the test. A naive person might at first assume that the mouse trap is indeed capable of feeling pain, because it acts as if it is feeling pain. However, once they understand a bit more about how mouse traps are constructed, they might realize it's not really feeling pain, but merely behaving in a predefined way given its stimuli. Then again, once we humans figure out how the human brain works, and we see that our behaviour is predefined given our stimuli, does that mean we'll realize that we don't actually feel pain?

I believe that mechanical devices, such as mouse traps, feel pain just like I do, but on a different level. This is a view that many Strong AI people also accept (or else they contradict themselves). This statement might seem weird if you have some preconception that pain is "something that humans feel". However, if you believe that pain is "a pattern of signals travelling from one system (e.g. nerve endings) to another (e.g. the brain)", then it might not seem so strange to think that mechanical devices can experience some sort of low-level approximation of pain.

There's a view I believe in that is less popular (I'm the only person I know of who holds it), and that's that a software simulation of neurons is, in principle, equivalent to actual human neurons. When your neurons are firing signals at each other, you're thinking. Almost everyone accepts this. If you build neuron chips, wire them together, and have them fire at each other, this machine is thinking. Strong AI people accept this. If you write a program that keeps track of the state of neurons as they virtually fire signals at each other, then this program is thinking. As far as I know, only I accept this. If you accept the idea that you might just be a brain in a jar, being fed stimuli by wires hooked up to your brain, so that you are being "tricked" into perceiving the world around you, then why not accept the idea that your brain might be a simulation inside a computer, being fed stimuli virtually, "tricked" into perceiving a world? When I make this argument, I inevitably get comments like "You watched too much Matrix", despite the fact that I held this belief years before The Matrix was released. Anyway, if your mind is a computer simulation and it is capable of experiencing pain, then if I wrote a sufficiently complex simulation program, it too would be able to experience pain, right? Let's say I took your mind/program and made it slightly simpler. For example, I stopped simulating your ability to taste saltiness. You'd still be able to feel pain, right? What if I made it simpler still, replacing the complex taste simulator with one that always reports that whatever you taste tastes like chicken? You'd still be able to feel pain, right? This is analogous to the "remove 1 neuron at a time" argument I made earlier. This time, instead of simplifying a physical brain, I'm simplifying a digital one. At what point can we say this digital mind is incapable of feeling pain?
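The "program that keeps track of the state of neurons" need not be exotic. As a toy sketch (all class names, weights, and counts here are invented, and real neural simulation is far more involved), the bookkeeping might look like this:

```python
import random

# A toy sketch of a program that tracks neuron states as they fire
# virtual signals at each other. Each neuron accumulates incoming
# signal and fires downstream once it crosses a threshold.

class Neuron:
    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.potential = 0.0
        self.targets = []                  # downstream (neuron, weight) pairs

    def connect(self, other, weight):
        self.targets.append((other, weight))

    def receive(self, signal):
        """Accumulate the incoming signal; fire once the threshold is crossed."""
        self.potential += signal
        if self.potential >= self.threshold:
            self.potential = 0.0           # reset after firing
            return list(self.targets)      # signals to propagate downstream
        return []

# Wire up a tiny network with random connections and weights...
random.seed(0)
neurons = [Neuron() for _ in range(10)]
for n in neurons:
    for m in random.sample(neurons, 3):
        n.connect(m, random.uniform(0.3, 0.9))

# ...then deliver a "painful" stimulus to one neuron and propagate the
# resulting cascade (capped, since a cyclic network could echo forever).
queue = [(neurons[0], 1.0)]
fired_count = 0
steps = 0
while queue and steps < 1000:
    steps += 1
    neuron, signal = queue.pop(0)
    outgoing = neuron.receive(signal)
    if outgoing:
        fired_count += 1
    for target, weight in outgoing:
        queue.append((target, weight))
```

The essay's question is whether anything in this loop could ever amount to feeling; the code only shows that simulating the firing pattern is, in principle, ordinary software.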

 
1. Ringohime said:

I slept only two hours last night, so I didn't actually read the last half... But I just wanted to make a random comment. Whether lobsters or fish of any kind are capable of feeling pain has been studied for a while now, because some people say, "If they can, we shouldn't cook them in the cruel ways we do (like boiling them alive)." Apparently there is an organization that tries to decrease the amount of fish captured, and also to prevent people from cooking them in these "cruel ways". Because of this, many experiments take place, and once in a while you see news saying "Fish can feel pain!" or "Fish can't feel pain, actually..." I find this argument very amusing.

Hmmm, lobsters... Hey, let's have lobsters sometime soon, Neb!

Posted on Thu February 17th, 2005, 2:35 PM EST acknowledged
2. Nebu Pookins said:

Sure, I love lobster.

Posted on Thu February 17th, 2005, 10:57 PM EST acknowledged
