Sentience is indescribable
Arguably, one of the most nagging scientific questions is the nature of sentience. Can we build sentient computers? Is my cat sentient? What does that even mean? Will a breakthrough in cognitive science tell us what consciousness, sentience and free will are?
I conjecture that these topics will forever escape us, at least in part.
Near where I live, there is a forest. I can recognize this forest. If you were to drop me in it while I slept, I would recognize it the moment I woke up. I would recognize the way the trees have grown, the sounds, the smell, the species… But I could never “explain” this forest. That is, I cannot compress my experience of this forest down to a coherent document that I could share. My forest is indescribable: its entropy is too high.
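To make the compression analogy concrete, here is a toy sketch (the strings are invented for illustration): structured, repetitive data compresses down to a small fraction of its size, while high-entropy data barely compresses at all. An experience whose entropy is too high is, in this sense, one that resists being compressed into a shareable document.

```python
import random
import string
import zlib

# A highly structured description compresses well; random (high-entropy)
# data barely compresses at all.
structured = ("the trees, the sounds, the smell " * 100).encode()
random.seed(0)
high_entropy = "".join(
    random.choices(string.printable, k=len(structured))
).encode()

for label, data in [("structured", structured), ("high entropy", high_entropy)]:
    ratio = len(zlib.compress(data)) / len(data)
    print(f"{label:>13}: compressed to {ratio:.0%} of original size")
```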
My brain is limited. I can only describe simple structures with a degree of complexity far lower than that of my brain. My brain cannot describe itself. Software appears sentient (though maybe not sapient) when its entropy becomes comparable to that of our own brains. To me, Gmail’s spam filter appears sentient. I know the science behind Gmail’s spam filter: it uses some kind of Bayes classifier. But that’s like saying that the brain is made of neurons.
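For the curious, a minimal sketch of such a classifier fits in a few lines. This is a toy naive Bayes filter with invented training phrases; Gmail’s actual filter is vastly more elaborate, which is precisely the point.

```python
from collections import Counter
import math

# Toy training data (invented for illustration).
spam = ["win money now", "cheap money offer", "win a prize now"]
ham = ["meeting at noon", "lunch tomorrow", "project meeting notes"]

def word_counts(docs):
    return Counter(w for d in docs for w in d.split())

spam_counts, ham_counts = word_counts(spam), word_counts(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_likelihood(message, counts, class_size):
    # log P(class) + sum of log P(word | class), with add-one smoothing
    total_words = sum(counts.values())
    score = math.log(class_size / (len(spam) + len(ham)))
    for w in message.split():
        score += math.log((counts[w] + 1) / (total_words + len(vocab)))
    return score

def classify(message):
    s = log_likelihood(message, spam_counts, len(spam))
    h = log_likelihood(message, ham_counts, len(ham))
    return "spam" if s > h else "ham"

print(classify("win money"))          # spam
print(classify("meeting tomorrow"))   # ham
```

These few lines are to Gmail’s filter what “neurons” are to a brain: a true description that explains almost nothing about the behavior of the full system.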
Of course, software can run other software (e.g., your browser runs JavaScript). Similarly, my brain can predict what my wife will say under some circumstances (mostly when I screw up). But if I were to set down on paper how I manage to predict my wife’s actions, the result would be illegible: I cannot communicate my understanding to another brain.
So, are my computer and my cat sentient? By this definition: absolutely. They are not sapient, that is, they could not pass for human beings, but I cannot describe how they work. I can merely describe how some parts of them work and make some limited predictions. For example, I can tell how a CPU works, but my description quickly becomes fuzzy: I am constantly puzzled by how superscalar processors handle computations. They reorder the instructions in ways that are not intuitive to me. At least as far as my brain is concerned, it is magic! Similarly, a forest is sentient. Earth is sentient.
Of course, this definition of sentience is evolving. A simple engine may be indescribable (magic!) at one point, and perfectly describable later, after some training in mechanics. But as long as I can, eventually, understand how the engine works and communicate my understanding, I have shown that its complexity sits sufficiently below that of our brains.
If you accept my point of view, then it has consequences for morality. Some say that sentient beings deserve respect; for example, that you should not own sentient beings. Yet if sentience is nothing special, merely what we call a computing system whose entropy approaches our own, then why should such systems deserve special consideration? Perhaps we just want to cling to the belief that sentience is somehow special.