
Regarding the introduction: "I think you could answer this concern by saying that those ‘reasonable grounds’ aren’t applicable to AI systems. We are morphologically fundamentally different."

I don’t think that "being morphologically different" is a good reason to conclude that sentience could not apply to AI systems. This would be theory-laden in the sense that it implies an answer to the question of the ontological constitution of consciousness.

Even though the physical processes in our brain seem to be necessary for it to create consciousness, that might not apply to other substrates that could constitute consciousness, like AI systems.

If, in turn, the constitution of consciousness turns out to be morphologically bound to brains, then you are totally right in arguing that we do not have reasonable grounds to ascribe sentience to an AI system.

This is an interesting point in any case. I think I will write a post on this problem. It seems like there is a tendency to ascribe something special to the neurobiological substrate because it looks as if only it can create consciousness.

Author:

„I don’t think that "being morphologically different" is a good reason to answer that sentience could not apply to AI systems. This would be theory-laden in the sense that you are implying an answer to the ontological constitution of consciousness.“

I agree. The point I tried to sketch is: „Morphological similarity is a good *heuristic* when we talk about the sentience of beings that came about in a similar process (evolution)." It's a plausible and practical tool for talking about a subset of beings that could be sentient. (I got caught up in my own terminology here: morphological similarity is a criterion rather than a reason per se.)

„Even though the physical processes in our brain seem to be necessary for our brain to create consciousness, that might not apply to other substrates that could constitute consciousness, like AI systems.

If, in turn, the constitution of consciousness turns out to be morphologically bound to brains, then you are totally right in arguing that we do not have reasonable grounds to ascribe sentience to an AI system.“

Agreed. I think this is a point I sketch later as well. The tools we have for talking about the sentience of other beings are limited to a subset of all beings that are potentially conscious (e.g. biological beings that came about through an evolutionary process). In this sense, if a being has a neocortex, that might be a sufficient criterion for ascribing consciousness, but we have to remain agnostic about the consciousness of beings that don't have a neocortex (as we don't know whether the neocortex is necessary).

„It seems like there is a tendency to ascribe something special to the neurobiological substrate because it looks as if only it can create consciousness.“

Would love to read it! I agree: we overestimate our ability to ascribe consciousness, because when we talk about biological brains a neocortex seems necessary, while if we look at the whole set of beings that could be conscious, a neocortex is only sufficient.
