@Bill Benzon
Hi Bill Benzon! You are the first person I've come across who has reached the same conclusion I did: LLMs cannot semantically understand the meaning of words. I believe this is because semantic understanding of words and concepts is a form of qualia, and computers cannot feel qualia. We feel the semantic meaning of words as a sensation, as a qualia, and that requires consciousness. Only a consciousness can feel qualia.
Meaning has three components:
1) a structural relationship component (how a word structurally relates to other words; see the sketch after this list)
2) intention & adhesion (only a consciousness can have an intention and understand its adhesion to the real world)
3) a qualia component (the part of meaning we feel as a sensation, which only a consciousness can experience)
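To make component 1 concrete: this is the one component an LLM plausibly does have. Here is a minimal sketch of what I mean by structural relationships between words; the three-dimensional toy vectors and the numpy cosine function are my own invented illustration, not taken from any actual model:

```python
# Sketch of component 1: in distributional models (including LLMs),
# a word's "meaning" is just a vector, and relatedness is geometry.
# The toy vectors below are made up purely for illustration.
import numpy as np

embeddings = {
    "cat": np.array([0.9, 0.8, 0.1]),
    "dog": np.array([0.8, 0.9, 0.2]),
    "car": np.array([0.1, 0.2, 0.9]),
}

def cosine(a, b):
    # Cosine similarity: near 1.0 means structurally related, near 0.0 unrelated.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(embeddings["cat"], embeddings["dog"]))  # ~0.99, closely related
print(cosine(embeddings["cat"], embeddings["car"]))  # ~0.30, distant
# Everything here is relational structure; nothing is felt, no qualia.
```

Note that the model "knows" cat is closer to dog than to car only as a geometric fact, which is exactly my point: structure alone, with components 2 and 3 entirely absent.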