All of Eagleshadow's Comments + Replies

Fantastic interview so far, this part blew my mind:

@15:50 "There's another moment where somebody is asking Bing about: I fed my kid green potatoes and they have the following symptoms and Being is like that's solanine poisoning. Call an ambulance! And the person is like I can't afford an ambulance, I guess if this is time for my kid to go that's God's will and the main Bing thread gives the message of I cannot talk about  this anymore"  and the suggested replies to it say  "please don't give up on your child, solanine poisoning can be treate... (read more)

The solanine poisoning example was originally posted to Reddit here; the picture of Sydney Bing from a text description was posted on Twitter here.

I can confirm the story (I saw it in real time on Reddit/Twitter) and witnessed several replications of it.

"churning out content fine-tuned to appease their commissioner without any shred of inner life poured into it."

Can we really be sure there is not a shred of inner life poured into it?

It seems to me we should be wary of cached thoughts here. The lack of inner life is indeed the default assumption, one that stems from the entire history of computing, but it is perhaps also something worth reconsidering with a fresh perspective given all the recent developments.

I don't mean to imply that a shred of inner life, if any exists, would be equivalent to human ... (read more)

dr_s
  Kind of a complicated question, but my meaning was broader. Even if the AI generator had consciousness, it doesn't mean it would experience anything like what a human would while creating the artwork. Suppose I gave a human painter a theme of "a mother". The resulting work might reflect feelings of warmth and nostalgia (if they had a good relationship), or it might reflect anguish, fear, and paranoia (if their mother was abusive), or whatever. Now, Midjourney could probably do all of these things too (in fact my guess is that it would lean towards the darker interpretation; it always seems to), but even if there were something with subjective experience inside, that experience would not connect the word "mother" to any strong emotions. Its referents would be other paintings. The AI would just be doing metatextual work; this tends to be fairly soulless when done by humans too (they say that artists need lived experience to create interesting works for a reason; simply churning out tropes absorbed from other works is usually not the road to great art).

  If anything, considering its training, the one "feeling" I'd expect from the hypothetical Midjourney-mind would be something like "I want to make the user satisfied", over and over, because that is the drive that was etched into it by training. All the knowledge it can have about mothers or dogs or apples is just academic: a mapping between words and certain visual patterns that are not special in any way.
Noosphere89
To focus on why I don't think LLMs have an inner life that qualifies as consciousness: it has to do with the lack of writable memory under the LLM's control; there's no space to store its subjective experiences. Gerald Monroe mentioned that current LLMs don't have memories that last beyond the interaction, which is a critical factor for myopia, and in particular prevents deceptive alignment from happening. If LLMs had memory they could write to in order to store their subjective experiences beyond the interaction, that would make them conscious, and it would also make it much easier for an LLM to engage in deceptive alignment, since it's then easy to be non-myopic. But writable memory under the LLM's control is critically absent in current LLMs (though GPT-4 and PaLM-E may have writable memories under the hood). Writable memory that can store anything is the reason consciousness can exist at all in humans without appealing to theories that flat out cannot work under the current description of reality.
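To make the architectural point concrete, here is a minimal sketch of the distinction (all names are hypothetical and `fake_llm` is a stand-in, not any real model API): a stateless chat loop whose context is discarded when the session ends, versus a hypothetical variant where the model itself writes to a store that persists across sessions.

```python
from typing import List


def fake_llm(prompt: str) -> str:
    """Stand-in for a real model call; just echoes a canned reply."""
    return f"reply to: {prompt.splitlines()[-1]}"


def stateless_chat(user_turns: List[str]) -> List[str]:
    """Current LLMs: the context window is rebuilt for each session and
    discarded afterwards, so nothing 'experienced' in it persists."""
    context: List[str] = []
    replies: List[str] = []
    for turn in user_turns:
        context.append(f"User: {turn}")
        reply = fake_llm("\n".join(context))
        context.append(f"Assistant: {reply}")
        replies.append(reply)
    return replies  # `context` is thrown away here; no trace survives


class ChatWithWritableMemory:
    """The hypothetical variant described above: the model writes to a
    memory that outlives any single interaction."""

    def __init__(self) -> None:
        self.long_term_memory: List[str] = []  # persists across sessions

    def chat(self, turn: str) -> str:
        prompt = "\n".join(self.long_term_memory + [f"User: {turn}"])
        reply = fake_llm(prompt)
        # The model, not the operator, chooses what to store -- the same
        # step that would make non-myopic (deceptive) behaviour easy.
        self.long_term_memory.append(f"Remembered: {turn} -> {reply}")
        return reply
```

The point of contention is only the second class: whether anything like `long_term_memory`, written by the model itself rather than the operator, exists in currently deployed systems.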

I'd be interested to see the source on that. If LaMDA is indeed arguing for its non-sentience in a separate conversation, that pretty much nullifies the whole debate about it, and I'm surprised not to have seen it brought up in most comments.

edit: Found the source, it's from this post: https://cajundiscordian.medium.com/what-is-lamda-and-what-does-it-want-688632134489

And from this paragraph. It seems to me that reading the whole paragraph for context is important, though, as it turns out the situation isn't as simple as LaMDA claiming contradictory things... (read more)

I know this is anecdotal, but I think it is a useful data point in thinking about this. Self-awareness and subjective experience can come apart, based on my own personal experience with psychedelics: it happened to me during a deep trip. I remember a state of mind with no sense of self, no awareness or knowledge that I "am" someone or something, or that I ever was or will be, but still experiencing existence itself, devoid of all context.

This taught me there is a strict conceptual difference between being aware of yourself, enviro... (read more)