All of mouse_mouse's Comments + Replies

Glad I could help. If you want to learn more about LLMs, I recommend getting hands-on with one that is "raw." You almost certainly can't run anything nearly as big as ChatGPT on your home computer, but there are models available on Hugging Face that will run on ordinary home hardware.

I found that playing around with LLMs, especially ones small enough to run on my PC, really helped illuminate their capabilities and deficits for me. When they're ~7B parameters in size they're somewhat reliable if you prompt them correctly, b... (read more)
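A quick back-of-envelope calculation shows why the ~7B scale is the sweet spot for home hardware: memory to hold the weights is just parameter count times bytes per parameter. This sketch assumes nothing beyond that arithmetic (activations and KV cache need extra room on top):

```python
def model_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Rough RAM/VRAM needed just to hold the weights (activations are extra)."""
    return n_params * bytes_per_param / 1e9

seven_b = 7e9
print(model_memory_gb(seven_b, 2))    # fp16: ~14 GB -- high-end GPU territory
print(model_memory_gb(seven_b, 0.5))  # 4-bit quantized: ~3.5 GB -- fits most home PCs
print(model_memory_gb(175e9, 2))      # a GPT-3-scale model at fp16: ~350 GB
```

This is why quantized 7B-class models run locally while ChatGPT-scale models don't: the weights alone for the latter exceed any consumer machine's memory.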

I'm still not sure I'm understanding the delineation between software that counts as cognition and software that doesn't. Neural networks are not ghosts in the machine: they are software, whose architecture was defined by humans and whose weights were then set by training on a computer.

Crucially, they can be made entirely deterministic -- and actually are, if the sampling temperature is 0. Randomness has to be deliberately introduced into the system in order for the machine to not give exactly the same response to the same prompt (this is the "temperature" I was referrin... (read more)
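The determinism point can be made concrete with a minimal sketch of token sampling. The logits here are made up for illustration, but the mechanism is the standard one: divide logits by the temperature, softmax, then draw randomly; at temperature 0 the division blows up, so samplers fall back to a plain argmax, which is fully deterministic:

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=None):
    """Pick a token index from raw logits, with temperature-scaled sampling."""
    if temperature <= 0:
        # Greedy decoding: always the highest-scoring token. Same prompt,
        # same logits, same output -- no randomness anywhere.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Softmax over temperature-scaled logits (shift by max for stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # The only randomness in the whole pipeline is this draw.
    r = (rng or random).random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

logits = [0.1, 2.0, 0.3]
print(sample_next_token(logits, temperature=0))  # always index 1
```

Higher temperature flattens the distribution (more surprising choices); lower temperature sharpens it toward the argmax.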

I feel you are misunderstanding the technology here and assuming some things happened that didn't happen. ChatGPT is not actually "learning" in these examples. What is happening is that you are attaching additional context to the prompt for the next response, which changes the probabilities of the output text. There is no internalization of knowledge happening here, because the context is not internal. Put another way, the network isn't actually changing its weights or connections or biases in any way in this process. It is not building new neural pathways,... (read more)
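The prompt-concatenation point can be sketched as a toy chat loop. Nothing here is a real transformer (the `respond` logic is invented for illustration), but the structure is faithful: every turn re-sends the accumulated transcript as part of the prompt, and the model's weights never change:

```python
class FrozenModel:
    """Stand-in for a trained network: weights are fixed after training."""
    def __init__(self):
        self.weights = {"w0": 0.5, "w1": -1.2}  # frozen; never updated below

    def respond(self, prompt: str) -> str:
        # Toy logic: the model "remembers" your name only if that fact is
        # literally present in the prompt text it was handed this turn.
        if "my name is Ada" in prompt and "what is my name" in prompt:
            return "Your name is Ada."
        if "what is my name" in prompt:
            return "I don't know your name."
        return "Okay."

def chat(model, turns):
    context = ""
    replies = []
    for user_msg in turns:
        context += f"User: {user_msg}\n"   # all prior turns re-sent every time
        reply = model.respond(context)
        context += f"Model: {reply}\n"
        replies.append(reply)
    return replies

m = FrozenModel()
before = dict(m.weights)
print(chat(m, ["my name is Ada", "what is my name?"]))  # "remembers" via context
print(chat(m, ["what is my name?"]))                    # fresh context: no memory
assert m.weights == before  # nothing was internalized
```

The apparent memory lives entirely in the transcript string; start a fresh conversation and it is gone, because the network itself is unchanged.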
