
zedzed comments on Open thread, Jan. 19 - Jan. 25, 2015 - Less Wrong Discussion

3 Post author: Gondolinian 19 January 2015 12:04AM




Comment author: shminux 19 January 2015 10:58:31PM 0 points

I understand what you are saying, but I am not convinced that there is a big difference.

> Entropy measures the uncertainty in the distribution of the parameters. It measures something about our information about the system.

How would you change this uncertainty without disturbing the system?
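The claim that entropy measures *our* uncertainty rather than a property of the system itself can be illustrated concretely. A minimal sketch in Python (the two-outcome distribution is a hypothetical example, not anything from the thread): the Shannon entropy of a distribution drops as we gain information about the outcome, even though nothing physical about the system need change.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: one full bit of uncertainty about the microstate.
before = shannon_entropy([0.5, 0.5])
print(before)  # 1.0

# After learning the outcome, the distribution collapses and our
# uncertainty (the entropy of our description) falls to zero.
after = shannon_entropy([1.0, 0.0])
print(after)   # 0.0
```

Whether that information can be *acquired* without physically disturbing the system is exactly the point under dispute here.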

> But energy (relative to ground state) doesn't change no matter how much information you gain about a system's internal microstates.

How would you gain this information without disturbing the system (and hence changing its energy)?

EDIT: see also my reply to spxtr.

Comment author: passive_fist 19 January 2015 11:21:16PM 0 points

> How would you change this uncertainty without disturbing the system?

You have to define what 'disturbing the system' means. This is just the classical Maxwell's demon question, and you most definitely can change this uncertainty without changing the thermodynamics of the system. See http://en.wikipedia.org/wiki/Maxwell%27s_demon#Criticism_and_development

In particular, the paragraph about Landauer's work is relevant (and the cited Scientific American article is also worth reading).
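For reference, Landauer's principle, which that paragraph discusses, puts a floor on the heat dissipated when a bit of information is *erased*: at least kT ln 2 per bit. A quick back-of-the-envelope calculation (assuming room temperature, T = 300 K, which is my choice here, not a value from the thread):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)
T = 300.0           # assumed room temperature in kelvin

# Landauer's bound: minimum heat dissipated to erase one bit.
e_min = K_B * T * math.log(2)
print(f"{e_min:.3e} J per bit erased")  # on the order of 3e-21 J
```

The number is tiny compared to the energy scales of everyday systems, which is part of why the demon's information gathering can seem thermodynamically "free" until the erasure cost is accounted for.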