JoshuaZ comments on Paper draft: Relative advantages of uploads, artificial general intelligences, and other digital minds - Less Wrong

10 Post author: Kaj_Sotala 07 August 2011 04:53PM



Comment author: JoshuaZ 08 August 2011 04:08:07AM, 1 point

I'm having a lot of trouble understanding the second paragraph in section 2.1.2, especially the sentence "Amdahl's law assumes that the size of the problem stays constant as the number of processors increases, but Gustafson (1988) notes that in practice the problem size scales with the number of processors." Can you expand on what you mean here?

Edit: Also, there's a typo in 4.1: "practicioners".

Comment author: gwern 09 August 2011 10:32:33PM, 5 points

I think the point is that when you increase the size of the dataset, you expose more work for the parallelism to handle.

If I have a 1 KB dataset and a partially parallel algorithm to run on it, I will very quickly 'run out of parallelism' and find that 1000 processors are no better than 2 or 3. Whereas if I have a 1 PB dataset, with the same kind of data and the same algorithm, I will be able to keep adding processors for a long time before I finally run out of parallelism.
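The contrast between the two laws can be sketched numerically. Amdahl's formula fixes the problem size, so the serial fraction caps the speedup; Gustafson's "scaled speedup" grows the problem with the processor count. The parallel fraction p = 0.95 below is a hypothetical value for illustration, not a figure from the paper:

```python
# Sketch (not from the thread): Amdahl vs. Gustafson speedups for a
# program with parallel fraction p running on n processors.

def amdahl_speedup(p, n):
    """Fixed problem size: speedup is bounded by 1 / (1 - p)."""
    return 1.0 / ((1.0 - p) + p / n)

def gustafson_speedup(p, n):
    """Problem size scales with n: speedup grows almost linearly in n."""
    return (1.0 - p) + p * n

p = 0.95  # hypothetical: 95% of the work parallelizes
for n in (2, 1000):
    print(f"n={n}: Amdahl {amdahl_speedup(p, n):.2f}, "
          f"Gustafson {gustafson_speedup(p, n):.2f}")
```

With p = 0.95, Amdahl's speedup never exceeds 20 no matter how many processors are added, while under Gustafson's assumption 1000 processors yield a scaled speedup of roughly 950 — which is gwern's point that a bigger dataset keeps exposing parallel work.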

Comment author: Kaj_Sotala 10 August 2011 09:37:29AM, 1 point

gwern's explanation is right: this is Gustafson's law. I'll clarify that in the paper.