cwillu comments on Open Thread: May 2010 - Less Wrong

3 Post author: Jack 01 May 2010 05:29AM


Comment author: cwillu 01 May 2010 09:21:31PM *  9 points [-]

Has anybody considered starting a folding@home team for lesswrong? Seems like it would be a fairly cheap way of increasing our visibility.

<30 seconds later>

After a brief 10 word discussion on #lesswrong, I've made a lesswrong team :p

Our team number is 186453; enter this into the folding@home client, and your completed work units will be credited.
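For anyone setting this up by hand: the console client stores its settings in a small config file. A sketch of what the relevant entries might look like (the exact file name and key names vary by client version, so treat this as illustrative, not authoritative):

```ini
; client.cfg (classic console client) -- key names may differ by version
[settings]
username=YourName
team=186453
```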

Comment author: nhamann 02 May 2010 12:55:07AM 2 points [-]

Does anyone know the relative merits of folding@home and rosetta@home, which I currently run? I don't understand enough of the science involved to compare them, but I would like to contribute to whichever project is likely to be more important. I found this page, which explains the differences between the projects (and has some information about other distributed computing projects), but I'm still not sure which one I should prefer to run.

Comment author: MichaelGR 03 May 2010 06:14:22PM *  1 point [-]

Personally I run Rosetta@home because, based on my research, it could be more useful for designing new proteins and computationally predicting the function of proteins. Folding seems to be more about understanding how proteins fold, which can help with some diseases, but isn't nearly the game changer that in silico design and shape prediction would be.

I also think that the SENS Foundation (Aubrey de Grey & co) have some ties to Rosetta, and might use it in the future to design some proteins.

I'm a member of the Lifeboat Foundation team: http://lifeboat.com/ex/rosetta.home

But we could also create a Less Wrong team if there's enough interest.

Comment author: Jack 01 May 2010 10:03:41PM 1 point [-]

So I think I have it working but... there's nothing to tell me if my CPU is actually doing any work. It says it's running but... is there supposed to be something else? I used to do SETI@home back in the day and they had some nice feedback that made you feel like you were actually doing something (of course, you weren't, because your computer was looking for non-existent signals, but still).

Comment author: zero_call 02 May 2010 12:49:27AM *  1 point [-]

...of course, you weren't because your computer was looking for non-existent signals...

The existence of ET signals is an open question. SETI is a fully legitimate organization run according to a well-thought-out plan for collecting data to help answer this question.

Comment author: Jack 02 May 2010 01:00:19AM 1 point [-]

I think the probability they ever find what they're looking for is extraordinarily low. But I don't have anything against the organization.

Comment author: zero_call 02 May 2010 01:14:29AM *  1 point [-]

Right on, but just so you know, other (highly informed) people think that we may find a signal by 2027, so there you go. For an excellent short article (explaining this prediction), see here.

Comment author: Jack 02 May 2010 01:55:41AM 0 points [-]

I don't think the author deals with the Fermi paradox very well, and the paradox is basically my reason for assigning a low probability to SETI finding something.

Comment author: zero_call 02 May 2010 02:08:13AM 0 points [-]

The Fermi paradox also struck me as a big issue when I first looked into these ideas, but now it doesn't bother me so much. Maybe this should be the subject of another open thread.

Comment author: cwillu 01 May 2010 10:28:25PM *  0 points [-]

I use the origami client manager thingie; it handles deploying the folding client, and gives a nice progress meter. The 'normal' clients should have similar information available (I'd expect that origami is just polling the clients themselves).

Comment author: Jack 01 May 2010 09:35:53PM 1 point [-]

What is this?

Comment author: MichaelGR 03 May 2010 06:16:18PM *  2 points [-]

I wrote a quick introduction to distributed computing a while ago:

http://michaelgr.com/distributed-computing/

My favorite project (the one which I think could benefit humanity the most) is Rosetta@home.

Comment author: Rain 01 May 2010 09:41:21PM *  2 points [-]

Donating money to scientific organizations (in the form of a larger power bill). You run your CPU (otherwise idle) to crunch difficult, highly parallel problems like protein folding.

Comment author: cwillu 01 May 2010 10:16:47PM 0 points [-]

Granted that in many cases, it's donating money that you were otherwise going to burn.

Comment author: mattnewport 01 May 2010 10:23:57PM 2 points [-]

Granted that in many cases, it's donating money that you were otherwise going to burn.

No, modern CPUs use considerably less power when they are idle. A computer running folding at home will be drawing more power than if it were not.

Comment author: rwallace 01 May 2010 11:56:47PM 6 points [-]

But you've already paid for the hardware, you've already paid for the power to run the CPU at baseload, and the video card, and the hard disk, and all the other components; if you turn the machine off overnight, you're paying for wear and tear on the hardware turning it off and on every day, and paying for the time you spend booting up, reloading programs and reestablishing your context before you can get back to work.

In other words, the small amount of money spent on the extra electricity enables the useful application of a much larger chunk of resources.

That means if you run Folding@home, your donation is effectively being matched not just one for one but severalfold, and not by another philanthropist, but by the universe.

Comment author: Rain 03 May 2010 02:17:32PM 4 points [-]

if you turn the machine off overnight, you're paying for wear and tear on the hardware turning it off and on every day, and paying for the time you spend booting up, reloading programs and reestablishing your context before you can get back to work.

I've seen numerous discussions about whether it's better / more economical to turn off your machine or to leave it running all the time, and I have never seen a satisfactory conclusion based on solid evidence.

Comment author: RobinZ 03 May 2010 02:35:05PM 0 points [-]

That's because it depends on the design. On the lifetime point, for example: if the machine tends to fail based on time spent running (solder creep, perhaps), leaving it running more often will reduce the life, but if the machine tends to fail based on power cycling (low-cycle fatigue, perhaps), turning it on and off more often will reduce the life.

Given that I've dropped my MacBook from a height of four feet onto a concrete slab, I figure the difference is roundoff error as far as I am concerned.

Comment author: CarlShulman 02 May 2010 03:40:09AM *  1 point [-]

A severalfold match isn't very impressive if the underlying activity is at least several orders of magnitude less efficient than alternatives, which seems likely here.

Comment author: rwallace 02 May 2010 10:43:46AM 0 points [-]

It seems highly unlikely to me. Biomedical research in general and protein folding in particular are extremely high leverage areas. I think you will be very hard put to it to find a way to spend resources even a single order of magnitude more efficiently (let alone make a case that the budget of any of us here is already being spent more efficiently, either on average or at the margin).

Comment author: CarlShulman 02 May 2010 08:44:06PM 6 points [-]
  1. Moore's Law means that the cost of computation is falling exponentially. Even if one thought that providing computing power was the best way to spend money (on electricity) it would likely be better to save the money spent on the electric power and buy more computing power later, unless the computation is much much more useful now.

  2. Biomedical research already gets an outsized portion of all R&D, with diminishing returns. The NIH budget is over $30 billion.

  3. Slightly accelerating protein folding research doesn't benefit very much from astronomical waste considerations compared to improving the security of future progress with existential risk reduction.
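The cost-deferral arithmetic in point 1 can be sketched numerically. The halving period below is an illustrative Moore's-law-style assumption, not a measured figure:

```python
# Sketch of the deferral argument: if the cost per unit of computation
# halves every 18 months (an assumed rate), money banked now buys
# exponentially more computation later.

def compute_multiplier(years: float, halving_years: float = 1.5) -> float:
    """How many times more computation the same money buys after `years`."""
    return 2 ** (years / halving_years)

# Spending $100 on electricity today vs. banking it for 3 years:
now = 100                               # computation bought today (normalized)
later = 100 * compute_multiplier(3)     # computation the same $100 buys in 3 years

print(round(later))  # 400: the same dollars buy ~4x the computation
```

On these assumed numbers, running the computation now only wins if doing it three years earlier is worth roughly a 4x premium.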

Comment author: Kaj_Sotala 12 May 2010 06:50:36AM *  3 points [-]

it would likely be better to save the money spent on the electric power and buy more computing power later, unless the computation is much much more useful now.

In principle, this is true; in practice, saying things like this seems more likely to make the people in question simply stop donating electricity than to make them stop donating electricity and donate the saved money to something more useful. Installing a program and running it all the time doesn't really feel like spending money, but explicitly donating money requires you to cross the mental barrier between free and paid in a way that running the program doesn't.

For those reasons, I'd be very hesitant about arguing against running programs like Folding@Home; it seems likely to cause more harm than good.

Comment author: rwallace 02 May 2010 09:03:38PM 0 points [-]
  1. In practice, it is worth doing the computation now -- we can easily establish this by looking at the past, and noting that the people who performed large computations then, would not have been better off waiting until now.

  2. $30 billion is a lot of money compared to what you and I have in our pockets. It's dirt cheap compared to the trillions being spent on unsuccessful attempts to treat people who are dying for lack of better biotechnology.

  3. By far the most important way to reduce real life existential risks is speed.

  4. Even if you could find a more cost effective research area to finance, it is highly unlikely that you are actually spending every penny you can spare in that way. The value of spending resources on X, needs to be compared to the other ways you are actually spending those resources, not to the other ways you hypothetically could be spending them.

Comment author: Vladimir_Golovin 03 May 2010 06:46:29AM 0 points [-]

and the video card

They have high-performance GPU clients that are a lot faster than CPU-only ones.

Comment author: Jack 02 May 2010 12:00:41AM *  0 points [-]

Assuming whatever gets learned through folding@home has applications they should offer users partial ownership of the intellectual property.

Comment author: rwallace 02 May 2010 03:19:07AM 0 points [-]

It's scientific research, the results are freely published.

Comment author: mattnewport 01 May 2010 11:59:26PM 0 points [-]

I'm not saying it isn't a net gain, it may well be according to your own personal weighing of the factors. I'm just saying it is not free. Nothing is.

Comment author: cwillu 01 May 2010 10:43:48PM *  0 points [-]

Many != all.

My desktop is old enough that it uses very little more power at full capacity than it does at idle.

Additionally, you can configure (may be the default, not sure) the client to not increase the clock rate.

Comment author: mattnewport 01 May 2010 11:28:03PM *  1 point [-]

Many != all.

It is also not equal to 'some'. The vast majority of computers today will use more power when running folding at home than they would if they were not running folding at home. There may be some specific cases where this is not true but it will generally be true.

My desktop is old enough that it uses very little more power at full capacity than it does at idle.

You've measured that, have you? Here's an example of some actual measurements for a range of current processors' power draw at idle and under load. It's not a vast difference, but it is real, ranging from about 30W (a 40% increase in total system power draw) to around 100W (a 100% increase).
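To put a 30W-100W range of extra draw in dollar terms, a rough annual-cost calculation (the electricity price below is an assumed figure; substitute your local rate):

```python
# Back-of-the-envelope yearly cost of extra power draw while folding.
# The price per kWh is an assumption; it varies widely by region.

PRICE_PER_KWH = 0.12  # USD, assumed

def annual_cost(extra_watts: float, hours_per_day: float = 24.0) -> float:
    """Yearly electricity cost (USD) of `extra_watts` of additional draw."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * PRICE_PER_KWH

print(f"{annual_cost(30):.2f}")   # low end: ~$31.54/year
print(f"{annual_cost(100):.2f}")  # high end: ~$105.12/year
```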

Additionally, you can configure (may be the default, not sure) the client to not increase the clock rate.

I couldn't find mention of any such setting on their site. Do you have a link to an explanation of this setting?

Comment author: cwillu 02 May 2010 01:39:18AM *  1 point [-]

On further consideration, my complaint wasn't my real/best argument, consider this a redirect to rwallace's response above :p

That said, I personally don't take 'many' as meaning 'most', but more in the sense of "a significant fraction", which may be as little as 1/5 or as much as 4/5. I'd be somewhat surprised if the fraction of machines in use that are old (5+ years) wasn't in that range.

re: scaling, the Ubuntu folding team's wiki describes the approach.

Comment author: Rain 01 May 2010 11:02:00PM *  0 points [-]

Idle could also mean 'off', which would be significant power savings even (especially?) for older CPUs.

Comment author: cwillu 02 May 2010 01:18:40AM *  1 point [-]

One who refers to their powered-off computer as 'idle' might find themselves missing an arm.

Comment author: Rain 03 May 2010 02:15:36PM *  1 point [-]

Except I'm talking about opportunity cost rather than redefining the word. You can turn off a machine you aren't using, a machine that's idle.