CellBioGuy comments on Stupid questions thread, October 2015 - Less Wrong

3 Post author: philh 13 October 2015 07:39PM




Comment author: CellBioGuy 17 October 2015 09:37:20PM

> I suppose the question is: what % of the star's outgoing energy can we harness in principle, such that waste heat is hard to tell apart from background?

You can never make the outgoing radiation indistinguishable in temperature from the cosmic background: the star is generating energy, so in equilibrium more energy must leave the system than enters from the background.

The CMB temperature is ~2.73 K. Say you wanted to radiate a star's entire energy output at one kelvin above that, ~3.73 K, so that the flux out equals the star's flux plus the CMB flux in. For a star like the Sun, the radiating surface required is a sphere about a fifth of a light year (~13,000 AU) in radius to dissipate one solar luminosity. (Divide the quantity of radiator material by the fraction of the solar luminosity you want to use, but keep in mind that putting the radiators much closer than a fifth of a light year would probably be pointless, since they would be heated by the Sun. Also keep in mind that such an object would appear as large as the full Moon from 44 light years away, and as wide as Jupiter in our sky from 2,000 light years away.) For 10 K, you need a sphere 0.025 light years (~1,560 AU) in radius; for 50 K, 62 AU (about twice the radius of Neptune's orbit).
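The calculation above can be sketched as follows. This is a minimal reconstruction under the stated assumptions (a spherical blackbody radiator, net flux = emitted flux minus absorbed CMB flux, dissipating exactly one solar luminosity); the function name and constant values are mine, not from the original comment.

```python
import math

SIGMA = 5.670374e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26       # solar luminosity, W
T_CMB = 2.73           # CMB temperature, K
AU = 1.496e11          # metres per astronomical unit
LIGHT_YEAR = 9.461e15  # metres per light year

def radiator_radius_m(t_radiator_k: float) -> float:
    """Radius (m) of a blackbody sphere whose net emission,
    after subtracting the CMB flux it absorbs, equals one
    solar luminosity."""
    net_flux = SIGMA * (t_radiator_k**4 - T_CMB**4)  # W/m^2
    area = L_SUN / net_flux                          # m^2
    return math.sqrt(area / (4 * math.pi))           # sphere: A = 4*pi*r^2

for t in (3.73, 10.0, 50.0):
    r = radiator_radius_m(t)
    print(f"T = {t:5.2f} K -> radius {r / AU:8.0f} AU "
          f"({r / LIGHT_YEAR:.3f} ly)")
```

Running this reproduces the figures in the paragraph: roughly 0.2 light years at one kelvin above background, ~1,500 AU at 10 K, and ~62 AU at 50 K.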

Of course, there is also the starlight flux from all the other nearby stars, which makes things even worse at very low temperatures.

(Calculations done using energy out = energy in from the CMB + solar luminosity, with blackbody fluxes of σT⁴ per unit area.)

EDIT: I should go over some astronomy papers and figure out what amounts of material typically produce observable infrared excesses.