jacob_cannell comments on [Link] Study: no big filter, we're just too early - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (44)
EDIT: After updating through this long thread, I am now reasonably confident that the above statement is incorrect. Passive shielding in the form of ice can cool the earth against the sun's irradiance to a temp lower than the black body temp, and there is nothing special about the CMB irradiance. See the math here at the end of the thread.
Sure - if it wasn't actively cooled, but of course we are assuming active cooling. The less incoming radiation the system absorbs, the less excess heat it has to deal with.
Sure you need to expend energy, but obviously the albedo/reflectivity matters a great deal. Do you know what the physical limits for reflectivity are? For example - if the object's surface can reflect all but 10^-10 of the incoming radiation, then the active cooling demands are reduced in proportion, correct?
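The proportionality claim above is easy to sanity-check numerically. A minimal sketch (the incident flux and reflectivity values are illustrative assumptions, not figures from the thread):

```python
# Absorbed power -- and hence the active-cooling load -- scales with
# (1 - R), the fraction of incident radiation that is not reflected.

def absorbed_power(incident_w, reflectivity):
    """Power absorbed by a surface reflecting `reflectivity` of incident radiation."""
    return incident_w * (1.0 - reflectivity)

incident = 1000.0  # W of incoming radiation (assumed for illustration)
for r in (0.9, 0.999, 1 - 1e-10):
    print(r, absorbed_power(incident, r))
```

With reflectivity 1 - 10^-10, the 1 kW incident load drops to 10^-7 W absorbed, so the cooling demand falls by the same factor of 10^10.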
I'm thinking just in terms of optimal computers, which seems to lead to systems that are decoupled from the external environment (except perhaps gravitationally), and thus become dark matter.
The limits of reversible computing have been discussed in the lit, don't have time to review it here, but physics doesn't appear to impose any hard limit on reversible efficiency. Information requires mass to represent it and energy to manipulate it, but that energy doesn't necessarily need to be dissipated into heat. Only erasure requires dissipation. Erasure can be algorithmically avoided by recycling erased bits as noise fed into RNGs for sampling algorithms. The bitrate of incoming sensor observations must be matched by an outgoing dump, but that can be proportionally very small.
I think you're still not 'getting it', so to speak. You've acknowledged that active cooling is required to keep your computronium brain working. This is another way of saying you expend energy to remove entropy from some part of the system (at the expense of a very large increase in entropy in another part of the system). Which is what I said in my previous reply. However you still seem to think that, given this consideration, stealth is possible.
By the way, the detection ranges given in that article are for current technology! Future technology will probably be much, much better. It's physically possible, for instance, to build a radio telescope consisting of a flat square panel array of antennas one hundred thousand kilometers on a side. Such a telescope could detect things we can't even imagine with current technology. It could resolve an ant crawling on the surface of Pluto or provide very detailed surface maps of exoplanets. Unlike stealth, there is no physical limit that I can think of to how large you can build a telescope.
Not theoretically, no. However, at any temperature higher than 0 K, purely reversible computing is impossible. Unfortunately there is nowhere in the universe that is that cold, and again, maintaining this cold temperature requires a constant feed of energy. These considerations impose hard, nonzero limits on power consumption. Performing meaningful computations with arbitrarily small power consumption is impossible in our universe.
You're repeatedly getting very basic facts about physics and computation wrong. I love talking about physics but I don't have the time or energy to keep debating these very basic concepts, so this will probably be my last reply.
No - because you didn't actually answer my question, and you are conflating the reversible computing issue with the stealth issue.
I asked:
The energy expended and entropy produced for cooling is proportional to the incoming radiation absorbed, correct? And this can be lowered arbitrarily with reflective shielding - or is that incorrect? Nothing whatsoever to do with stealth, the context of this discussion concerns only optimal computers.
Don't understand this - the theory on reversible computing says that energy expenditure is proportional to bit erasure, plus whatever implementation efficiency. The bit erasure cost varies with temperature, sure, but you could still theoretically have a reversible computer working at 100 K.
You seem to be thinking that approaching zero energy production requires zero temperature - no. Low temperature reduces the cost of bit erasure, but bit erasure itself can also be reduced to arbitrarily low levels with algorithmic level recycling.
Which are?
Such as? Am I incorrect in the assumption that the cost of active cooling is proportional to the temperature or entropy to remove and thus the incoming radiation absorbed - and thus can be reduced arbitrarily with shielding?
Limit: External surface area of computer times σT^4.
As for active cooling, I think the burden of proof here is on you to present a viable system and the associated calculations. How much energy does it take to keep e.g. a sphere of a certain radius cold?
The thermal power you quoted is the perfect black body approximation. For a grey body, the thermal power is:
P = eσAT^4

where e is the material-specific emissivity coefficient and σ is the Stefan-Boltzmann constant; the same rule holds for absorption.
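The grey-body relation is straightforward to evaluate; a quick sketch with illustrative numbers (a 1 m^2 surface at 300 K, values not taken from the thread):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def grey_body_power(emissivity, area_m2, temp_k):
    """Thermal power radiated by a grey body: P = e * sigma * A * T^4."""
    return emissivity * SIGMA * area_m2 * temp_k ** 4

print(grey_body_power(1.0, 1.0, 300.0))  # black body: about 459 W
print(grey_body_power(0.1, 1.0, 300.0))  # grey body with e = 0.1: about 46 W
```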
You seem to be implying that for any materials, there is a fundamental physical law which requires that absorption and emission efficiency is the same - so that a reflector which absorbs only e% of the incoming radiation is also only e% efficient at cooling itself through thermal emission.
Fine - even assuming that is the case, there doesn't seem to be any hard limit to reflective efficiency. A hypothetical perfect whitebody which reflects all radiation perfectly would have no need of cooling by thermal emission - you construct the object (somewhere in deep space away from stars) and cool it to epsilon above absolute zero, and then it will remain that cold for the duration of the universe.
There is also ongoing research into zero-index materials that may exhibit 'super-reflection'.
If we can build super-conductors, then super-reflectors should be possible for advanced civs - a super conductor achieves a state of perfect thermal decoupling for electron interactions, suggesting that exotic material states could achieve perfect thermal decoupling for photon interactions.
So the true physical limit is for a perfect white body with reflectivity 1. The thermal power and entropy absorbed is zero, no active cooling required.
Furthermore, it is not clear at all that reflection efficiency must always equal emission efficiency.
Wikipedia's article on the Stefan-Boltzmann Law hints at this:
What do you make of that?
Also - I can think of a large number of apparent counter-examples to the rule that reflection and emission efficiency must be tied.
How do we explain greenhouse warming of the earth, snowball earth, etc.? The temperature of the earth appears to mainly depend on its albedo, and the fraction of incoming light reflected doesn't appear to be intrinsically related to the fraction of outgoing light, with separate mechanisms affecting each.
Or just consider a one-way mirror: it reflects light in one direction, but is transparent in the other. If you surround an object in a one-way mirror (at CMB infrared/microwave wavelengths) - wouldn't it stay very cold as it can emit infrared but is protected from absorbing infrared? Or is this destined to fail for some reason?
I find nothing in the physics you have brought up to rule out devices with long term temperatures much lower than 2.7K - even without active cooling. Systems can be out of equilibrium for extremely long periods of time.
Again, you're getting the fundamental and basic physics wrong. You've also evaded my question.
There is no such thing as a perfect whitebody. It is impossible. All those examples you mention are for narrow-band applications. Thermal radiation is wideband and occurs over the entire electromagnetic spectrum.
The piece in the Wikipedia article links to papers such as http://arxiv.org/pdf/1109.5444.pdf in which thermal radiation (and absorption) are increased, not decreased!
Greenhouse warming of the Earth is an entirely different issue and I don't see how it's related. The Earth's surface is fairly cold in comparison to the Sun's.
One-way mirrors do not exist. http://web.archive.org/web/20050313084618/http://cu.imt.net/~jimloy/physics/mirror0.htm What are typically called 'one-way mirrors' are really just ordinary two-way partially-reflective mirrors connecting two rooms where one room is significantly dimmed compared to the other.
Well, firstly, you have to cool it down to below 2.7K in the first place. That most certainly requires 'active cooling'. Then you can either let it slowly equilibrate or keep it actively cold. But then you have to consider the carnot efficiency of the cooling system (which dictates energy consumption goes up as e/Tc, where Tc is the temperature of the computer and e is the energy dissipated by the computer). So you have to consider precisely how much energy the computer is going to use at a certain temperature and how much energy it will take to maintain it at that temperature.
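The Carnot bound mentioned above can be written out explicitly: pumping heat Q out of a cold reservoir at Tc into an environment at Th requires at least W = Q * (Th - Tc) / Tc of work, which for Tc much smaller than Th behaves like Q * Th / Tc. A minimal sketch (the 1 W heat load is an illustrative assumption):

```python
def cooling_work(heat_load_w, t_cold_k, t_hot_k):
    """Minimum (Carnot-limited) work to pump heat_load_w out of a reservoir
    at t_cold_k into an environment at t_hot_k: W = Q * (Th - Tc) / Tc."""
    return heat_load_w * (t_hot_k - t_cold_k) / t_cold_k

# 1 W dissipated by a computer held at 1 K, dumped to the 2.7 K background:
print(cooling_work(1.0, 1.0, 2.7))   # 1.7 W of cooling work
print(cooling_work(1.0, 0.1, 2.7))   # 26 W -- colder operation is far costlier
```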
EDIT: You've also mentioned in that thread you linked that "Assuming large scale quantum computing is possible, then the ultimate computer is thus a reversible massively entangled quantum device operating at absolute zero." Well, such a computer would not only be fragile, as you said, but it would also be impossible in the strong sense. It is impossible to reach absolute zero because doing so would require an infinite amount of energy: http://io9.com/5889074/why-cant-we-get-down-to-absolute-zero . For the exact same reason, it is impossible to construct a computer with full control over all the atoms. Every computer is going to have some level of noise and eventual decay.
Show instead of tell. I didn't yet answer your question about the initial energy cost of cooling the sphere because it's part of the initial construction cost, and you haven't yet answered my questions about reflectivity vs emission and how it relates to temperature.
Says what law - and more importantly - what is the exact limit then? Perfect super-conductivity may be impossible but there doesn't appear to be an intrinsic limit to how close one can get, and the same appears to apply for super-reflection. This whole discussion revolves around modeling technologies approaching said limits.
This helps my case - the incoming radiation is narrow-band microwave from the CMB. The outgoing radiation can be across the spectrum.
If the 'law' can be broken by materials which emit more than the law allows, this also suggests the 'law' can be broken in other ways, as in super-reflectors.
Ok.
If the earth's equilibrium temperature varies based on the surface albedo, this shows that reflectivity does matter and suggests a hypothetical super-reflector shielding for the CMB microwave could lead to lower than CMB temperatures. (because snow covering of the earth leads to lower equilibrium temperatures than a black-body at the same distance from the sun.)
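The textbook relation behind this claim is the equilibrium temperature of a rapidly rotating planet, T = ((1 - a) * S / (4σ))^(1/4), which assumes the body reflects fraction a of incoming sunlight but emits in the infrared roughly as a black body (i.e. its visible-band absorptivity and infrared emissivity differ). A sketch with Earth-like numbers:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def equilibrium_temp_k(solar_flux, albedo):
    """Equilibrium temperature of a rapidly rotating planet that reflects
    `albedo` of sunlight but emits thermally as a black body:
    T = ((1 - albedo) * S / (4 * sigma)) ** 0.25."""
    return ((1.0 - albedo) * solar_flux / (4.0 * SIGMA)) ** 0.25

S = 1361.0  # solar flux at Earth's orbit, W/m^2
print(equilibrium_temp_k(S, 0.0))  # dark planet: about 278 K
print(equilibrium_temp_k(S, 0.3))  # Earth-like albedo: about 255 K, colder
```

Higher albedo does give a lower equilibrium temperature here, precisely because the reflection (visible) and emission (infrared) bands are decoupled.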
Do you? I'm not clear on that - you haven't answered the earth counter example, which seems to show that even without active cooling, all it takes is albedo/reflectivity for an object's equilibrium temperature to be lower than that of a black body in the same radiation environment. Is there something special about low temps like 2.7k?
Apparently coherence in current quantum computers requires millikelvin temperatures, which is why I'm focusing on the limits approaching 0 K. And from what I understand this is fundamental - as the limits of computing involve very large, long-lived coherent states only possible at temperatures approaching 0.
If we weren't considering quantum computing, then sure, I don't see any point to active cooling below 2.7 K. The energy cost of bit erasures is ~CTc for some constant C, but the cooling cost goes as e/Tc. So this effectively cancels out - you don't get any net energy efficiency gain for cooling below the background temperature. (Of course, access to black holes much colder than the CMB changes that.)
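The cancellation claimed here falls out of combining Landauer's bound with the Carnot cooling cost: erasing a bit at Tc costs k*Tc*ln2, and pumping that heat to an environment at Th costs an extra factor (Th - Tc)/Tc, so the total is k*Th*ln2 regardless of Tc. A sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def total_erasure_cost(t_cold, t_hot):
    """Total work per erased bit for a computer at t_cold dumping heat to an
    environment at t_hot: Landauer cost k*Tc*ln2 plus Carnot cooling work
    k*Tc*ln2 * (Th - Tc)/Tc. The sum is k*Th*ln2, independent of t_cold."""
    landauer = K_B * t_cold * math.log(2)
    cooling = landauer * (t_hot - t_cold) / t_cold
    return landauer + cooling

# Same total whether the computer runs at 0.1 K or 1 K:
print(total_erasure_cost(0.1, 2.7))
print(total_erasure_cost(1.0, 2.7))
```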
Yes - but again we are discussing limits analysis where said quantities approach zero, or infinity or whatever.
You can trivially prove this for yourself. High-energy gamma rays cannot be completely reflected by matter. All thermal radiation contains some high-energy gamma rays. Thus no material can perfectly reflect thermal radiation. QED.
No it's not. CMB radiation spans the entire EM spectrum. Thermal radiation is almost the exact opposite of narrow-band radiation.
It's not really broken though. It's just that radiation in these materials happens through mechanisms beyond conventional blackbody radiation. A common LED emits radiation far in excess of its thermal radiation. This doesn't mean that Stefan-Boltzmann is 'broken', it just means that an extra emission mechanism is working. A mechanism that requires free energy to run (unlike normal thermal radiation which requires no free energy). And sure enough, if you read that paper the extra mechanism requires extra free energy.
But you can't use an extra emission mechanism to reduce the emitted radiation.
You keep making this same mistake. Thermal equilibrium temperature does not depend on surface reflectivity. https://www.researchgate.net/post/Is_it_possible_to_distinguish_thermal_bodies_in_equilibrium/1
This is a very basic physics error.
It makes no difference what type of computing you're considering. I suggest reading http://arxiv.org/pdf/quant-ph/9908043.pdf
Specifically, the limiting factor is not temperature at all but error rate of your computer hardware, quantum or not. The ultimate limit to efficiency is set by the error rate, not the temperature at which you can cool the system to.
For any system, even exotic? By what law? A simple google search seems to disagree - gamma rays are reflected today, in practice, (albeit with difficulty and inefficiently) by multilayer reflectors.
The vast majority of the energy peaks in microwave frequencies, but fine yes there is always some emission in higher frequencies - practical shielding would be complex and multilayer.
You keep bringing this up, but you can't explain how it applies to some basic examples such as the earth. How can you explain the fact that the temperature of planets such as earth, venus varies greatly and depends mostly on their albedo? Is it because the system is not in equilibrium? Then who cares about equilibrium? It almost never applies.
If the earth/sun system is not in equilibrium, then my hypothetical reflective object somewhere in deep space receiving radiation only from the CMB is certainly not in equilibrium either.
And finally the universe itself is expanding and is never in equilibrium - the CMB temperature is actually decaying to zero over time.
Until I see a good explanation of planetary albedo and temperature, I can't take your claim of "basic physics mistake" seriously.
Read that of course, and I'd recommend some of Mike Frank's stuff over it. Obviously the energy cost of bit erasure is the same for all types of computing. Quantum computing is different only in having much lower error/noise/temperature tolerances due to decoherence issues.
These are directly linked.
Heat is just thermal noise. And noise and errors are fundamentally the same - uncertainty over states that can explode unless corrected. The error rate for the most advanced computers is absolutely limited by thermal noise (and quantum noise).
This is trivially obvious at extremes - i.e. the error rate of a computer at 10000 K is 100% for most materials. The lowest error rates are only achievable by exotic matter configurations at very low temperatures.
The idealized perfect computer is one with zero entropy - i.e. every quantum state stores meaningful information, and every transition at every time step is a planned computation.
Looking at it another way, using devices and transitions larger than the absolute physical limits is just an easy way to do error correction to handle thermal noise.
I still can't understand why you think the Earth system is representative here... are you asking why the Earth isn't the same temperature as the Sun? Or the same temperature as the background of space? Because if you remove any one, it would equilibrate with the other. But you're proposing to put your system in deep space where there is only the background. If you did that to Earth, you'd find it would very rapidly equilibrate to close to 2.7 K, and the final temperature is irrespective of surface albedo.
Albedo doesn't have any relationship with final temperature. Only speed at which equilibrium is reached.
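This rate-vs-destination distinction can be illustrated with a toy simulation under the assumption that Kirchhoff's law holds (absorptivity equals emissivity at the same wavelengths). The heat capacity, area, and step sizes below are arbitrary illustrative values:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def equilibrate(t0, t_env, emissivity, heat_capacity, area, dt, steps):
    """Euler integration of a grey body exchanging thermal radiation with an
    environment at t_env. With absorptivity = emissivity (Kirchhoff), the
    emissivity scales the *rate* of approach, but the fixed point where
    absorption balances emission is always t_env itself."""
    t = t0
    for _ in range(steps):
        net_power = emissivity * SIGMA * area * (t_env ** 4 - t ** 4)
        t += net_power * dt / heat_capacity
    return t

# A shiny (low-emissivity) body and a dull one both end up at 2.7 K;
# the shiny one just gets there more slowly.
print(equilibrate(5.0, 2.7, 1.0, 1e-3, 1.0, 1.0, 20000))
print(equilibrate(5.0, 2.7, 0.1, 1e-3, 1.0, 1.0, 20000))
```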
Again, I don't feel like I have to 'explain' anything here... perhaps you could explain, in clearer terms, why you think it bears any relationship to the system we are discussing?
It's great that you've read those, unfortunately it seems you haven't understood them at all.
Not in the way you probably think. Error rate depends on hardware design as well as temperature. You're confusing a set of concepts here. As errors are generated in the computation, the entropy (as measured internally) will increase, and thus the heat level will increase. If this is what you are saying, you are correct. But the rate of generation of these errors (bits/s) is not the same as the instantaneous system entropy (bits) - they're not even the same unit! You could have a quantum computer at infinitesimally low temperature and it would still probably generate errors and produce heat.
This is really just another way of saying that your computer is not 100% reversible (isentropic). This is because of inevitable uncertainties in construction (is the manufacturing process that created the computer itself a perfectly error-free computer? If so, how was the first perfectly error-free computer constructed?), uncertainties in the physics of operation, and inevitable interaction with the outside world. If you claim you can create a perfectly isentropic computer, then the burden of proof is on you to demonstrate such a system. You can't expect me to take it on faith that you can build a perfectly reversible computer!