I suppose the question is: what percentage of the star's outgoing energy can we harness, in principle, such that the waste heat is hard to tell apart from the background?
You can never have the outgoing radiation at a temperature indistinguishable from the cosmic background: energy is being generated by the star, and in equilibrium more energy must leave than enters from the background, so the radiators must always run at least slightly hotter than the CMB.
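Putting that energy balance in symbols (with $\sigma$ the Stefan-Boltzmann constant, $A$ the total radiating area, and $L_\star$ the stellar luminosity the radiators must dump):

$$\sigma T_{\mathrm{rad}}^4 \, A = \sigma T_{\mathrm{CMB}}^4 \, A + L_\star \quad\Rightarrow\quad A = \frac{L_\star}{\sigma\left(T_{\mathrm{rad}}^4 - T_{\mathrm{CMB}}^4\right)},$$

which only has a finite solution for $T_{\mathrm{rad}} > T_{\mathrm{CMB}}$; as $T_{\mathrm{rad}} \to T_{\mathrm{CMB}}$, the required area diverges.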
The CMB reads as ~2.73 K. Say you wanted to radiate the entire energy output of a star at 3.73 K, so that the flux out equals the star's flux plus the incoming CMB flux. For a star like the Sun, the radiating surface required works out to a sphere about a fifth of a light year (~13,000 AU) in radius to dissipate one solar luminosity. Scale the quantity of radiator material by the fraction of the solar luminosity you actually want to dissipate, but keep in mind that placing the radiators much closer than a fifth of a light year would probably be pointless, since the star itself would heat them above the target temperature. Also keep in mind that such an object would look as large as the full Moon from 44 light years away, and as wide as Jupiter appears in our sky from 2,000 light years away. For a 10 K radiator, the sphere shrinks to 0.025 light years (~1,560 AU) in radius; for 50 K, to 62 AU (twice the radius of Neptune's orbit).
Of course, there's also the starlight flux from all the other nearby stars, which makes this worse at very low temperatures.
(Calculations done using energy out = energy in from the CMB + stellar flux, together with the Stefan-Boltzmann law for blackbody radiation.)
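Here's a minimal Python sketch of that energy balance, which reproduces the figures above (the constants are standard values; the 3.73 K case is the "one kelvin above the CMB" radiator):

```python
import math

SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W / (m^2 K^4)
L_SUN = 3.828e26        # solar luminosity, W
T_CMB = 2.73            # CMB temperature, K
AU = 1.496e11           # astronomical unit, m
LIGHT_YEAR = 9.461e15   # light year, m

def radiator_radius(t_rad, luminosity=L_SUN):
    """Radius of a spherical blackbody shell that dissipates `luminosity`
    at temperature t_rad, from sigma*T_rad^4*A = sigma*T_CMB^4*A + L."""
    net_flux = SIGMA * (t_rad**4 - T_CMB**4)  # W/m^2 radiated above CMB input
    area = luminosity / net_flux              # total shell area, m^2
    return math.sqrt(area / (4 * math.pi))    # m, from A = 4*pi*r^2

for t in (3.73, 10.0, 50.0):
    r = radiator_radius(t)
    print(f"T = {t:5.2f} K -> r = {r / LIGHT_YEAR:.3f} ly = {r / AU:,.0f} AU")
```

Running this prints roughly 0.21 ly (~13,000 AU), 0.025 ly (~1,550 AU), and 62 AU for the three temperatures, matching the numbers quoted above.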
EDIT: I should go over some astronomy papers and figure out what amounts of material typically produce observable infrared excesses.
This thread is for asking any questions that might seem obvious, tangential, silly or what-have-you. Don't be shy, everyone has holes in their knowledge, though the fewer and the smaller we can make them, the better.
Please be respectful of other people's admissions of ignorance and don't mock them for it, as they're doing a noble thing.
To any future monthly posters of SQ threads, please remember to add the "stupid_questions" tag.