Ann3d10

Thanks for the reference! I'm definitely confused about the inclusion of "pre-prepared (packaged) meat, fish and vegetables" on the last list, though. Does cooking meat or vegetables before freezing it (rather than after? I presume most people aren't eating meat raw) actually change its processed status significantly?

Ann4d10

Suppose my intuition is that the 'conscious experience' of 'an iPhone' varies based on what software is running on it. If it could run a thorough emulation of an ant and have its sensory inputs channeled to that emulation, it would be more likely to have conscious experience in a meaningful-to-me way than if nobody bothered (presuming ants do implement at least a trivial conscious experience).

(I guess that there's not necessarily something that it's like to be an iPhone by default, but the hardware complexity could theoretically support an iAnt, which there is something that it's like to be?)

Ann4d32

That certainly seems distinct from brain mass, though (except that it takes a certain amount to implement in the first place). I'd expect similar variation in feeling pain from inhabiting different human neurologies; I know there are many reported variations in perceived pain within our species already.

Ann4d10

But that's in the limit. A function assigning electron = 0, ant = 1, cockroach = 4, mouse = 300 fits it just as well as electron = 0, ant = 1, cockroach = 2, mouse = 2^75, as does electron = 0, ant = 100, cockroach = 150, mouse = 200.
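To make the underdetermination concrete, here is a minimal sketch (using the comment's illustrative numbers, which are hypothetical) showing that all three assignments satisfy the same ordering constraint while implying wildly different ratios between minds:

```python
# Three illustrative intensity assignments from the comment above.
candidates = [
    {"electron": 0, "ant": 1, "cockroach": 4, "mouse": 300},
    {"electron": 0, "ant": 1, "cockroach": 2, "mouse": 2**75},
    {"electron": 0, "ant": 100, "cockroach": 150, "mouse": 200},
]
order = ["electron", "ant", "cockroach", "mouse"]

def fits_ordering(weights):
    """True if intensity strictly increases along the presumed complexity order."""
    vals = [weights[k] for k in order]
    return all(a < b for a, b in zip(vals, vals[1:]))

print([fits_ordering(c) for c in candidates])       # -> [True, True, True]
print([c["mouse"] / c["ant"] for c in candidates])  # ratios differ enormously
```

The ordering constraint alone admits all three; only an extra assumption about the shape of the function picks one out.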

Ann4d20

"Moral weights depend on intensity of conscient experience." - Just going to note that I've no particular reason to concede this point at the moment, so don't directly consider the next question a question of moral weight; I'd rather disassociate it first:

Is there ... any particular reason to expect intensity of conscious experience to grow 'super-additively', such that a tiny conscious mind experiences 1 intensity unit, but a mind ten times as large experiences (since you reject linear, we'll step up to the exponential) 1024 intensity units? Given our general inability to exist as every mass of brain, what makes this more intuitive than no, marginal, or linear increase in intensity?
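The alternatives being contrasted can be sketched as scaling rules for intensity against relative mind size m, normalized so a tiny mind (m = 1) experiences 1 unit. These rules are hypothetical illustrations, not established facts; with this normalization the exponential rule gives 512 at m = 10, while the comment's 1024 is 2^10, a different normalization of the same exponential family:

```python
import math

# Hypothetical scaling rules for intensity vs. relative mind size m,
# normalized so that f(1) == 1. None of these is privileged by the data.
scalings = {
    "none (constant)": lambda m: 1.0,
    "marginal (log)": lambda m: 1.0 + math.log(m),
    "linear": lambda m: float(m),
    "exponential": lambda m: 2.0 ** (m - 1),  # doubling per unit of size
}

for name, f in scalings.items():
    print(f"{name}: m=1 -> {f(1):g}, m=10 -> {f(10):g}")
```

The question in the comment is precisely which row, if any, our intuitions should endorse.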

Personally, I would be actively surprised to spend time as a lower-brain-mass conscious animal and report that my experiences were (exceptionally) less intense. Why do our intuitions differ on this?

Ann4d21

Yes, but also that there might not actually be a specific new thing, a detrimental thing, to gesture at.

If root causes of obesity existed all along, and changes in the modern Western diet revealed the potential for obesity in our region rather than actively causing it, looking for root causes specifically in things that have changed may not work out if the things that have changed are not the root causes.

(I.e., it's a seemingly useful constraint on looking at the solution space, that might not be true -- and not so useful a constraint if it isn't.)

Ann5d21

You don't actually have to make any adjustments to the downsides for beneficial statistical stories to be true. One point I was getting at, specifically, is that it is also better than being dead or suffering in specific alternative ways. There can be real and clear downsides to carrying around significant amounts of weight, especially depending on what that weight is, and it can still be present in the data in the first place for good reasons.

I'll invoke the meme of the plane that comes back riddled with bullet holes, so you armor where the bullet holes are. The plane that came back still came back; we armored the worst places, and now its other struggles are visible. It's not a negative trend that we have more planes with damage now than we did when they didn't come back.

I do think it's relevant that the U.S. once struggled with nutritional deficiencies with corn, answered with enriched and fortified products that helped address those, and likely still retains some of the root issues (that our food indeed isn't as nutritious as it should be, outside those enrichments). That the Great Depression happened at all; and the Dust Bowl. There's questions here not just of personal health, but of history; and when I look at some of the counterfactuals, given available resources, I see general trade-offs that can't be ignored when looking at - specifically - the statistics.

Ann8d10

Raw spinach in particular also has high levels of oxalic acid, which can interfere with the absorption of other nutrients and cause kidney stones when it binds with calcium. Processing spinach by cooking can significantly reduce the oxalic acid concentration and its impact without reducing the other nutrients in the spinach as much.

Grinding and blending foods is itself processing. I don't know what impact it has on nutrition, but mechanically speaking, you can imagine digestion proceeding differently depending on how much of it has already been done.

You do need a certain amount of macronutrients each day, including some from fat. You also don't necessarily want to overindulge in every micronutrient. Suppose we put a number of olives in our salad equivalent to the amount of olive oil we'd otherwise use: say, 100 olives at 4 g each, with the sodium lowered by some means to keep that reasonable. That's 72% of the recommended daily value of iron and 32% of calcium. We just mentioned that spinach + calcium can be a problem; and the pound of spinach itself contains 67% of our iron and 45% of our calcium.

... That's also 460 calories worth of olives. I'm not sure if we've balanced our salad optimally here. Admittedly, if I'm throwing this many olives in with this much spinach in the first place, I'm probably going to cook the spinach, throw in some pesto and grains or grain products, and then I've just added more olive oil back in again ... ;)
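As a quick sanity check on that arithmetic, here is a sketch using approximate per-100 g figures for ripe olives and raw spinach. The nutrient values are my assumptions (roughly in line with common food-composition tables, not the commenter's sources), so the results land near, not exactly on, the percentages quoted above:

```python
# Assumed per-100 g figures: ripe olives ~115 kcal, ~3.3 mg iron, ~88 mg calcium;
# raw spinach ~2.7 mg iron, ~99 mg calcium. Daily values: 18 mg iron, 1000 mg calcium.
# All of these are approximations for illustration only.
olives_g = 100 * 4   # 100 olives at 4 g each
spinach_g = 454      # one pound

olive_cal = olives_g / 100 * 115
olive_iron_pct = (olives_g / 100 * 3.3) / 18 * 100
olive_ca_pct = (olives_g / 100 * 88) / 1000 * 100
spin_iron_pct = (spinach_g / 100 * 2.7) / 18 * 100
spin_ca_pct = (spinach_g / 100 * 99) / 1000 * 100

print(f"olives: {olive_cal:.0f} kcal, {olive_iron_pct:.0f}% iron DV, {olive_ca_pct:.0f}% calcium DV")
print(f"spinach: {spin_iron_pct:.0f}% iron DV, {spin_ca_pct:.0f}% calcium DV")
```

The 460-calorie figure falls straight out of the assumed ~115 kcal/100 g; the iron and calcium percentages come out within a few points of the numbers in the comment.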

And yeah, greens with oil might taste better or be easier to eat than greens just with fatty additions like nuts, seeds, meat, or eggs. 

Ann8d10

For the first point, there's also the question of whether 'slightly superhuman' intelligences would actually fit any of our intuitions about ASI. There's an assumption that we jump headfirst into recursive self-improvement at some point, but if that has diminishing returns, we may hit a plateau a bit above human level. If such a system still has notable costs to train, host, and run, its impact could be limited to something not much unlike giving a random set of especially intelligent expert humans the specific powers of the AI system. Additionally, if we happen to set regulations on computation somewhere that allows training of slightly superhuman AIs and not past it ...

Those are definitely systems that are easier to negotiate with, or even consider as agents in a negotiation. There's also a desire specifically not to build them, which might lead to systems with an architecture that isn't like that, but still implementing sentience in some manner. And there's the potential complication of the multiple parts and specific applications a tool-oriented system is likely to be embedded in: it'd be very odd if we decided the language-processing center of our own brain was independently sentient/sapient, separate from the rest of it, and that we should resent its exploitation.

I do think the drive, the 'just a thing it does' we're pointing at with 'what the model just does', is distinct from goals as they're traditionally imagined, and indeed I was picturing something more instinctual and automatic than deliberate. In a general sense, though, there is an objective being optimized for: predicting the data, whatever that is, generally without losing too much predictive power on other data the trainer doesn't want to lose prediction on.

Ann8d60

"Clearly we are doing something wrong."

I'm going to do a quick challenge to this assumption, also: What if we, in fact, are not?

What if the healthy weight for an American individual has actually increased since the 1920s, and the distribution followed it? Alternately, what if the original measured distribution of weights is not what was healthy for Americans? What if the additional proportion of specifically 'extreme' obesity is related to better survival of disability that makes avoiding weight gain infeasible, or medications that otherwise greatly improve quality of life? Are there mechanisms by which this could be a plausible outcome of statistics that are good, and not bad?
