bglass

It's not a take that I've thought about deeply, but could the evidence be explained by a technological advancement: the ability to hop between diverging universes?

  • It would explain why we don't see aliens: they discover the technology and find that empty parallel worlds are closer, in terms of energy expenditure, than interstellar destinations.

  • It could also explain why the interlopers don't bother us much; they are scouting for uninhabited parallel earths with easily-accessible resources, and skipping those with a population. The only ones we see are the ones incompetent or unlucky enough to crash.

  • It would explain why aliens aren't ridiculously outclassing us technologically. They don't have to solve interstellar travel before they start hopping.

  • It would provide an alternate explanation for why aliens 'look like us'; they are from timelines with varying amounts of divergence. (The default explanation of course being that we are primed to see humans everywhere, so our imagined monsters look human.)

I can easily think of a few arguments against this possibility.

  • If dimension hoppers aren't far ahead of us technologically, trading with us would have advantages. Why skip us instead of opening trade?

  • Technology would probably continue to advance. Hyper-advanced dimension hoppers should be more capable of scouting dimensions and of displacing populated worlds, and yet we don't see them. (Perhaps they are better at hiding, but then, they don't need to hide.)

  • Instead of 'where is the alien AI' we are now left with 'where is the divergent timeline AI'.

That last one in particular makes me think this explanation isn't likely. I'd expect rogue AI and self-replicating machines to be invading constantly.

bglass

There was a worldbuilding contest last year for writing short stories featuring AGI with positive outcomes. You may be interested in it, although it's undoubtedly propaganda of some sort.

If you write such a story, please link it.

These are not fables, so I apologize for that. However, I've written many short stories that are (not always obviously) about alignment and related topics. The Well of Cathedral is about trying to contain a threat that grows in power exponentially, Waste Heat is about unilateral action to head off a catastrophe causing a catastrophe of its own, and Flourishing is a romance between a human and an AI, but also about how AIs don't think like humans at all.

More than half my works are inadvertently about AI or alignment in some way or another... Dais 11, Dangerous Thoughts, The Only Thing that Proves You, and I Will Inform Them probably also count, as do I See the Teeth (though only tangentially, at the end) and Zamamiro (although that one's quality is notably poor).

I guess what I'm saying is, if there's ever a competition I'll probably write an entry, otherwise please check out my AO3 links above.

bglass

I think I see what you mean. A new AI won't be under the control of egregores. It will be misaligned to them as well. That makes sense.

bglass

I really appreciate your list of claims and unclear points. Your succinct summary is helping me think about these ideas.

There is no highly viral meme going around right now about producing tons of paperclips.

A few examples came to mind: sports paraphernalia, tabletop miniatures, and stuffed animals (which likely outnumber the real animals they depict by hundreds or thousands of times).

One might argue that these things give humans joy, so they don't count. There is some validity to that: AI paperclips are supposed to be useless to humans. On the other hand, one might also argue that it is unsurprising that subsystems repurposed to seek out paperclips derive some 'enjoyment' from the paperclips... but I don't think that argument holds water for these examples. Looking at it another way, some number of paperclips is indeed useful.

No egregore has turned the entire world to paperclips just yet. But of course that hasn't happened, else we would have already lost.

Even so: consider paperwork (like the tax forms mentioned in the post), skill certifications in the workplace, and things like slot machines and reality television. A lot of human effort is wasted on things humans don't directly care about, for non-obvious reasons. Those things could be paperclips.

(And perhaps some humans derive genuine joy out of reality television, paperwork, or giant piles of paperclips. I don't think that changes my point that there is evidence of egregores wasting resources.)

bglass

information vs truth

Thanks, that gets rid of most of my confusion.

Without additional cost, I'd definitely prefer to know what happens even if my favorite character might die.

For a different show, I would not care. Whether or not I value the information depends on the show, or the domain... How much I'm willing to pay for information, and by extension the truth, depends a lot on the thing about which I'm learning.

To me it looks like the thing itself is what is important, and my desire to have accurate beliefs stems from caring about the thing. It's not that I care about the accurate beliefs themselves, so much.

Even so, I don't want false beliefs about anything. All domains are one.

bglass

Hopefully that helps clarify what I'm talking about there.

It does. Those examples help a lot. Thank you!

Preferring anything over truth creates room for confusion.

We might be talking about preferring things over truth in two different ways.

If you prefer some alternative to the truth, the thing you prefer could be right or wrong. To the extent it's wrong, you are confusing yourself. I agree with that, and I think that's what you mean by 'preferring something over truth'.

What I meant is more like "How much effort I'm going to expend getting at this truth."

An (admittedly trivial) example: there's a TV show I like whose ending is only available to paying customers of a streaming service. I judged it not worth my money to buy the service to get the ending. In a sense, I value the money more than the truth of the show's ending.

An example with greater consequence: for most truths, I'm unwilling to sacrifice strangers' lives to learn them. Which isn't to say that lives are a sacred value that cannot be traded--just that most truths aren't worth that cost. In that sense, I value human lives above truth.

(That's probably a bad example, because one can so seldom trade human lives to learn truths, but alas. The first real-world situation that comes to mind is Covid-19 vaccine testing, and I'd absolutely let volunteers risk their lives for that.)

Does my view on how much effort to spend pursuing truth still lead to confusion? It might.

But maybe that 20-year marriage sounds way sweeter.

A person using my reasoning may think "I don't want to risk sacrificing my marriage to learn that particular truth." Depending on the marriage, maybe it would be worth it to hold back... although my intuition says a marriage based on lies won't be. And of course, to know whether it would be worth it or not means that you've got to risk sacrificing it. That way is closed to you.

I'm going to have to think about it more. I don't want to trade poorly.

Even so, one can't study everything. How does one choose which truths to pursue? Indeed, to bring it back around, how does one choose which biases to focus on correcting, and which to let go for now because trying to overcome them would only add clutter to one's mental processes?

Thanks for trying to figure that out, and responding so thoroughly.

bglass

I think the description of the Void in the twelve virtues is purposefully vague. Perhaps to shake the reader enough to get them to think for themselves, or perhaps to be a place for personal interjection into the twelve virtues.

You may try to name the highest principle with names such as “the map that reflects the territory” or “experience of success and failure” or “Bayesian decision theory”. But perhaps you describe incorrectly the nameless virtue. How will you discover your mistake? Not by comparing your description to itself, but by comparing it to that which you did not name.

It could also be a test of a sort. In any case, I didn't think it would be worthwhile to explain my conception of the Void.

bglass

Almost everything in this post sounds right to me.

These Drama Triangle patterns are everywhere. Utterly everywhere.

It doesn't seem that way to me; but then, what everywhere are you talking about?

I can see those patterns in argumentation online--a lot--and in a few dysfunctional people I know, and indeed in places in my own past. Among my current real-life friends, family, and coworkers, though, it doesn't seem like anyone relates to the others through those roles (at least not often enough to describe it as 'utterly everywhere').

Could the pattern be general enough to match very many circumstances? For example, one can respond to what happens in one's life combatively, cooperatively, or not at all. Thus any interaction can be mapped onto the triangle.

Perhaps I'm missing something. If it's just that few of the people in my life regularly have the victim mindset, I feel very fortunate.

This echoes the Virtue of the Void

I thought the Void was something different, but I also feel like I shouldn't try to explain how my conception of it differs.

At any rate, I agree strongly with the idea that we need to prune our mental processes and otherwise reduce the effort and cost of our attempts to be more rational. Mental noise is the source of much confusion. But I don't agree that Truth is the only thing that matters, or the ultimate thing that matters.

Which is something people always say right before they tell you to stop trying to find the Truth--and that's not my point at all! Keep pushing toward Truth! Nothing that you want is going to be accomplished without it. And if what you want changes as you learn more, so be it.

All I mean is that, in a technical sense, Truth is the penultimate value. It is fine to want things more than you want the Truth. The mistake is thinking you can get those things while discarding the Truth.

But that seems like a basic lesson... so what did you mean? It may be that every single time someone thinks what they want is at odds with the truth, they are wrong--is that what you meant?

Or perhaps, did you simply mean that getting at the truth requires unwavering devotion, far stronger than what people normally apply toward anything they want? I think that's also true.

I feel like I'm missing something again.

bglass

Perhaps it is carbon dioxide. Here is a paper on it:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3341709/

To summarize, the idea is that high partial pressure of CO2 causes blood pH to change, which influences the body's regulatory mechanisms and eventually leads to obesity.

At higher altitudes the carbon dioxide fraction in the air is unchanged, but the partial pressure is lower. I'd expect that lower partial pressures of CO2 would mean less effect on blood pH.
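
As a rough sketch of that arithmetic (the ~420 ppm CO2 fraction, the isothermal barometric formula, and the 8.4 km scale height are assumptions of mine, not numbers from the paper):

```python
import math

CO2_FRACTION = 420e-6           # CO2 mole fraction (~420 ppm), roughly constant with altitude
SEA_LEVEL_PRESSURE = 101_325.0  # total air pressure at sea level, Pa
SCALE_HEIGHT = 8_400.0          # isothermal-atmosphere scale height, m

def co2_partial_pressure(altitude_m: float) -> float:
    """CO2 partial pressure (Pa): mole fraction times total pressure from the barometric formula."""
    total_pressure = SEA_LEVEL_PRESSURE * math.exp(-altitude_m / SCALE_HEIGHT)
    return CO2_FRACTION * total_pressure

for altitude in (0, 1500, 3000):
    print(f"{altitude:>5} m: {co2_partial_pressure(altitude):4.1f} Pa")
# ~42.6 Pa at sea level vs ~29.8 Pa at 3000 m: same fraction, about 30% less CO2 partial pressure.
```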

bglass

I made an account for your challenge.

I plotted items by color and looked at minimums, maximums, and averages. Yellow items consistently provide just under twenty mana while blue items always provide about the same as the reading, except that they sometimes provide twenty extra. I was too lazy to try to figure out the red or green items. Given what I know, I can submit the blue items HoC, RoJ, and PoH to get about 140 mana for 101 gold, and call it a day.
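
A minimal sketch of that per-color aggregation, assuming a dataset with hypothetical "Color", "Reading", and "Mana" columns (I don't know the challenge's actual file or column names):

```python
import pandas as pd

# Hypothetical file and column names; substitute whatever the challenge provides.
items = pd.read_csv("items.csv")
items["Error"] = items["Mana"] - items["Reading"]  # how far off the device's reading is

# Per-color minimum, maximum, and average of mana provided and of the reading error.
summary = items.groupby("Color")[["Mana", "Error"]].agg(["min", "max", "mean"])
print(summary)
```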

However, I also noticed that no blue item provided over 60 mana. I will add in the yellow items PoP and WoJ for a margin of safety. Painfully, that brings the total to 177. Combined with the chance that other items will provide extra, it should be enough for some confidence.

The Wizard Wakalix did not inform me of the reliability of blue items; they merely called their device a liar. (Perhaps they aren't yet aware of the pattern because they don't utilize the arcane magic known as python.) I may as well include a letter that outlines my findings regarding yellow and blue items. That way, the next time Wakalix goes to the caravans they can make use of the information, and the next time they want to hire someone I will be their first thought.

If I were devious, I might try to include red and green items to obfuscate the reliability of blue for predicting mana cost; sending three blue items at once will surely let the cat out of the bag. However, if I expected deviousness, this setup would be an excellent way to deceive an errand-runner into providing free magic items. I'll count on the norms around this kind of work to prevent either of those outcomes. I may check with other errand runners to make sure Wakalix isn't running a scam, just in case.

Edit: Reading the comments made me notice that I mixed up the direction of the error: blue items read 20 over; they don't provide 20 over. Well. Good thing I included a margin.
