Alex_Altair

Is speed reading real? Or is it all just trading off against comprehension?

I am a really slow reader. If I'm not trying, it can be 150wpm, which is slower than talking speed. I think this is because I reread sentences a lot and think about stuff. When I am trying, it gets above 200wpm but is still slower than average.

So, I'm not really asking "how can I read a page in 30 seconds?". I'm more looking for something like, are there systematic things I could be doing wrong that would make me way faster?

One thing that confuses me is that I seem to be able to listen to audio really fast, usually 3x and sometimes 4x (depending on the speaker). It feels to me like I am still maintaining full comprehension during this, but I can imagine that being wrong. I also notice that, despite audio listening being much faster, I'm still not really drawn to it. I default to finding and reading paper books.

I just went through all the authors listed under "Some Writings We Love" on the LessOnline site and categorized what platform they used to publish. Very roughly:

Personal website: 
IIIII-IIIII-IIIII-IIIII-IIIII-IIIII-IIIII-IIII (39)
Substack: 
IIIII-IIIII-IIIII-IIIII-IIIII-IIIII- (30)
Wordpress: 
IIIII-IIIII-IIIII-IIIII-III (23)
LessWrong: 
IIIII-IIII (9)
Ghost: 
IIIII- (5)
A magazine: 
IIII (4)
Blogspot: 
III (3)
A fiction forum: 
III (3)
Tumblr: 
II (2)

"Personal website" was a catch-all for any site that seemed custom-made rather than a platform. But it probably contained a bunch of sites that were e.g. Wordpress on the backend but with no obvious indicators of it.

I was moderately surprised at how dominant Substack was. I was also surprised at how much market share Wordpress still had; it feels "old" to me. But then again, Blogspot feels ancient. I had never heard of "Ghost" before, and those sites felt pretty "premium".

I was also surprised at how many of the blogs were effectively inactive. Several of them hadn't posted since like, 2016.

Oh, sure, I'm happy to delete it since you requested. Although, I don't really understand how my comment is any more politically object-level than your post? I read your post as saying "Hey guys I found a 7-leaf clover in Ireland, isn't that crazy? I've never been somewhere where clovers had that many leaves before." and I'm just trying to say "FYI I think you just got lucky, I think Ireland has normal clovers."

[Deleted on request]

Rediscovering some math.

[I actually wrote this in my personal notes years ago. Seemed like a good fit for quick takes.]

I just rediscovered something in math, and the way it came out to me felt really funny.

I was thinking about startup incubators, and thinking about how it can be worth it to make a bet on a company that you think has only a one in ten chance of success, especially if you can incubate, y'know, ten such companies.

And of course, you're not guaranteed success if you incubate ten companies, in the same way that you can flip a coin twice and have it come up tails both times. The expected value is one, but the probability of at least one success is not one.

So what is it? More specifically, if you consider ten such 1-in-10 events, do you think you're more or less likely to have at least one of them succeed? It's not intuitively obvious which way that should go.

Well, if they're independent events, then the probability of all of them failing is $0.9^{10}$, or about $0.349$.

And therefore the probability of at least one succeeding is $1 - 0.9^{10} \approx 0.651$. More likely than not! That's great. But not hugely more likely than not.

(As a side note, how many events do you need before you're more likely than not to have one success? It turns out the answer is 7. At seven 1-in-10 events, the probability that at least one succeeds is 0.52, and at 6 events, it's 0.47.)
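The break-even point above is easy to check numerically. A minimal sketch in Python (the function name is my own, just for illustration):

```python
# Probability of at least one success among n independent trials,
# each with success probability p.
def p_at_least_one(n, p=0.1):
    return 1 - (1 - p) ** n

print(p_at_least_one(6))   # ~0.469, still less likely than not
print(p_at_least_one(7))   # ~0.522, now more likely than not
print(p_at_least_one(10))  # ~0.651
```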

So then I thought, it's kind of weird that that's not intuitive. Let's see if I can make it intuitive by stretching the quantities way up and down — that's a strategy that often works. Let's say I have a 1-in-a-million event instead, and I do it a million times. Then what is the probability that I'll have had at least one success? Is it basically 0 or basically 1?

...surprisingly, my intuition still wasn't sure! I would think, it can't be too close to 0, because we've rolled these dice so many times that surely they came up as a success once! But that intuition doesn't work, because we've exactly calibrated the dice so that the number of rolls is the same as the unlikelihood of success. So it feels like the probability also can't be too close to 1.

So then I just actually typed this into a calculator. It's the same equation as before, but with a million instead of ten. I added more and more zeros, and then what I saw was that the number just converges to somewhere in the middle.

If it were the 1300s, then this would have felt like some kind of discovery. But by this point, I had realized what I was doing, and felt pretty silly. Let's drop the "$1 -$", and look at this limit;

$$\lim_{n\to\infty}\left(1-\frac{1}{n}\right)^n$$

If this rings any bells, then it may be because you've seen this limit before;

$$\lim_{n\to\infty}\left(1+\frac{1}{n}\right)^n = e$$

or perhaps as

$$e^x = \lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n$$

The probability I was looking for was $1 - \frac{1}{e}$, or about 0.632.

I think it's really cool that my intuition somehow knew to be confused here! And to me this path of discovery was way more intuitive than just seeing the standard definition, or wondering about functions that are their own derivatives. I also think it's cool that this path made $1/e$ pop out on its own, since I almost always think of $e$ in the context of an exponential function, rather than as a constant. It also makes me wonder if $1/e$ is more fundamental than $e$. (Similar to how $\tau$ is more fundamental than $\pi$.)
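The whole convergence can be verified in a few lines; the specific values of $n$ below are just for illustration:

```python
import math

# 1 - (1 - 1/n)^n for growing n: the probability of at least one
# success in n independent 1-in-n trials, compared to 1 - 1/e.
for n in (10, 1_000, 1_000_000):
    print(n, 1 - (1 - 1/n) ** n)

print("limit:", 1 - 1/math.e)  # ~0.6321
```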

we only label states as 'different' if they actually result in different controller behaviour at some point down the line.

This reminds me a lot of the coarse-graining of "causal" states in comp mech.

I got a ton of value from ILIAD last year, and strongly recommend it to anyone interested!

For anyone reading this comment thread in the future, Dalcy wrote an amazing explainer for this paper here.
