Xachariah

Comments
Why CFAR's Mission?
Xachariah · 10y · 20

This is my main question. I've never seen anything to imply that multi-day workshops are effective methods of learning. Going further, I'm not sure how Less Wrong supports Spaced Repetition and Distributed Practice on one hand, while also supporting an organization whose primary outreach seems to be crash courses. It's like Less Wrong is showing a forum-wide cognitive dissonance that nobody notices.

That leaves a few options:

  • I'm wrong (though I consider it highly unlikely).
  • CFAR never bothered to look it up, or uses self-selection to convince itself it's effective.
  • CFAR is trying to optimize for something other than spreading rationality, but isn't actually saying what.
Stupid Questions May 2015
Xachariah · 10y · 40

I didn't mean to imply nonlinear functions are bad. It's just how humans are.

Picking gambles 1A and 2B, on the other hand, cannot be described by any utility function.

Prospect Theory describes this and even has a post here on lesswrong. My understanding is that humans have both a non-linear utility function as well as a non-linear risk function. This seems like a useful safeguard against imperfect risk estimation.
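The nonlinear risk function Prospect Theory posits can be sketched concretely. Below is a minimal implementation of the Tversky–Kahneman probability-weighting function (one common parameterization; the γ = 0.61 default is their published median estimate, and the specific numbers printed are illustrative only). It shows how tiny probabilities get overweighted and near-certainties get underweighted, which is exactly the pattern behind the Allais choices:

```python
# Tversky-Kahneman (1992) probability-weighting function:
# w(p) = p^g / (p^g + (1-p)^g)^(1/g)
# Small probabilities are overweighted; large ones are underweighted.
def weight(p: float, gamma: float = 0.61) -> float:
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

# A 1% risk "feels" several times larger than its true probability,
# while a 99% chance feels noticeably less than certain.
print(weight(0.01))  # well above 0.01
print(weight(0.99))  # below 0.99
```

This asymmetry is why eliminating the last sliver of risk (going from 1% to 0%) is worth far more to people than an equivalent reduction elsewhere (say, 11% to 10%).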

[Insurance is] not a Dutch Book in the usual sense: it doesn't guarantee either side a profit.

If you set up your books correctly, then it is guaranteed. A Dutch book doesn't need to work with only one participant; in fact, many Dutch books only work on populations rather than individuals, in the same way insurance only guarantees a profit when properly spread across a group.
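A toy expected-value calculation (all figures hypothetical) makes the population-level point concrete: each policyholder pays a premium above their expected loss, so the insurer's edge is positive per policy and compounds across the pool, even though no single policy guarantees a profit:

```python
# Hypothetical policy: 1% chance of a $100,000 loss, $1,500 premium.
loss_prob = 0.01
loss = 100_000
premium = 1_500

expected_loss = loss_prob * loss        # expected payout per policyholder
insurer_edge = premium - expected_loss  # positive margin per policy

# Over a pool of policies the edge compounds in expectation, and the
# insurer's per-policy variance shrinks (law of large numbers).
pool = 10_000
expected_profit = pool * insurer_edge

# Each policyholder accepts a negative expected value in exchange for
# eliminating a rare, catastrophic outcome.
policyholder_ev = expected_loss - premium
print(expected_profit, policyholder_ev)
```

The individual's side of the trade is negative in expectation but hugely variance-reducing, which is rational under a nonlinear utility or risk function.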

Stupid Questions May 2015
Xachariah · 10y · 30

The point of the Allais paradox is less about how humans violate the axiom of independence and more about how our utility functions are nonlinear, especially with respect to infinitesimal risk.

There is an existing Dutch Book for eliminating infinitesimal risk, and it's called insurance.
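For concreteness, the standard Allais gambles (payoffs in millions; this is the textbook setup, not anything specific to the comment above) are: 1A = $1M for sure; 1B = 89% $1M, 10% $5M, 1% nothing; 2A = 11% $1M; 2B = 10% $5M. The sketch below verifies, for several sample utility functions, that EU(1A) − EU(1B) and EU(2A) − EU(2B) are algebraically identical, so no expected-utility maximizer can prefer both 1A and 2B:

```python
import math

def eu(lottery, u):
    """Expected utility of a lottery [(prob, payoff_in_millions), ...]."""
    return sum(p * u(x) for p, x in lottery)

g1a = [(1.00, 1)]
g1b = [(0.89, 1), (0.10, 5), (0.01, 0)]
g2a = [(0.11, 1), (0.89, 0)]
g2b = [(0.10, 5), (0.90, 0)]

# Both differences reduce to 0.11*u(1) - 0.10*u(5) - 0.01*u(0),
# whatever u is, so an expected-utility maximizer must rank both
# pairs the same way. Choosing 1A and 2B violates this.
for u in (math.sqrt, lambda x: x, lambda x: math.log1p(x)):
    d1 = eu(g1a, u) - eu(g1b, u)
    d2 = eu(g2a, u) - eu(g2b, u)
    assert math.isclose(d1, d2)
```

The human preference for 1A-and-2B is what requires a nonlinear probability weighting on top of the utility function.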

Stupid Questions January 2015
Xachariah · 11y · 00

You may be interested in the term 'inverted classroom', if you're not already aware of it.

The basic idea is that it's the normal school system you grew up with, except students watch video lectures as homework, then do all work in class while they've got an expert there to help. Also, the time when the student is stuck in one place and forced to focus is when they're actually doing the hard stuff.

There are so many reasons why it's better than traditional education. I just hope inverted classrooms catch on sooner rather than later.

(Edit: I know this isn't your exact proposal, but it uses many of the features you mention and it can be immediately grafted into the existing public school system with a single change of curriculum and the creation of some videos. It's the low hanging fruit for education.)

Stupid Questions (10/27/2014)
Xachariah · 11y · 10

Anecdotally, someone close to me did one of those, and it was a quick way to burn thousands of dollars.

I tried to dissuade them, but in the end they came back with less knowledge of the subject than I had, and all I did was follow some YouTube tutorials and read Stack Overflow to create a couple of learning apps for Android.

Ethical frameworks are isomorphic
Xachariah · 11y · 10

All ethical frameworks are equal the same way that all graphing systems are equal.

But I'll be damned if it isn't easier to graph circles with polar coordinates than it is with Cartesian coordinates.

Open thread, 11-17 August 2014
Xachariah · 11y · 20

You don't need to upvote them necessarily. Just flip a coin.

If you downvote them too, then it just looks like they made a bad post.

Harry Potter and the Methods of Rationality discussion thread, July 2014, chapter 102
Xachariah · 11y · 60

Transfiguration sickness isn't because things turn into poison. Your body goes into a transfigured state, minor changes occur, and when you come back from that state things are different. It'd be tiny things. Huge problems would cause you to die instantly, but little transcription errors would kill you in the timeframe described.

E.g., your veins wouldn't match up right. The DNA in your cells would be just a little bit off and you'd get spontaneous cancer throughout your entire body. Some small percentage of neurotransmitters and hormones would be transformed into slightly different ones, etc. None of that would be contagious or even harmful to somebody consuming it, but to the animal itself it'd be devastating.

Also remember that once the transfiguration reverts and you're back to yourself, you're in a stable state. The only issue is that you're not back together perfectly. Quirrell would only get sick if he drank the blood while it was transfigured and then it changed form while inside of him.

Harry Potter and the Methods of Rationality discussion thread, July 2014, chapter 102
Xachariah · 11y · 100

Death iss not truly gainssaid. Real sself is losst, as you ssay. Not to my pressent tasste. Admit I conssidered it, long ago.

It's still not a lie.

He considered it long ago and then he did it. He doesn't want to try it again because he's already got some and/or they wouldn't fix his current situation. Literally truthful but appropriately misleading.

[LINK] Another "LessWrongers are crazy" article - this time on Slate
Xachariah · 11y · 210

I thought the article was quite good.

Yes, it pokes fun at lesswrong. That's to be expected. But it's well written and clearly conveys all the concepts in an easy-to-understand manner. The author understands lesswrong and our goals and ideas on a technical level, even if he doesn't agree with them. I was particularly impressed by how the author explained why TDT solves Newcomb's problem. I could give that explanation to my grandma and she'd understand it.

I don't generally believe that "any publicity is good publicity." However, this publicity is good publicity. Most people who read the article will forget it and only remember lesswrong as that kinda weird place that's really technical about decision stuff (which is frankly accurate). Those people who do want to learn more are exactly the people lesswrong wants to attract.

I'm not sure what people's expectations are for free publicity, but this is, IMO, the best-case scenario.

Posts

  • 47 · Exploiting the Typical Mind Fallacy for more accurate questioning? · 13y · 73
  • 136 · Punctuality - Arriving on Time and Math · 13y · 40
  • 9 · Harry Potter and the Methods of Rationality discussion thread, part 12 · 13y · 699