
New music powers

-6 Elo 02 September 2016 07:39AM

Original post: http://bearlamp.com.au/new-music-powers/


I have written before about how I am pretty terrible at canvassing music in my head.  This gives me the ability (appalling to musically oriented people) to do things like listen to the same song on repeat 500 or more times in a row without being bothered by it either way.  I never cared about the idea beyond a sense of "this is interesting but irrelevant".

Being indifferent to music has left me completely useless at holding a musical preference, at exploring the value of music by going to music events, or at participating in musical experiences.


This week something changed!  Or, more accurately, last week.  Last week I was listening to a piece for the n'th time while quite badly sleep deprived.  As I listened, the music started falling apart.  Different parts of the music changed volume, so that I could isolate different instruments and follow different features of the music.  At the time, being a bit sleep deprived, I took it as a warning that maybe it was time to go to bed.  Hint hint: you're going a little nuts.

Today I noticed I can still do it.  Now that I am no longer sleep deprived, I can pay attention to music in a different way than I used to be able to.  I can single out the drums and "listen" to only that part, or the guitar, or the vocals.  (It's pop music on the radio.)

Of course, the reason I bothered to write about it, and the reason it's interesting, is, as half the readers can probably imagine: I told a musical friend of mine that I had developed new powers, and he said,

Wait, people can't normally do that?

So I get to add this to the pile of typical-mind, sensory-perception assumptions that we make when we interpret our own individual world through our own senses.  What if yours worked a bit differently?  How much would that fundamentally change how you operate as a human?  How much you assume about the world around you and how it works?  And how everyone else works?


Question:  What are your natural assumptions about how your senses work?  Have you ever noticed anyone else acting on different basic natural assumptions?


Meta: this took 45mins to write.

How It Feels to Improve My Rationality

5 SquirrelInHell 18 March 2016 09:59AM

Note: this started as a comment reply, but I thought it got interesting (and long) enough to deserve its own post.

Important note: this post is likely to spark some extreme reactions, because of how human brains are built. I'm including warnings, so please read this post carefully and in the order written, or don't read it at all.

I'm going to attempt to describe my subjective experience of progress in rationality.

Important edit: I learned from the responses to this post that there's a group of people with whom this resonates pretty well, and there's also a substantial group with whom it does not resonate at all, to the degree that they don't know if what I'm saying even makes sense or is correlated to rationality in any meaningful way. If you find yourself in the second group, please notice that trying to verify whether I'm doing "real rationality" or not is not a way to resolve your doubts. There is no reason why you would need to feel the same. It's OK to have different experiences. How you experience things is not a test of your rationality. It's also not a test of my rationality. All in all, because of publishing this and reading the comments, I've found out some interesting stuff about how some clusters of people tend to think about this :)

Also, I need to mention that I am not an advanced rationalist, and my rationality background is mostly reading Eliezer's sequences and self-experimentation.

I'm still going to give this a shot, because I think it's going to be a useful reference for a certain level in rationality progress.

I even expect myself to find all that I write here silly and stupid some time later.

But that's the whole point, isn't it?

What I can say about how rationality feels to me now is going to be pretty irrelevant pretty soon.

I also expect a significant part of readers to be outraged by it, one way or the other.

If you think this has no value, maybe try to imagine a rationality-beginner version of you that would find a description such as this useful. If only as a reference that says: yes, there is a difference. No, rationality does not feel like a lot of abstract knowledge that you remember from a book. Yes, it does change you deeply, probably more deeply than you suspect.

In case you want to downvote this, please do me a favour and write a private message to me, suggesting how I could change this so that it stops offending you.

Please stop any feeling of wanting to compare yourself to me or anyone else, or to prove anyone's superiority or inferiority.

If you can't do this please bookmark this post and return to it some other time.

...

...

Ready?

So, here we go. If you are free from againstness and competitiveness, please be welcome to read on, and feel free to tell me how this resonates, and how different it feels inside your own head and on your own level.


Part 1. Pastures and fences

Let's imagine a vast landscape, full of vibrant greenery of various sorts.

Now, my visualization of object-level rationality is staking out territories, like small parcels of a pasture surrounded by fences.

Inside the fences, I tend to have more neat grass than anything else. It's never perfect, but when I keep working on an area, it slowly improves. If neglected, weeds will start growing back sooner or later.

Let's also imagine that the ideas and concepts I generalize as I go about my work become seeds of grass, carried by the wind.

What the work feels like is that I'm running back and forth between the object level (my pastures) and the meta level (scattering seeds).

As a result of this running back and forth, I'm able to stake out new territories, or improve previous ones, to have better coverage and fewer weeds.

The progress I make in my pastures feeds back into interesting meta-level insights (more seeds carried by the wind), which in turn tend to spread to new areas even when I'm not helping with this process on purpose.

My pastures tend to concentrate in clusters, in areas that I have worked on the most.

When I have lots of action in one area, the large amounts of seeds generated (meta techniques) are more often carried to other places, and at those times I experience the most change happening in other, especially new and unexplored, areas.

However, even when I can reuse some of my meta-ideas (seeds), to have a nice and clear territory I still need to go over there and put in the manual work of clearing it up.

As I'm getting better and more efficient at this, it becomes less work to gain new territories and improve old ones.

But there's always some amount of manual labor involved.


Part 2. Tells of epistemic high ground

Disclaimer: not using this for the Dark Side requires a considerable amount of self-honesty. I'm only posting this because I believe most of you folks reading this are advanced enough not to shoot yourselves in the foot by, e.g., using this in arguments.

Note: If you feel the slightest urge to flaunt your rationality level, pause and catch it. (You are welcome.) Please do not start any discussion motivated by this.

So, what clues do I tend to notice when my rationality level is going up, relative to other people?

Important note: This is not the same as "how do I notice if I'm mistaken" or "how do I know if I'm on the right path". These are things I notice after the fact and judge to be correlates, but they are not to be used to choose a direction in learning or in sorting out beliefs. I wrote the list below exactly because it is the less-talked-about part, and it's fun to notice things. Somehow everyone seems to have thought this is more than I meant it to be.

Edit: check Viliam's comment for some concrete examples that make this list better.

In a particular field:

  • My language becomes more precise. Where others use one word, I now use two, or six.
  • I see more confusion all around.
  • Polarization in my evaluations increases. E.g. two sensible sounding ideas become one great idea and one stupid idea.
  • I start getting strong impulses that tell me to educate people who I now see are clearly confused, and could be saved from their mistake in one minute if I could tell them what I know... (spoiler alert, this doesn't work).

Rationality level in general:

  • I stop having problems in my life that seem to be common all around, and that I used to have in the past.
  • I forget how it is to have certain problems, and I need to remind myself constantly that what seems easy to me is not easy for everyone.
  • Writings of other people move forward on the path from intimidating to insightful to sensible to confused to pitiful.
  • I start to intuitively discriminate between rationality levels of more people above me.
  • Intuitively judging someone's level requires less and less data, from reading a book to reading ten articles to reading one article.

Important note: although I am aware that my mind automatically estimates rationality levels of various people, I very strongly discourage anyone (including myself) from ever publishing such scores/lists/rankings. If you ever have an urge to do this, especially in public, think twice, and then think again, and then shut up. The same applies to ever telling your estimates to the people in question.

Note: Growth mindset!


Now let's briefly return to the post I started out replying to. Gram_Stone suggested that:

You might say that one possible statement of the problem of human rationality is obtaining a complete understanding of the algorithm implicit in the physical structure of our brains that allows us to generate such new and improved rules.

Now, after everything I've seen so far, my intuition suggests Gram_Stone's idealized method wouldn't work from inside a human brain.

A generalized meta-technique could become one of the many seeds that help me in my work, or even a very important one that would spread very widely, but it still wouldn't magically turn raw territory into perfect grassland.


Part 3. OK or Cancel?

The closest I've come to Gram_Stone's ideal is when I witnessed a whole cycle of improving in a certain area being executed subconsciously.

It was only brought to my full attention when an already polished solution in verbal form popped into my head when I was taking a shower.

It felt like a popup on a computer screen that had "Cancel" and "OK" buttons, and after I chose OK the rest continued automatically.

After this single short moment, I found that a subconscious habit was already in place that ensured my previous thought patterns changed, and it proved to work reliably long after.


That's it! I hope I've left you better off having read this than not.

Meta-note about my writing agenda: I've developed a few useful (I hope) and unique techniques and ideas for applied rationality, which I don't (yet) know how to share with the community. To get that chunk of data birthed out of me, I need some continued engagement from readers who would give me feedback and generally show interest (this needs to be done slowly and in the right order, so I would have trouble persisting otherwise). So for now I'm writing separate posts noncommittally, to test reactions and (hopefully) gather some folks that could support me in the process of communicating my more developed ideas.

Confession Thread: Mistakes as an aspiring rationalist

18 diegocaleiro 02 June 2015 06:10PM

We looked at the cloudy night sky and thought it would be interesting to share the ways in which, in the past, we made mistakes we would have been able to overcome, if only we had been stronger as rationalists. The experience felt valuable and humbling. So why not do some more of it on Lesswrong?

An antithesis to the Bragging Thread, this is a thread to share where we made mistakes. Where we knew we could, but didn't. Where we felt we were wrong, but carried on anyway.

As with the recent group bragging thread, anything you've done wrong since the comet killed the dinosaurs is fair game, and if it happens to be a systematic mistake that curtailed your potential over long periods of time, one that others can learn to avoid, all the better.

This thread is an attempt to see if there are exceptions to the cached thought that life experience cannot be learned but has to be lived. Let's test this belief together!

Experience of typical mind fallacy.

2 Elo 27 April 2015 06:39PM

following on from:

http://lesswrong.com/lw/dr/generalizing_from_one_example/

I am quite sure, in my experience, that at some point between the ages of 10 and 15 I concluded: "no, the rest of the world does not think like me; I think in an unusual way".

This idea disagrees with the typical mind fallacy (where people outwardly generalise to think everyone else has similar minds to their own).

I suspect I started with a typical-mind model of the world, but at some point it broke badly enough that I re-modelled it as "I just think differently from most others".

I wanted to start a new discussion, rather than continuing one from 2009:

Where do your experiences lie in relation to typical minds?

[Link] Quantity Always Trumps Quality

11 [deleted] 31 August 2012 05:15PM

http://www.codinghorror.com/blog/2008/08/quantity-always-trumps-quality.html

The ceramics teacher announced on opening day that he was dividing the class into two groups. All those on the left side of the studio, he said, would be graded solely on the quantity of work they produced, all those on the right solely on its quality. His procedure was simple: on the final day of class he would bring in his bathroom scales and weigh the work of the "quantity" group: fifty pounds of pots rated an "A", forty pounds a "B", and so on. Those being graded on "quality", however, needed to produce only one pot - albeit a perfect one - to get an "A".

Well, came grading time and a curious fact emerged: the works of highest quality were all produced by the group being graded for quantity. It seems that while the "quantity" group was busily churning out piles of work - and learning from their mistakes - the "quality" group had sat theorizing about perfection, and in the end had little more to show for their efforts than grandiose theories and a pile of dead clay.

For some reason it just seems we in particular could learn something from this anecdote.

Iterate more. The practice effect is your friend, as is mining out positive outliers in really huge sets. I also wanted to mention something about using going meta as a way to procrastinate, but I feared I would summon a Newsome.


Edit: This has been mentioned before. I think it is good to remind people of it. Desrtopa writes:

Not only has it been mentioned before, last time it came up I searched and failed to find corroboration of the claim that it actually happened. Since applying a deliberately inconsistent grading rubric is not something professors are normally allowed to do, I strongly suspect that the anecdote is fictional.

It is therefore best to assume this is a parable.