Ruby

LessWrong Team

 

I have signed no contracts or agreements whose existence I cannot mention.

Sequences

LW Team Updates & Announcements
Novum Organum

Comments

Ruby · 20

Oh, very reasonable. I'll have a think about how to solve that. So I can understand what you're trying to do: why is it you want to refresh the page?

Ruby · 20

Oh, that's the audio player widget. Seems it is broken here! Thank you for the report.

Ruby · 40

Cheers for the feedback; I apologize for the confusion and annoyance.

What do you mean by "makes the URL bar useless"? What's the use you're hoping would still be there? (Typing in a different address should still work.)

The point of the modals is that they don't lose your place in the feed, which is technically hard to do with proper navigation, though it's possible we should just figure out how to do that.
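
For concreteness, here's a rough sketch of one standard way proper navigation could keep your place, using the browser History API to stash the feed's scroll position. This is purely illustrative, not the actual feed code; the element id and state shape are made up.

```typescript
// Illustrative sketch only (not LW's implementation): stash the feed's scroll
// offset in history.state before leaving, and restore it when the user goes back.

function openPostWithHistory(postUrl: string, feed: HTMLElement): void {
  // Remember where the reader was before leaving the feed.
  history.replaceState({ feedScroll: feed.scrollTop }, "", location.href);
  // Navigate to the post as a real URL change, so copy-link/right-click work.
  history.pushState({}, "", postUrl);
  // ...render the post view here...
}

window.addEventListener("popstate", (event) => {
  const state = event.state as { feedScroll?: number } | null;
  if (state?.feedScroll !== undefined) {
    // ...re-render the feed here, then restore the saved position...
    const feed = document.getElementById("feed"); // hypothetical element id
    feed?.scrollTo({ top: state.feedScroll });
  }
});
```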

And ah yeah, the "view all comments" isn't a link you can right-click, but I can make it one (the titles already are). That's a good idea.

All comment threads are what I call a "linear slice" (parent-child-child-child) with no branching. Conveying this relationship while breaking with the convention of the rest of the site (nesting) has proven tricky, but I'm reluctant to give up the horizontal space, and it looks cleaner. Two comments next to each other are just parent/child, and if there are omitted comments between them, there's a bar saying "+N" that, when clicked, will display them.
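
To make the "linear slice" idea concrete, here's a rough sketch of what building one could look like. The types and names are made up for illustration and are not the feed's actual implementation.

```typescript
// Illustrative sketch of a "linear slice": a single parent→child→child chain
// through a thread, with "+N" markers wherever intermediate comments are omitted.

interface Comment {
  id: string;
  parentId: string | null; // null for a top-level comment
  body: string;
}

type SliceItem =
  | { kind: "comment"; comment: Comment }
  | { kind: "omitted"; count: number }; // rendered as the clickable "+N" bar

// Walk up from a focused comment to the thread root, keep only the comments
// we want to show, and collapse everything in between into "+N" markers.
function buildLinearSlice(
  byId: Map<string, Comment>,
  focusId: string,
  keepIds: Set<string>
): SliceItem[] {
  // Collect the parent chain, root first. No branching: each comment has at
  // most one parent, so this is a straight line through the thread.
  const chain: Comment[] = [];
  let cur = byId.get(focusId);
  while (cur) {
    chain.unshift(cur);
    cur = cur.parentId ? byId.get(cur.parentId) : undefined;
  }

  const slice: SliceItem[] = [];
  let omitted = 0;
  for (const c of chain) {
    if (c.id === focusId || keepIds.has(c.id)) {
      if (omitted > 0) {
        slice.push({ kind: "omitted", count: omitted });
        omitted = 0;
      }
      slice.push({ kind: "comment", comment: c });
    } else {
      omitted += 1; // hidden until the "+N" bar is clicked
    }
  }
  return slice;
}
```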

Something I will do is merge the post-modal and comments-modal into one, so that when you click to view a particular comment you're shown it, but the rest of the thread is also there, which should hopefully help with orienting.

Thanks again for writing up those thoughts!

Ruby · 30

I'm curious to see examples; feel free to DM me if you don't want to draw further attention to them.

Ruby · 60

Thread for feedback on the New Feed

Questions, complaints, confusions, bug reports, feature requests, and long philosophical screeds – here is the place!

Ruby · 30

I think that intellectual intimacy should include having similar mental capacities.

Seems right, for both reasons of understanding and trust.

A part of me wants to argue that these are intertwined

I think the default is that they're intertwined, but the interesting thing is they can come apart: for example, you develop feelings of connection and intimacy through shared experience and falsely assume you can trust the person (or that you share values, or whatever), but then it turns out the shared experiences never actually filtered for that.

Ruby · 141

This matches with the dual: mania. All plans, even terrible ones, seem like they'll succeed, and this has flow-through effects: elevated mood, hyperactivity, etc.

Whether or not this happens in all minds, the fact that people can alternate fairly rapidly between depression and mania with minimal triggers suggests there can be some kind of fragile "chemical balance" or something that's easily upset. It's possible that's specific to mood disorders, and that more stable minds are merely vulnerable to the "too many negative updates at once" thing without the greater instability.

Ruby · 20

To clarify here, I think what Habryka says about LW generally promoting lots of content being normal is overwhelmingly true (e.g. spotlights and curation), and this book is completely typical of what we'd promote to attention, i.e. high-quality writing and reasoning. I might say promotion is equivalent to an upvote, not to an agree-vote.

I still think there are details in the promotion here that make inferring LW agreement and endorsement reasonable:

  1. lack of disclaimers around disagreement (absence is evidence), together with a good prior that the LW team agrees a lot with the Eliezer/Nate view on AI risk
  2. promoting during pre-order (which I do find surprising)
  3. that we promoted this in a new way (I don't think this is as strong evidence as it might seem; mostly it's that we've only recently started doing this for events and this is the first book to come along, and we might have done it, and will do it, for others). But maybe we wouldn't have done it, or not as high-effort, absent agreement.

But responding to the OP: rather than the motivation coming from narrow endorsement of the thesis, I think a bunch of the motivation flows more from a willingness/desire to promote Eliezer[1] content, as (i) such content is reliably very good, and (ii) Eliezer founded LW and his writings make up the core writings that define so much of site culture and norms. We'd likely do the same for another major contributor, e.g. Scott Alexander.

I updated from when I first commented by thinking about what we'd do if Eliezer wrote something we felt less agreement with, and I think we'd do much the same. My current assessment is that the book placement is something like ~80-95% neutral promotion of high-quality content the way we generally do it, not because of endorsement; but maybe there's a 5-20% chance it got extra effort/prioritization because we in fact endorse the message. Hard to say for sure.

 

  1. and Nate

Ruby · 30

LW2 had to narrow down in scope under the pressure of ever-shorter AI timelines

I wouldn't say the scope was narrowed; in fact, the admin team took a lot of actions to preserve the scope. But a lot of people have shown up for AI or are now heavily interested in AI, simply making that the dominant topic. Still, I like to think that people don't think of LW as merely an "AI website".

Ruby · 22

It really does look dope
