Austin Chen

Hey there~ I'm Austin, currently building https://manifund.org. Always happy to meet LessWrong people; reach out at akrolsmir@gmail.com!


I agree with the paper that paying here probably has minimal effects on devs; but even if it does have an effect, it doesn't seem likely to change the results, unless somehow the AI group was more incentivized to be slow than the non-AI group.

Minor point of clarity: I briefly attended a talk/debate where Nate Soares and Scott Aaronson (not Sumner) were discussing these topics. Are we thinking of the same event, or was there a separate conversation with Nate Soares and Scott Sumner?

If you're looking to do an event in San Francisco, lmk, we'd love to host one at Mox! 

Thanks Ozzie - we didn't invest that much effort into badges this year, but I totally agree there's an opportunity to do something better. Organizer-wise, it can be hard to line up all the required info before printing, but having a few sections where people can sharpie things in or pick stickers seems like low-hanging fruit.

This could also extend beyond badges - for example, one could pick different-colored swag t-shirts to signal affiliation (eg academia vs lab vs funder) at a conference.

I'll also send this to Rachel for the Curve, who I expect might enjoy this as a visual and event design challenge.

Huh, seems pretty cool and big-if-true. Is there a specific reason you're posting this now? Eg asking people for feedback on the plan? Or seeking additional funders for your $25m Series A?

My guess btw is that some donors like Michael have money parked in a DAF, and thus require a c3 sponsor like Manifund to facilitate that donation - until your own c3 status arrives, ofc. 

(If that continues to get held up but you receive an important c3 donation commitment in the meantime, let us know and we might be able to help - I think it's possible to recharacterize same-year donations after c3 status arrives, which could unblock the c4 donation cap?)

From the Manifund side: we hadn't spoken with CAIP previously but we're generally happy to facilitate grants to them, either for their specific project or as general support. 

A complicating factor is that, like many 501c3s, we have a limited budget we're able to send towards c4s - eg I'm not sure we could support their maximum ask of $400k on Manifund. I do feel happy to commit at least $50k of our "c4 budget" (which is their min ask) if they raise that much through Manifund; beyond that, we should chat!

Thanks to Elizabeth for hosting me! I really enjoyed this conversation; "winning" is a concept that seems important and undervalued among rationalists, and I'm glad to have had the time to throw ideas around here.

I do feel like this podcast focused a bit more on some of the weirder or more controversial choices I made, which is totally fine; but if I were properly stating the case for "what is important about winning" from scratch, I'd instead pull examples like how YCombinator won, or how EA has been winning relative to rationality in recruiting smart young folks. AppliedDivinityStudies's "where are all the successful rationalists" is also great on this.

Very happy to answer questions ofc!

Thanks for the feedback! I think the nature of a hackathon is that everyone is trying to get something that works at all, and "works well" is just a pipe dream haha. IIRC, there was some interest in incorporating this feature directly into Elicit, which would be pretty exciting.

Anyways, I'll try to pass your feedback along to Panda and Charlie, but you might also enjoy checking out their source code and submitting a GitHub issue or pull request: https://github.com/CG80499/paper-retraction-detection
