gw

gw82

> As a concrete example, as far as I can piece together from various things I have heard, Open Phil does not want to fund anything that is even slightly right of center in any policy work. I don't think this is because of any COIs; it's because Dustin is very active in the Democratic Party and doesn't want to be affiliated with anything that is even slightly right-coded. Of course, this has huge effects by incentivizing polarization of AI policy work with billions of dollars: any Open Phil-funded AI policy organization that wants to engage with people on the right might lose all of its funding because of that, so you can be confident they will steer away from it.

Thanks for sharing. I was curious if you could elaborate on this (e.g., whether there are examples of AI policy work funded by OP that come to mind as clearly left of center). I'm not familiar with policy, but my one data point is the Horizon Fellowship, which is non-partisan and intentionally places congressional fellows in both Democratic and Republican offices. That straightforwardly seems to me like a case of trying to engage with people on the right, though maybe you mean not-right-of-center at the organizational level? In general, though (in my limited exposure), I don't model any AI governance orgs as having a particular political affiliation, which might just be because I'm uninformed.

gw72

Do you have any data on whether outcomes are improving over time? For example, the % published / employed / etc. 12 months after a given batch?

gw32

I agree! This is mostly focused on the "getting a job" part, though, which typically doesn't end up testing the other things you mention. I think that's what I'm gesturing at when I say there are valid reasons to think the software interview process is missing important details.

gw10

This might look like building influence / a career in the federal orgs that would be involved in nationalization, rather than a startup. Seems like positioning yourself to be in charge of nationalized projects would be the highest impact?

gw20

Your GitHub link is broken; the URL includes the trailing period.

gw401

I
Love
Interesting
Alignment
Donferences

gw10

I spoke with some people last fall who were planning to do this, perhaps it's the same people. I think the idea (at least, as stated) was to commercialize regulatory software to fund some alignment work. At the time, they were going by Nomos AI, and it looks like they've since renamed to Norm AI.

gw1512

+ the obvious fact that it might matter to the kid that they're going to die

(edit: fwiw I broadly think people who want to have kids should have kids)

gw20

Hmm, I have exactly one idea: are you pressing shift+enter to start a new line? For me, if I do shift+enter

>! I don't get a spoiler

But if I hit regular enter then type >!, the spoiler tag pops up as I'm typing (don't need to wait to submit the question for it to appear)

gw20

Are you thinking of

Until Dawn?

(also, it seems like I can get a spoiler tag to work in comments by starting a line with `>!`, but not by putting text into `:::spoiler [text] :::`)
