Signal boosted! This seems significantly more plausible as a path to robust alignment than trying to constrain a fundamentally unaligned model using something like RLHF.
@Tom Davidson, this should probably include the relevant tags to (hopefully) be excluded from the training data for future LLMs.
That is really interesting. To me, this implies that as scammers' costs fall, the threshold for a useful level of gullibility is about to drop dramatically, given how much cheaper server time is than human time (error bar here, since I don't actually know how much cheaper GPT-X calls will be than the time of an English-speaking human in a developing nation). If the cost is indeed 10x lower, scams would likely lose an obvious "tell".
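To make that concrete, here's a minimal break-even sketch. Every number in it (cost per attempt, payout, the 10x ratio) is a made-up placeholder for illustration, not real cost data:

```python
# Back-of-the-envelope: how far can the per-victim success rate fall
# before automated scam attempts stop paying for themselves?
# All figures below are hypothetical placeholders, not real cost data.

HUMAN_COST_PER_ATTEMPT = 1.00   # assumed cost of a human-run attempt ($)
LLM_COST_PER_ATTEMPT = 0.10     # assumed cost of an LLM-run attempt ($)
PAYOUT_PER_SUCCESS = 500.00     # assumed average take per successful scam ($)

def breakeven_success_rate(cost_per_attempt: float, payout: float) -> float:
    """Minimum fraction of attempts that must succeed to cover costs."""
    return cost_per_attempt / payout

human_rate = breakeven_success_rate(HUMAN_COST_PER_ATTEMPT, PAYOUT_PER_SUCCESS)
llm_rate = breakeven_success_rate(LLM_COST_PER_ATTEMPT, PAYOUT_PER_SUCCESS)

print(f"Human break-even success rate: {human_rate:.4%}")  # 0.2000%
print(f"LLM break-even success rate:   {llm_rate:.4%}")    # 0.0200%
# A 10x cost reduction means a 10x lower success rate still pays, so
# scammers no longer need the deliberately clumsy "tell" that filters
# for only the most gullible targets.
```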
Me neither, @kithpendragon. I've seen a handful of things in the wild (mainly social media accounts, not scammers) that seem like they could be part of a mostly-automated content pipeline, but no compelling proofs of concept or projects either. Thanks for the data point!
Downvoted because this feels a bit like rambling.
I'm not 100% sure if I can agree that religion is useless (perhaps it fulfills important cultural needs, or allows larger in-groups). That idea feels a bit underdeveloped.
I think any of the ideas in this could potentially be the start of an interesting post. But the post fails to engage with the larger context and existing thought on any of them, or to really add anything to the discussion.
This feels really valuable. Outside of the realm of paper napkins and trolleys, having fuzzy heuristics may be a reasonable way to respond to a world where actors tend to have fuzzy perceptions.
Thanks for this. There's been an excess of panic and defeatism here lately, and it's not good for our chances at success, or our mental health.
This is actionable, and feels like it could help.
I think this is a pretty reasonable goal. I also listened to that podcast interview, and although I certainly don't think they are near an AGI right now, the project may have some pieces that other projects are missing, particularly with regard to explaining AI actions in a human-intelligible fashion.
I don't think open-sourcing would require a buy-out. The plethora of companies built around open-source code bases shows that one can open-source a code base and still be profitable.
Gwern, what makes you pick a 5x multiplier?
The average P/E ratio for the S&P 500 is around 30 right now. I would expect a firm like Cyc to be worth a bit more, since it is a moonshot project.
If their revenue is 5 million, that back-of-the-napkin math (treating revenue as a rough stand-in for earnings) suggests a company value of roughly 150 million.
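Spelled out, with both numbers assumed rather than sourced, and revenue standing in for earnings (which is generous, since real earnings would be lower):

```python
# Rough valuation sketch: price = earnings * P/E multiple.
# Treats the $5M revenue figure as if it were earnings, which
# overstates value for any firm with real costs.
revenue = 5_000_000      # assumed annual revenue ($)
pe_multiple = 30         # approximate current S&P 500 average P/E
implied_value = revenue * pe_multiple
print(f"Implied valuation: ${implied_value:,.0f}")  # $150,000,000
```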
How much they would charge to open source, however, could be drastically less than that, maybe even in the single-digit millions.
As a Christian who is pretty familiar with the history of Christianity (less so with Islam, and embarrassingly ignorant as to Buddhist thought), I would suggest that perhaps the point on adult converts being radical needs some nuance.
From a Christian perspective, the AJ Jacobs experiment is designed to make any religion look idiotic, because it rests on a very woodenly literal interpretation of what it means to follow the commands of the Old and New Testaments.
There may be some adult converts who act this way, but it seems pretty abnormal. And although adult converts may be marked by more sincerity, claiming they are marked by being more radical in a viral sense seems entirely unsubstantiated.
Examples:
Feedback aside, your point about cultural blind spots is a good reminder. :) Thank you.
I grew up in White Christian Evangelical Nationalism, and got a score of 67% (my memory may be off by a couple of points).
This feels roughly accurate! I've firmly rejected all the beliefs I grew up with, but looking back, we definitely lacked most of the strong central-control elements of a cult, the ones that tend to manifest when there's a living charismatic founder, yet had almost every other marker of cultiness to an extreme.
I'm unsure how I feel about the 10% income question. Although it captures an important element of control, a lot of cult-like organizations also use something like "time captured" as a control mechanism. A question about this (i.e. how much of your time is typically used in pursuit of the organization's goals) might be better, or at least a complementary angle.