I have spent weeks where pretty much all I did was:
-- have sex with my partner, hours per day
-- watch anime with my partner
-- eat food and ambiently hang with my partner
No work. Not much seeing other people. Of course, given the amount of sex, mundane situations were quite sexually charged. I'm not actually sure if it gets old on any human timeline. You also improve at having fun together. However, this was not very good for our practical goals. But post-singularity I probably won't need to worry about practical goals.
In general I think you underestimate the s...
Lots of people already form romantic and sexual attachments to AI, despite the fact that most large models try to limit this behavior. The technology is already pretty good. Nevermind if your AI GF/BF could have a body and actually fuck you. I already "enjoy" the current tech.
I will say I was literally going to post "Why would I play status games when I can fuck my AI GF" before I read the content of the post, as opposed to just the title. I think this is what most people want to do. Not that this is going to sound better than "status games" to a lot of rationalists.
I have done a lot of thinking. At this point timelines are so short I would recommend:
Individual with no special connections:
-- Avoid tying up capital in illiquid plans. One exception is housing since 'land on Holy Terra' still seems quite valuable in many scenarios.
-- Make whatever spiritual preparations you can, whatever spirituality means to you. If you are inclined to Buddhism meditate. Practice loving kindness. Go to church if you are Christian. Talk to your loved ones. Even if you are atheist you need to prepare your metaphorical spirit for what may ...
Thoroughly agree except for what to do with money. I expect that throwing money at orgs that are trying to slow down AI progress (eg PauseAI, or better if someone makes something better) gets you more utility per dollar than nvidia (and also it's more ethical).
Edit: to be clear, I mean actual utility in your utility function. Even if you're fully self-interested and not altruistic at all, I still think your interests are better served by donating to PauseAI-type orgs than investing in nvidia.
Excellent comment, spells out a lot of thoughts I'd been dancing around for a while better than I had.
-- Avoid tying up capital in illiquid plans. One exception is housing since 'land on Holy Terra' still seems quite valuable in many scenarios.
This is the step I'm on. Just bought land after saving up for several years while being nomadic, planning on building a small house soon in such a way that I can quickly make myself minimally dependent on outside resources if I need to. In any AI scenario that respects property rights, this seems valuable to me.
...-- Ma
I recovered from surgery alone.
I had extensive facial feminization surgery. My jaw was drilled down. Same with my brow ridge. Nose broken, reshaped, packed. No solid food for months.
Recovery was challenging alone, but I was certain I could manage it myself. I spared myself begging for help, and the horror of noticing I was pissing off my friend by needing help.
No regrets. I'm quite recovered now. That was a very interesting month alone.
The truth should be rewarded. Even if it's obvious. Every day this post is more blatantly correct.
The local Vassarite has directly stated "i purposefully induce mania in people, as taught by Michael Vassar". Seems like the connection to Michael Vassar is not very tenuous. At least that is my judgement. Others can disagree. Vassar does not have to personally administer the method or be currently supportive of his former student.
I honestly have no idea what you mean. I am not even sure why "(self) statements you hear while on psychedelics are just like normal statements" would be a counterpoint to someone being in a very credulous state. Normal statements can also be accepted credulously.
Perhaps you are right, but the sense of self required is rare. Practically, most people are empirically credulous on psychedelics.
When you take psychedelics you are in an extremely vulnerable and credulous position. It is absolutely unsafe to take psychedelics in the presence of anyone who is going to confidently expound on the nature of truth and society. Michael Vassar, Jessica Taylor and others are extremely confident and aggressive about asserting their point of view. It is debatable how OK that is under normal circumstances. It is absolutely dangerous if someone is on psychedelics.
Even a single trip can be quite damaging.
I consulted multiple people to make sure my impression was accurate. Every person except you agrees you are much more schizophrenic than before the events. My personal opinion is that you currently fit the diagnostic criteria. I do not accept that people are the unique authority on whether they have developed schizophrenia.
Events are recent and to some extent ongoing. Though the 'now they are literally schizophrenic' event occurred some months ago. Pacific Northwest. This incident has not been written up in public afaik.
A second person has now had a schizophrenic episode. This occurred a few days ago. Though I do not think the second person will end up persistently schizophrenic.
I am not talking about any of the more well known cases.
The idea that people would do these things in the 'rationalist' community is truly horrifying to me. I am a believer in doing somewhat innovative or risky things. But you are supposed to do them somewhat safely.
Don't induce psychosis intentionally. Don't take psychedelics while someone probes your beliefs. Don't let anyone associated with Michael Vassar anywhere near you during an altered state.
Edit: here is a different report from three years ago with the same person administering the methods:
Mike Vassar's followers practice intentionally inducing psychosis via psychedelic drugs. "Inducing psychosis" is a verbatim self-report of what they are doing. I would say they practice drug-induced brainwashing. TBC they would dispute the term brainwashing and probably...
As one of what I believe to have been the targets/victims of “the local Vassarite” (though multiple people reviewing my initial draft have asked me to mention that Michael Vassar and this person are not actually on good terms), it seems reasonable for me to be the one to reveal the name and give concrete details, so that no one is harmed in the future the way I was nearly harmed. The person being referenced is Olivia Schaefer (known usernames: 4confusedemoji, liv.bsky.social, Taygetea), and this is a brief, roughly chronological account of some concer...
Related, here is something Yudkowsky wrote three years ago:
...I'm about ready to propose a group norm against having any subgroups or leaders who tell other people they should take psychedelics. Maybe they have individually motivated uses - though I get the impression that this is, at best, a high-variance bet with significantly negative expectation. But the track record of "rationalist-adjacent" subgroups that push the practice internally and would-be leaders who suggest to other people that they do them seems just way too bad.
I'm also about read
I think I know (80% confidence) the identity of this "local Vassarite" you are referring to, and I think I should reveal it, but, y'know, Unilateralist's Curse, so if anyone gives me a good enough reason not to reveal this person's name, I won't. Otherwise, I probably will, because right now I think people really should be warned about them.
The market isn't efficient. Which isn't to say it is easy to beat. Your friend's strategies don't sound promising. It also seems strange to me that he is obsessed with crypto and thinks it will do well but isn't a crypto investor. That sounds pretty inconsistent with his beliefs.
It's worth remembering that many versions of 'the market is efficient' are almost or totally unfalsifiable.
I was a miserable child. When I was nine years old I remember watching one and thinking "I have almost a decade left to serve. This is a long sentence for an adult and I'm just a kid. But at least I will get out one day".
I was eventually set free. But until my freedom came all I could really do was bide my time and try to cope with the torture. And I most certainly consider it torture in retrospect. I was physically assaulted by my dad and I was horribly, horribly sleep deprived. But I managed to keep some of my sanity and pick up some MTG cards I later sold at a large profit. It could have been a lot worse for future me.
The 'Food' and the 'Drug' parts behave very differently. By default food products are allowed. There may be purity requirements or restaurant regulations but you don't need to run studies or get approvals to serve an edible product or a new combination. By default drugs are banned.
I think the FDA is not zealous enough about heavy metals and other contaminants. But the FDA does a decent job of regulating food. The 'drug' side, however, is a nightmare. The two situations are de facto handled in very, very different ways. So it's not obvious why an argument would cover both of them.
You can make people/entities actually equal. You can also remove the need for the weaker entity to get the stronger entity's permission. Either go more egalitarian or less authoritarian, or both. It's worth noting that if you don't want to be authoritarian, it's important to blind yourself to information about the weaker party. The best way to not be overbearing is to not know what behavior they are getting up to. This is why children's privacy is so important. It's much easier to never know than to resist your urge to meddle.
I have previously bet large sums on elections. I'm not currently placing any bets on who will win the election. Seems too unclear to me (note I had a huge bet on Biden in 2020, which seemed clear then). However there are TONS of mispricings on Polymarket and other sites. Something like 'Biden will withdraw or lose the nomination' at 23% is a good example.
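As a rough sketch of what "mispricing" means here in expected-value terms (the 40% figure below is a hypothetical number for illustration, not a claim about the actual odds):

```python
# Hypothetical expected-value check for a binary prediction-market contract.
# Assumed numbers only: the market sells "YES" at $0.23 per share and you
# believe the true probability of the event is 0.40 (illustrative, not the
# author's estimate).

market_price = 0.23  # cost of one YES share; pays $1.00 if the event happens
true_prob = 0.40     # your own probability estimate (assumption)

# Expected profit per $1 share, ignoring fees and the time value of money.
ev_per_share = true_prob * 1.00 - market_price
print(f"EV per YES share: ${ev_per_share:.2f}")               # $0.17

# Expected return on the capital tied up in the bet.
print(f"Expected return: {ev_per_share / market_price:.0%}")  # ~74%
```

If your estimate is anywhere near 40%, buying YES at 23 cents is positive expected value; the bet is only a mistake if the market's 23% is actually closer to the truth than your own number.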
A serious effective altruism movement would clean house. Everyone who pushed the 'work with AI capabilities companies' line should retire or be forced to retire. There is no need to blame anyone for mistakes; the decision makers had reasons. But they chose wrong and should not continue to be leaders.
Do you think that whenever anyone makes a decision that ends up being bad ex-post they should be forced to retire?
Doesn't this strongly disincentivize making positive EV bets which are likely to fail?
Edit: I interpreted this comment as a generic claim about how the EA community should relate to things which went poorly ex-post, I now think this comment was intended to be less generic.
Lumina is incredibly cheap right now. I pre-ordered for 250 USD. Even genuinely quite poor people I know don't find the price off-putting (poor in the sense of absolutely poor for the country they live in). I have never met a single person who decided not to try Lumina because the price was high. If they pass it's always because they think it's risky.
Left-wing point of view:
It's a wealth transfer to younger people. I'm fully aware that middle-aged people have college debt.
Conversely, wealth inequality by age is fairly extreme.
So I remain extremely in favor of student debt cancellation. Note I have never taken on any student debt, so I obviously don't have any.
I'm a real fan of insane ideas. I literally do acid every Monday. But I gotta say, among crazy ideas 'be way, way more honest' is well-trodden ground and the skulls are numerous. It just really rarely goes well. I'm a pretty honest guy and am attracted to the cluster. But if you start doing this you are definitely trying something in a cluster of ideas that usually works terribly.
If anything I have to constantly tell myself to be less explicit and 'deeply honest'. It just doesn't work well for most people.
Might be an uncharitable read of what's being recommended here. In particular, it might be worth revisiting the section that details what Deep Honesty is not. There's a large contingent of folks online who self-describe as 'borderline autistic', and one of their hallmark characteristics is blunt honesty, specifically the sort that's associated with an inability to pick up on ordinary social cues. My friend group is disproportionately comprised of this sort of person. So I've had a lot of opportunity to observe a few things about how honesty works.
Speaking ...
This sounds like a case of the Rule of Equal and Opposite Advice: https://slatestarcodex.com/2014/03/24/should-you-reverse-any-advice-you-hear/ I'm sure for some people more honesty would be harmful, but it does sound like the caveats here make it clear when not to use it. I agree more with the questions Tsvi raises in the other thread than with "this is awful advice". I can imagine that you are a person for whom more honesty is bad, although if you followed the caveats above it would imo be quite rare to do it wrong. I think the authors do a good job of outlining many cases where it goes wrong.
John Carmack is a famously honest man. To illustrate this, I'll give you two stories. When Carmack was a kid, he desperately wanted the Macs in his school's computer lab. So he and a buddy tried to steal some. They got caught because Carmack's friend was too fat to get through the window. Carmack went to juvie. When the counselor asked him whether he would do it again if he knew he wouldn't get caught, Carmack answered yes to this counterfactual.
Later, when working as a young developer, Carmack and his fellow employees would take the company workstations home to code gam...
Did the students really want to learn?
A few times I de facto taught a course on 'calculus with proofs' to a few students who wanted to learn from someone who seemed smart and motivated. I didn't get any money and neither did they. We met twice a week. I would give some lectures and we would discuss problems for a few hours. There was homework. We all took it very seriously. It was clearly not a small amount of work, but I frankly found it invigorating. Normal classes were usually not invigorating.
I will say I found tutoring much more invigorating...
Related question - can you link me a summary of why anti-aircraft weapons are good? I feel like it should be kinda hard to hit an aircraft with a missile or whatever. Aircraft are moving really fast and are not the biggest target. How much faster are missiles? The jet is already moving at high speed but the missile has to accelerate from zero. Aircraft seem pretty vulnerable to lasers, but are those kinds of defenses actually deployed at our current tech level?
Strong upvoted. I learned a lot. Seriously interested in what you think is relatively safe and not extremely expensive or difficult to acquire. Some candidates I thought of, though I'm not exactly well informed:
-- Grass fed beef
-- oysters/mussels
-- some whole grains? which?
-- fruit
-- vegetables you somehow know aren't contaminated by pesticides?
I really need some guidance here.
I prefer to keep plans private but I'm making big progress on meditation and mental re-wiring. Am working on a way to publicly demonstrate. Public plans just stress me out. I recently set two pretty ambitious goals. I figured I could use psychedelics to turbo-charge progress. The meditation one is coming along FAST.
The other goal is honestly blocked a bit on being super out of shape. Multiple rounds of covid really destroyed my cardio and energy levels. Need to rebuild those before a big push on goal 2.
I'm with several other commenters. People know what unconditional love is. Many people have it for their family members, most commonly for their children but often for others. They want that. Sadly this sort of love is rare beyond family.
I felt some amount of unconditional love towards my dad. He was really not a great parent to me. He hit me for fun, was ashamed of me, etc. But we did have some good times. When he was dying of cancer I was still a good son. I was quite supportive. Not out of duty; I just didn't want him to suffer any more than needed. I felt gen...
Does anyone have a high-quality analysis of how effective machines are for strength training and building muscle? Not free weights, specifically machines. I'm not the pickiest about how one operationalizes 'work'. I'm more interested in the quality of the analysis. But some questions:
-- Do users get hurt frequently? Are the injuries chronic? (This is the most important question)
-- Do people who use them consistently gain muscle?
-- Can you gain a 'lot' of muscle and strength like you can with free weights? Or do people cap out quickly if they are fit?
-- Does strength from...
I'm not particularly against pivotal acts. It seems plausible to me that someone will take one. It would not exactly shock me if Sam Altman himself planned to take one to prevent dangerous AGI. He is intelligent and therefore isn't going to openly talk about considering them. But I don't have any serious objection to them being taken if people are reasonable about it.
A rather large fraction of the total words in this document are dedicated to safety warnings. I do not see how it's possible to deny I am quite focused on some sense of safety. I focused on the safety issues I think are genuinely the most pressing (addiction risks, trauma). I genuinely do not think that drug purity issues are the main risk of taking this advice. Certainly not for ketamine sourced in San Francisco. The service I linked in SF also sends samples to a lab for quite thorough testing, and you get results in about four weeks. You should believe I genuinely disagree with you on what the risks are for the substances mentioned.
I think modafinil is great for a lot of people. But I made the choice to only write up the very best (in terms of expected outcomes) stuff. Given that many substances have risks or legal issues, it was much simpler for me to just not mention a lot of stuff. I do not intend any implicit claim that other things aren't useful. But I didn't make a list of 'stuff I've investigated and found less good on average' vs 'stuff I have not investigated'.
Thanks for sharing that moda is working that well for you.
The Circle is by far my favorite solstice song. I listen to it all the time. It is definitely the most 'sapph values' part of the performance. I find it beautiful. A ray of pure divine grace. I will say I'm not exactly bad at being rational in concrete ways. I'm fairly successful economically, socially and romantically despite having an abusive childhood. I attribute my success to being methodical and rational about things, at least some of the time. But there is something about rat values that differs from mine. So it's very good for my feeling of inclusion that the Circle has survived the culls.
Nothing is obviously wrong with it. I'm not sure what probability to assign it. It's sort of "out of sample". But it seems very plausible to me that we are in a simulation. It is really hard to justify probabilities or even settle on them. But when I look inside myself, the numbers that come to mind are 25-30%.
This is also obvious but Quantum Wave Function Collapse SURE DOES look like this universe is only being simulated at a certain fidelity.