social system designer http://aboutmako.makopool.com
Do people ever talk about dragons and dinosaurs in the same contexts? If so, you're creating ambiguities. If not (and I'm having difficulty thinking of any such contexts), then it's not going to create many ambiguities, so it's harder to object.
I think I've been calling it "salvaging". To salvage a concept/word allows us to keep using it mostly the same, and to assign familiar and intuitive symbols to our terms, while intensely annoying people with the fact that our definition is different from the normal one and thus constantly creates confusion.
I'm sure it's running through a lot of interpretation, but it has to. He's dealing with people who don't know or aren't open about (unclear which) the consequences of their own policies.
According to Wikipedia, the Biefeld–Brown effect was just ionic drift: https://en.wikipedia.org/wiki/Biefeld–Brown_effect#Disputes_surrounding_electrogravity_and_ion_wind
I'm not sure what Wikipedia will have to say about Charles Buhler, if his work goes anywhere, but it'll probably turn out to be more of the same.
I just wish I knew how to make this scalable (like, how do you do this on the internet?) or work even when you don't know the example person that well. If you have ideas, let me know!
Immediate thoughts (not actionable): VR socialisation and vibe-recognising AIs (models trained to predict conversation duration and recurring meetings). (But VR won't be good enough for socialisation until like 2027.) VR because it's easier to persistently record, though Apple has made great efforts to set precedents that will make that difficult, especially if you want to use eye tracking data. They've also developed trusted compute stuff that might make it possible to use the data in privacy-preserving ways.
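For concreteness, a minimal sketch of what one of those predictors might look like, with invented features and synthetic data; the real thing would need consented session logs and much richer signals than anything here:

```python
# Hypothetical sketch: a "vibe" proxy model that predicts how long a
# conversation will last from simple interaction features. Feature names
# and data are made up for illustration.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Invented features per conversation: turn count, mean reply latency,
# laughter/emoji rate, number of prior meetings between the pair.
X = rng.normal(size=(500, 4))
# Synthetic target: conversation duration in minutes. In practice this
# would come from logged (consented) VR or chat sessions.
y = 10 + 3 * X[:, 0] - 2 * X[:, 1] + 1.5 * X[:, 2] + 4 * X[:, 3] + rng.normal(scale=2, size=500)

model = GradientBoostingRegressor().fit(X, y)

# Rank candidate pairings by predicted conversation duration, as a crude
# "you two would probably vibe" signal.
candidates = rng.normal(size=(10, 4))
scores = model.predict(candidates)
print(scores.argsort()[::-1])  # best-predicted pairings first
```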
Better thoughts: just a twitterlike that has semi-private contexts. Twitter is already like this for a lot of people; it's good for finding the people you enjoy talking to. The problem with Twitter is that a lot of people, especially the healthiest ones, hold back their best material, or don't post at all, because they don't want whatever crap they say when they're just hanging out to be public and on the record forever. Simply add semi-private contexts. I will do this at some point. Iceshrimp probably will too. Mastodon might even do it. X might do it. Spritely definitely will, but they might be in the oven for a bit. Bluesky might never, though, because radical openness is a bit baked into the protocol currently, which is based, but not ideal for all applications.
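At the data-model level it doesn't take much. A rough sketch under made-up names (and it leaves out exactly the hard parts: federation, key management, moderation):

```python
# Hypothetical sketch of "semi-private contexts" for a twitterlike:
# every post belongs to a context, and a context is either public or
# scoped to an explicit member set. Structure is invented for
# illustration, not any real protocol (ActivityPub, ATProto, etc.).
from dataclasses import dataclass, field

@dataclass
class Context:
    name: str
    public: bool = False
    members: set[str] = field(default_factory=set)

    def can_read(self, user: str) -> bool:
        return self.public or user in self.members

@dataclass
class Post:
    author: str
    body: str
    context: Context

def visible_feed(posts: list[Post], viewer: str) -> list[Post]:
    """Only posts whose context the viewer can read; nothing is
    'on the record forever' outside its context."""
    return [p for p in posts if p.context.can_read(viewer)]

hangout = Context("just-hanging-out", members={"alice", "bob"})
public = Context("main", public=True)
posts = [
    Post("alice", "half-baked take, friends only", hangout),
    Post("alice", "polished take, on the record", public),
]
print([p.body for p in visible_feed(posts, "bob")])    # sees both
print([p.body for p in visible_feed(posts, "carol")])  # sees only the public one
```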
Wow. Marc Andreessen says he had meetings in DC where he was told to stop raising AI startups because it was going to be closed up in a similar way to defence tech: a small number of organisations with close government ties. He said to them, 'you can't restrict access to math, it's already out there', and he says they said, "during the cold war we classified entire areas of physics, and took them out of the research community, and entire branches of physics basically went dark and didn't proceed, and if we decide we need to, we're going to do the same thing to the math underneath AI".
So, 1: This confirms my suspicion that OpenAI leadership have also been told this. If they're telling Andreessen, they will have told Altman.
And for me that makes a lot of sense of the behavior of OpenAI, a de-emphasizing of the realities of getting to human-level, a closing of the dialog, comically long timelines, shrugging off responsibilities, and a number of leaders giving up and moving on. There are a whole lot of obvious reasons they wouldn't want to tell the public that this is a thing, and I'd agree with some of those reasons.
2: Vanishing areas of physics? A Perplexity search suggests that may be referring to nuclear science, radar, lasers, and some semiconductors. But they said "entire areas of physics". Does any of that sound like entire areas of physics? To me that phrase is strongly reminiscent of certain stories I've heard (possibly overexcited ones): physics that, let's say, could be used to make much faster missiles, missiles so fast that it's not obvious they could be intercepted even by missiles of the same kind. A technology that we'd prefer to consign to secrecy rather than use, and then later have to defend ourselves against once our adversaries develop their own. A black ball. If it is that, if that secret exists, that's very interesting for many reasons, primarily due to the success of the secrecy, and the extent to which it could very conceivably stay secret basically forever. And that makes me wonder about what might happen with some other things.
All novel information:
The medical examiner’s office determined the manner of death to be suicide and police officials this week said there is “currently, no evidence of foul play.”
Balaji’s death comes three months after he publicly accused OpenAI of violating U.S. copyright law while developing ChatGPT
The Mercury News [the writers of this article] and seven sister news outlets are among several newspapers, including the New York Times, to sue OpenAI in the past year.
The practice, he told the Times, ran afoul of the country’s “fair use” laws governing how people can use previously published work. In late October, he posted an analysis on his personal website arguing that point.
In a Nov. 18 letter filed in federal court, attorneys for The New York Times named Balaji as someone who had “unique and relevant documents” that would support their case against OpenAI. He was among at least 12 people — many of them past or present OpenAI employees — the newspaper had named in court filings as having material helpful to their case, ahead of depositions.
OpenAI has staunchly refuted those claims, stressing that all of its work remains legal under “fair use” laws.
I found that I lost track of the flow in the bullet points.
I'm aware that that's quite normal; I do it sometimes too. I also doubt it's an innate limit, and I think to some extent this is a playful attempt to make people more aware of it. It would be really cool if people could become better at remembering the context of what they're reading. Context-collapse is, like, the main problem in online dialog today.
I guess game designers never stop generating challenges that they think will be fun, even when writing. Sometimes a challenge is frustrating, and sometimes it's fun, and after looking at a lot of 'difficult' video games I think it turns out that, surprisingly often, whether it ends up being fun or frustrating is not totally in the designer's control; it's up to the player. Are they engaging deeply, or do they need a nap? Do they just want to be coddled all the way through?
(Looking back... to what extent was Portal, and the renaissance it brought to puzzle games, actually a raising of the principle "you must coddle the player all the way through, make every step in the difficulty shallow, while making them feel like they're doing it all on their own"? To what extent do writers also do this (a large extent!), and how should we feel about that?
I don't think games have to secretly coddle people. I guess it's just something a good designer needs to be capable of; it's a way of demonstrating mastery, but there are other approaches. E.g.: demonstrating easy difficulty gradations in tutorials, then letting the player choose their difficulty level from then on.)
(Yes, ironic given the subject.)
Trying to figure out what it would mean to approach something cooperatively and not cohabitively @_@
I feel like it would always be some kind of trick. The non-cohabitive cooperator invites us to never mind about building real accountability mechanisms, "we can just be good :)" they say. They invite us to act against our incentives, and whether they will act against theirs in return will remain to be seen.
Let's say it will be cooperative because cooperation is also cohabitive in this situation haha.
Conditions where a collective loss is no worse than an individual loss. A faction who's on the way to losing will be perfectly willing to risk coal extinction, and may even threaten to cross the threshold deliberately to extort other players.
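To put toy numbers on that incentive (payoffs invented): if the collective catastrophe pays a losing faction no worse than simply losing, then threatening to trigger it is never worse for them than folding.

```python
# Toy payoffs, invented for illustration: a faction headed for a loss can
# either fold or threaten to cross the threshold.
LOSE = 0.0        # payoff for quietly losing
WIN = 1.0         # payoff if the threat works and the others concede
EXTINCTION = 0.0  # payoff if the threat is carried out / goes wrong

def threaten_value(p_concede: float) -> float:
    """Expected payoff of threatening to cross the threshold."""
    return p_concede * WIN + (1 - p_concede) * EXTINCTION

for p in (0.0, 0.1, 0.5):
    print(f"p_concede={p}: fold={LOSE}, threaten={threaten_value(p):.2f}")

# Because EXTINCTION == LOSE here, threatening weakly dominates folding
# for any probability of the others conceding. Making the collective loss
# strictly worse than an individual loss (for everyone) removes that edge.
```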