cousin_it

> ideally the only 1/4″ cables onstage are short runs to DIs

And all the pedalboard stuff that happens before the DI. But mostly I agree.

Btw, do you already know that a piezo signal is much improved by a preamp with >1 megohm input impedance? I figured that out with my electric cello.
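
To put a rough number on why that helps (a back-of-the-envelope sketch, assuming a piezo capacitance of about 10 nF; real pickups vary a lot, so treat the figures as illustrative): the piezo behaves roughly like a voltage source in series with that capacitance, so the preamp's input impedance R forms a first-order high-pass filter with corner frequency f_c = 1/(2πRC).

```python
from math import pi

# Toy model: a piezo pickup looks roughly like a voltage source in series
# with a small capacitance, so the preamp's input impedance R forms a
# first-order high-pass filter with cutoff f_c = 1 / (2 * pi * R * C).
C = 10e-9  # assumed piezo capacitance, ~10 nF (varies a lot by pickup)

for R in (100e3, 1e6, 10e6):  # preamp input impedance in ohms
    f_c = 1 / (2 * pi * R * C)
    print(f"R = {R/1e6:>4.1f} MΩ  ->  cutoff ≈ {f_c:6.1f} Hz")

# R =  0.1 MΩ  ->  cutoff ≈  159.2 Hz   (low end audibly thinned)
# R =  1.0 MΩ  ->  cutoff ≈   15.9 Hz   (full range passes)
# R = 10.0 MΩ  ->  cutoff ≈    1.6 Hz
```

Anything much below 1 MΩ pushes the corner up into the instrument's low range and thins the sound; at 1 MΩ and above the corner drops below it. With a smaller capacitance the corner moves up correspondingly, which makes high input impedance matter even more.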

I think there's a worldwide trend toward more authoritarian leaders, which contributed to both these events. And it should raise our probability of e.g. Turkey or China doing something silly. But where this trend comes from, I'm not sure. It certainly predates the current AI wave. It could be due to social media making people more polarized or something. But then again there were plenty of worse dictators in history, long before social media or electricity. So maybe what's happening now is regression to the mean, and nice democracy was an anomaly in place and time.

Yeah. I remember where I was and how I felt when covid hit in 2020, and when Russia attacked Ukraine in 2022. This tariff announcement was another event in the same series.

And it all seems so stupidly self-inflicted. Russia's economy was booming until Feb 2022, and the US economy was doing fine until Feb 2025. Putin-2022 and Trump-2025 would've done better for their countries by simply doing nothing. Maybe this shows the true value of democratic checks and balances: most of the time they add overhead, but sometimes they prevent some exceptionally big and stupid decision, and that pays for all the overhead and then some.

Your examples sound familiar to me too, but after rereading your comment and mine, maybe it all can be generalized in a different way. Namely, that internal motivation leads to a low level of effort: reading some textbooks now and then, solving some exercises, producing some small things. It still feels a bit like staying in place. Whereas it takes external motivation to actually move forward with math, or art, or whatever - to spend lots of effort and try to raise my level every day. That's how it feels for me. Maybe some people can do it without external motivation, or maybe they lucked into getting external motivation in the right way, I don't know.

I agree feedback is a big part of it. For example, the times in my life when I've been most motivated to play musical instruments were when I had regular opportunities to play in front of people. Whenever that disappeared, the interest went away too.

But also I think some of it is sticky, or due to personality factors. We could even say it's not about willpower at all, but about value differences. Some people are just more okay with homeostasis, staying at a certain level (which can be lower or higher for different people) and using only as much effort as needed for that. While others keep climbing and applying effort without ever reaching a level that lets them relax. Many billionaires seem to be of that second type. I'm more of the first type, with many of my active periods being prompted by external changes, threats to homeostasis. It's clear that type 2 achieves more than type 1, but it's not clear which type is happier and whether one should want to switch types.

cousin_it*Ω9190

Good post. But I've thought about this a fair bit, and I think I disagree with the main point.

Let's say we talk about two AIs merging. Then the tuple of their expected utilities from the merge had better be on the Pareto frontier, no? Otherwise they'd just do a better merge that gets them onto the frontier. Which specific point on the frontier is a matter of bargaining, but the fact that they want to hit the frontier isn't; it's a win-win. And the merges that get them to the frontier are exactly those that output an EUM agent, maximizing some linear combination of their utilities. If the point they want to hit is in a flat region of the frontier, the merge will involve coinflips to choose which EUM agent to become; and if it's curvy at that point, the merge will be deterministic. For realistic agents who have more complex preferences than just linearly caring about one cake, I expect the frontier will be curvy, so a deterministic merge into an EUM agent will be the best choice.
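
A toy illustration of the frontier/EUM correspondence (the outcomes and utility numbers below are made up for the example): sweeping the weight in a fixed linear combination of the two utilities picks out exactly the Pareto-optimal outcomes, and a tie at some weight marks a flat region where the bargain may come down to a coinflip between the tied EUM agents.

```python
# Toy example with made-up numbers: outcomes the merged agent could commit to,
# each giving an expected-utility pair (u1, u2) to the two original AIs.
outcomes = {
    "all_A":    (10.0, 0.0),
    "all_B":    (0.0, 10.0),
    "split":    (7.0, 7.0),
    "wasteful": (3.0, 3.0),   # Pareto-dominated, never chosen
}

# An EUM merge maximizes some fixed linear combination w*u1 + (1-w)*u2.
# Sweeping the weight w traces out the Pareto frontier of the (convex hull
# of the) feasible set; a tie at some w marks a flat region, where the
# bargain may need a coinflip between the tied EUM agents.
for w in (0.0, 0.3, 0.5, 0.7, 1.0):
    scores = {name: w * u1 + (1 - w) * u2 for name, (u1, u2) in outcomes.items()}
    best = max(scores.values())
    winners = [name for name, s in scores.items() if abs(s - best) < 1e-9]
    print(f"w = {w:.2f}: best EUM choice(s) = {winners}")
```

With these numbers, w = 0.3 and w = 0.7 land on flat edges of the hull, where two outcomes tie and the choice between the tied EUM agents is exactly the coinflip case; strictly curvy points give a unique deterministic answer.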

cousin_it*101

"Apparatchik" in the USSR was some middle-aged Ivan Ivanovich who'd yell at you in his stuffy office for stepping out of line. His power came from the party apparatus. While the power of Western activists is the opposite: it comes from civil society, people freely associating with each other.

This rhetorical move, calling a Western thing by an obscure and poorly fitting Soviet name, is a favorite of Yarvin: "Let's talk about Google, my friends, but let's call it Gosplan for a moment. Humor me." In general I'd advise people to stay away from his nonsense, it's done enough harm already.

cousin_it*20

The objection I'm most interested in right now is the one about induced demand (that's not the right term, but let's roll with it). Like, let's say we build many cheap apartments in Manhattan. Then the first bidders for them will be rich people - from all over the world! - who would love to get a Manhattan apartment for a bargain price. The priced-out locals will stay just as priced out, shuffled to the back of the line, because there are plenty of rich people in the world willing to outbid them. Maybe if we build very many apartments, and not just in Manhattan but everywhere, the effect will eventually run out; but it'll take very many indeed.
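
To make the worry concrete, here's a toy auction sketch (every number in it is invented): new apartments go to the highest bidders, the clearing price is set by the highest losing bid, and locals only start winning once the number of new units exceeds the pool of outside bidders who can pay more than they can.

```python
import random

random.seed(0)

# Toy auction model, all numbers invented: N identical new apartments go to
# the N highest bidders, and the clearing price is the highest losing bid.
N_RICH_OUTSIDERS = 50_000   # assumed pool of wealthy non-local buyers
N_LOCALS = 100_000          # assumed pool of local buyers with smaller budgets

outsider_bids = [random.uniform(1.0, 5.0) for _ in range(N_RICH_OUTSIDERS)]  # $M
local_bids = [random.uniform(0.3, 1.0) for _ in range(N_LOCALS)]             # $M

def clearing_price_and_local_share(n_new_units):
    bids = sorted(
        [(b, "outsider") for b in outsider_bids] + [(b, "local") for b in local_bids],
        reverse=True,
    )
    winners = bids[:n_new_units]
    price = bids[n_new_units][0]  # highest losing bid sets the market price
    local_share = sum(1 for _, who in winners if who == "local") / n_new_units
    return price, local_share

for n in (10_000, 40_000, 60_000, 120_000):
    price, share = clearing_price_and_local_share(n)
    print(f"{n:>7} new units: price ≈ ${price:.2f}M, locals win {share:.0%} of them")
```

With these invented numbers, the price stays high and locals win essentially nothing until supply outgrows the pool of richer outsiders, which is the "it'll take very many indeed" point.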

The obvious fix is to put a thumb on the scale somehow, for example sell these cheap apartments only as primary residences. But then we lose the theoretical beauty of "just build more", and we really should figure out what mix of "just build more" and "put a thumb on the scale" is the most cost-efficient for achieving what we want. Maybe some thumb on the scale will even give us what we want without building more, since there's a lot of empty housing and non-primary housing.

cousin_it*42

Maybe you're pushing your proposal a bit much, but anyway, as creative writing it's interesting to think about such scenarios. I had a sketch for a weird utopia story where, just before the singularity, time stretches out for humans because they're being run at increasing clock speed, and the Earth's surface also becomes much larger and keeps growing. So humanity becomes this huge, fast-running civilization living inside an AI (I called it "Quetzalcoatl", not sure why) and advising it how it should act in the external world.

My wife used to have a talking doll that said one phrase in a really annoying voice. Well, at some point the doll short-circuited or something, and started turning on at random times. In the middle of the night for example it would yell out its phrase and wake everyone up. So eventually my wife took the doll to the garbage dump. And on the way back she couldn't stop thinking about the doll sitting there in the garbage, occasionally yelling out its phrase: "Let's go home! I'm already hungry!" This isn't creative writing btw, this actually happened.
