Sinclair Chen

manifold.markets/Sinclair

Comments

what's the deal with bird flu? do you think it's gonna blow up?

this is too harsh. love is a good feeling actually. it is something that many people deeply and truly want.

it is good to create mental frameworks around common human desires, frameworks which are congruent with a philosophy of truthseeking.

interesting. what if she has her memories and some abstract theory of what she is, and that theory is about as accurate as anyone else's, but her experiences are not very vivid at all? she's just going through the motions, running on autopilot all the time - like when people slip into a kind of trance while driving.

You are definitely right about the tradeoff between my direct sensory experience and other things my brain could be doing, like calculation or imagination. I hope that with practice or clever tool use I will get better at something like running multiple modes at once, task-switching faster between modes, or having a more accurate yet more compressed integrated gestalt self.

tbh, my hidden motivation for writing this is that I find it grating when people say we shouldn't care how we treat AI because it isn't conscious. this logic rests on the assumption that consciousness == moral value.

if tomorrow you found out that your mom has stopped experiencing the internal felt sense of "I", would you stop loving her? would you grieve as if she were dead or comatose?

I kinda feel like I literally have more subjective experience after experiencing ego death/rebirth. I suspect that humans vary quite a lot in how often they are conscious, and to what degree. And if you believe, as I do, that consciousness is ultimately algorithmic in nature (like, in the Surfing Uncertainty predictive-processing view, a human-modeling process which models itself in order to transmit prediction-actions), it would not be crazy for it to be a kind of mental motion which we sometimes do more or less of, and which some people lack entirely.

I don't draw any moral conclusions about this because I don't ethically value people or things in proportion to how conscious they are. I love the people I love, I'm certainly happy they are alive, and I would be happier for them to be more alive, but this is not why I love them.

Uh, there are minds. I think you and I both agree on this. Not really sure what the "what if no one existed" thought experiment is supposed to gesture at. I am very happy that I exist and that I experience things. I agree that if I didn't exist then I wouldn't care about things.

I think your method double-counts the utility. In the absurd case, if I care about you and you care about me, and I care about you caring about me caring about you... then two people who like each other enough have infinite value, unless the repeating sum converges. How likely is it that the converging sum works out exactly so that a selfish person should love all humans equally? Also, even if it were balanced, if two well-connected socialites in Latin America broke up, this would significantly change the moral calculus for millions of people!
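To make the convergence point concrete, here is a toy version of the sum, where $u$ (my direct utility from you) and the per-level discount factor $d$ on recursive caring are symbols I'm introducing for illustration, not anything from the thread:

$$V = u + ud + ud^2 + \dots = u\sum_{k=0}^{\infty} d^k = \frac{u}{1-d}, \qquad 0 \le d < 1.$$

The series diverges only when $d \ge 1$, i.e. when each layer of "caring about caring" counts at least as much as the one before; any strict discount gives a finite value. And the specific $d$ that would make a selfish person's totals come out as equal love for all humans is a single point in a continuum, so landing on it exactly would be a coincidence.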

Being real for a moment, I think my friends (degree 1) are happier if I am friends with their friends (degree 2), want us to be at least on good terms, and would be sad if we fought. But my friends don't care that much how I feel about the friends of their friends (degree 3).

we completely dominate dogs. society treats them well because enough humans love dogs.

I do think that cooperation between people is the origin of religion and of its moral rulesets, which create tiny little societies that can hunt stags.

I definitely think that if I were not conscious then I would not coherently want things. But the fact that conscious minds are the only things that can truly care does not mean that conscious minds are the only things we should terminally care about.

The close-circle composition isn't enough to justify Singerian altruism from egoist assumptions, because of the value falloff: with each degree of connection, I love the stranger less.
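As a toy illustration of that falloff (the per-degree factor $f$ is made up for this sketch, not a number claimed anywhere above): if love decays geometrically with social distance,

$$\text{weight on a degree-}k\text{ stranger} = f^k, \qquad 0 < f < 1,$$

then with, say, $f = 0.1$, a degree-4 stranger gets only $10^{-4}$ of the weight of a direct friend, far short of the equal weighting that Singerian altruism requires.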

I didn't use the word "ethics" in my comment, so are you making a definitional statement to distinguish between [universal value system] and [subjective value system], or just authoritatively saying that I'm wrong?

Are you claiming moral realism? I don't really believe that. If "ethics" is global, why should I care about "ethics"? Sorry if that sounds callous; I do actually care about the world, I'm just trying to pin down what you mean.
