In response to comment by Calvin on Thought Crimes
Comment author: Coscott 15 January 2014 05:54:34AM 2 points [-]

I wouldn't say they are doomed to fail because it is a slippery slope to *NO THINKING ABOUT RESISTANCE*, but I would say that is a good reason to object to thought-taboo devices.

I think a law stopping you from creating a second copy of a human or creating a new human counts as a thought crime, if the copy or new human is being run in your mind.

In response to comment by Coscott on Thought Crimes
Comment author: Calvin 15 January 2014 06:12:20AM *  1 point [-]

I guess it is kind of a slippery slope, indeed. There are probably ways in which it could work only as intended (hardwired chip or whatever), but allowing other people to block your thoughts is only a couple of steps from turning you into their puppet.

As for simulation as thought crime, I am not sure. If they need to peek inside your brain to check whether you are running illegally constructed internal simulations, the government can just simulate a copy of you (with a warrant, I guess), either torture it or explicitly read its mind (either way terrible) to find out what is going on, and then erase it (I mean murder it, but the government does it, so it is kind of better, except not really).

Your approval of such measures probably depends on the relative values that you assign to freedom and privacy.

In response to Thought Crimes
Comment author: Calvin 15 January 2014 05:45:05AM *  2 points [-]

The way I can see it in sci-fi terms:

If a human mind is the first copy of a brain that has been uploaded to a computer, then it deserves the same rights as any human. There is a rule against running more than one instance of the same person at the same time.

A human mind created on my own computer from first principles, so to speak, does not have any rights, but there is also a law in place to prevent such agents from being created, as human minds are dangerous toys.

Plans to enforce thought-taboo devices are likely to fail, as no self-respecting human being would allow such a crude intrusion of third parties into their own thought process. I mean, it starts with NO THINKING ABOUT NANOTECHNOLOGY and in time changes to *NO THINKING ABOUT RESISTANCE*.

EDIT:

Also, assuming that there is really a need to extract some information from an individual, I would reluctantly grant the government the right to create a temporary copy of that individual to be interrogated, interrogate (i.e. torture) the copy, and then delete it shortly afterwards. It is squicky, but in my head superior to leaving the original target with memories of the interrogation.

Comment author: Moss_Piglet 15 January 2014 04:12:43AM *  5 points [-]

From the OP:

What are your best arguments against the reality/validity/usefulness of IQ?

-

appeals that would limit testing or research even if IQ's validity is established are not [welcome].

Emphasis mine.

We all know the standard "that's racist" argument already; newerspeak is clearly asking for a factual reason why measures of general intelligence are not real, not valid, or not useful. Not to mention that the post did not make any claims about, or even mention, the heritability of intelligence or race/gender differences in intelligence.

Comment author: Calvin 15 January 2014 05:06:02AM *  -1 points [-]

Let's make a distinction between "I have a prejudice against you" and "I know something about you".

Assuming I know that IQ is a valid and objective measure, I can use it to judge your cognitive skills, and your opinion about the result does not matter to anyone, any more than your own opinion about your BMI does.

Assuming that I am not sure whether IQ is valid, I would rather refrain from reaching any conclusions or acting as if it actually mattered (because I am afraid of the consequences), thus making it useless for me in my practical day-to-day life.

Comment author: ephion 14 January 2014 01:15:31AM 1 point [-]

Speaking of the Principle of Charity...

Comment author: Calvin 14 January 2014 01:18:20AM 2 points [-]

Yes, I do stand corrected.

Comment author: Torello 13 January 2014 11:28:29PM 3 points [-]

Doesn't cryonics (and subsequent rebooting of a person) seem obviously too difficult? People can't keep cars running indefinitely, wouldn't keeping a particular consciousness running be much harder?

I hinted at this in another discussion and got downvoted, but it seems obvious to me that the brain is the most complex machine around, so wouldn't it be tough to fix? Or does it all hinge on the "foom" idea where every problem is essentially trivial?

Comment author: Calvin 13 January 2014 11:33:15PM *  2 points [-]

Most of the explanations found on cryonics sites do indeed seem to base their arguments on the hopeful claim that, given the nanotechnology and science of the future, every problem connected to (as you say) rebooting would become essentially trivial.

Comment author: lsparrish 13 January 2014 09:35:11PM 0 points [-]

It feels to me like the general pro-cryo advocacy here would be a bit of a double standard, at least when compared to general memes of effective altruism, shutting up and multiplying, and saving the world. If I value my life equally to the lives of others, it seems pretty obvious that there's no way by which the money spent on cryonics would be a better investment than spending it on general do-gooding.

I think the scale on which it is done is the main thing here. Currently, cryonics is performed so infrequently that there isn't much infrastructure for it. So it is still fairly expensive compared to the amount of expected utility -- probably close to the value implied by regulatory tradeoffs ($5 million per life). On a large, industrial scale I expect it to be far better value than anything Givewell is going to find.

Comment author: Calvin 13 January 2014 09:45:42PM 0 points [-]

This is a good argument, capable of convincing me of the pro-cryonics position, if and only if someone can follow this claim with evidence pointing to a high probability estimate that preservation and restoration will become possible within a reasonable time period.

If it so happens that cryopreservation fails to prevent information-theoretic death, then the value of your cryo-warehouses filled with corpses will amount to exactly $0 (unless you also preserve the organs for transplants).

Comment author: DaFranker 13 January 2014 09:03:53PM 1 point [-]

Thanks for the response! This puts several misunderstandings I had to rest.

P.S. Why programming of Azathoth? In my mind it makes it sound as if the desire to have children were something intrinsically bad.

Programming of Azathoth because Azathoth doesn't give a shit about what you wish your own values were. Therefore what you want has no impact whatsoever on what your body and brain are programmed to do, such as making some humans want to have children even when every single aspect of it is negative (e.g. painful sex, painful pregnancy, painful birthing, hell to raise children, hellish economic conditions, an absolutely horrible life for the child, etc., as we've seen some examples of in slave populations historically).

Comment author: Calvin 13 January 2014 09:39:04PM 0 points [-]

I suspect our world views might differ a bit, as I don't wish that my values were any different than they are. Why should I?

If Azathoth decided to instill the value that having children is somehow desirable deep into my mind, then I am very happy that, as a first-world parent, I have all the resources I need to turn it into a pleasant endeavor with a very high expected value (a happy new human who hopefully likes me and hopefully shares my values, though I don't have much confidence in the second bet).

Comment author: Chrysophylax 13 January 2014 08:47:48PM 0 points [-]

No, but cows, pigs, hens and so on are being systematically chopped up for the gustatory pleasure of people who could get their protein elsewhere. For free-range, humanely slaughtered livestock you could make an argument that this is a net utility gain for them, since they wouldn't exist otherwise, but the same cannot be said for battery animals.

Comment author: Calvin 13 January 2014 09:22:46PM 0 points [-]

In this case, I concur that your argument may be true if you include animals in your utility calculations.

While I do have reservations about causing suffering in humans, I don't explicitly include animals in my utility calculations. And while I don't support causing suffering for the sake of suffering, I have no ethical qualms about products made with animal fur, animal testing, or factory farming, so with regard to pigs, cows, and chickens, I am a utility monster.

Comment author: solipsist 13 January 2014 04:32:21PM -1 points [-]

I didn't mean for the hermit to be sad, just less happy than the child.

Comment author: Calvin 13 January 2014 04:40:00PM 0 points [-]

Ah, I must have misread your presentation, but English is not my first language, so sorry about that.

I guess if I were a particularly well-organized, ruthlessly effective utilitarian, as some people here are, I could now note down in my notebook that he is happier than I previously thought, and that it is moral to kill him if, and only if, the couple gives birth to three, not two, happy children.

Comment author: Chrysophylax 13 January 2014 03:04:03PM 1 point [-]

We live in a world full of utility monsters. We call them humans.

Comment author: Calvin 13 January 2014 04:25:37PM 3 points [-]

Am I to assume that all the sad old hermits of this world are being systematically chopped up for spare parts granted to deserving and happy young people, while well-meaning utilitarians hide this sad truth from us, so that I don't become upset about those atrocities currently being committed in my name?

We are not even close to being utility monsters, and personally I know very few people whom I would consider actual utilitarians.
