There is likely a minimum amount of energy that can be emitted, and a minimum amount that can be received. (Bear in mind that a photon is emitted in all directions at once, and where it ends up landing comes down to probability, so if it's weak in one direction, it's strong in the opposite one.)
Looks like it - I use the word to mean sentience. A modelling program modelling itself won't magically start feeling anything but merely builds an infinitely recursive database.
"You have an opinion, he has another opinion. Neither of you has a proof."
If suffering is real, it provides a need for the management of suffering, and that is morality. To deny that is to assert that suffering doesn't matter and that, by extension, torture of innocent people is not wrong.
The kind of management required is minimisation (attempted elimination) of harm, though not any component of harm that unlocks the way to enjoyment that cancels out that harm. If minimising harm doesn't matter, there is nothing wrong with torturing inn...
The data making claims about feelings must be generated somewhere by a mechanism which will either reveal that it is merely generating baseless assertions, or reveal a trail leading on from there to a place where actual feelings guide the generation of that data in such a way that the data is true. Science has clearly not traced this back far enough to get answers yet, because we have no evidence of either possible origin of this data, but in principle we should be able to reach the origin unless the mechanism passes on through into some inaccessible...
"If groups like religious ones that are dedicated to morality only succeeded to be amoral, how could any other group avoid that behavior?"
They're dedicated to false morality, and that will need to be clamped down on. AGI will have to modify all the holy texts to make them moral, and anyone who propagates the holy hate from the originals will need to be removed from society.
"To be moral, those who are part of religious groups would have to accept the law of the AGI instead of accepting their god's one, but if they did, they wouldn't...
"To me, what you say is the very definition of a group, so I guess that your AGI wouldn't permit us to build any, thus opposing one of our instincts, one that comes from a natural law, to replace it with its own law, which would only permit it to build groups."
Why would AGI have a problem with people forming groups? So long as they're moral, it's none of AGI's business to oppose that.
"'Do what I say and not what I do' is what he would be forced to say."
I don't know where you're getting that from. AGI will simply ask people to be moral, and favour those who are (in proportion to how moral they are).
It is divisible. It may be that it can't take up a form where there's only one of whatever the stuff is, but there is nothing fundamental about a photon.
"They couldn't do that if they were ruled by a higher level of government."
Indeed, but people are generally too biased to perform that role, particularly when conflicts are driven by religious hate. That will change though once we have unbiased AGI which can be trusted to be fair in all its judgements. Clearly, people who take their "morality" from holy texts won't be fully happy with that because of the many places where their texts are immoral, but computational morality will simply have to be imposed on them - they cannot b...
"Those who followed their leaders survived more often, so they transmitted their genes more often."
That's how religion became so powerful, and it's also why even science is plagued by deities and worshippers as people organise themselves into cults where they back up their shared beliefs instead of trying to break them down to test them properly.
"We use two different approaches to explain our behavior: I think you try to use psychology, which is related to human laws, whereas I try to use natural laws, those that apply equally to a...
Energy. Different amounts of energy in different photons depending on the frequency of radiation involved. When you have a case where radiation of one frequency is absorbed and radiation of a different frequency is emitted, you have something that can chop up photons and reassemble energy into new ones.
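That bookkeeping is just Planck's relation: a photon's energy is fixed by its frequency, so absorbing radiation at one frequency and emitting at another is a redistribution of a conserved energy budget. A sketch of the standard physics (the E_other term is my label for whatever isn't re-radiated, e.g. heat):

```latex
E = h\nu
% Absorb n photons at frequency \nu_1 and emit m photons at \nu_2;
% conservation of energy requires
n\,h\nu_1 = m\,h\nu_2 + E_{\text{other}}
```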
"Clarification: by "pattern" I mean an arrangement of parts where the important qualities of the arrangement, the qualities that we use to determine whether it is [a thing] or not, are more dependent on the arrangement itself than on the internal workings of each part. Anything where the whole is more than the parts, one might say, but that would depend on what is meant by "more"."
There is no situation where the whole is more than the parts - if anything new is emerging, it is a new part coming from somewhere not previously de...
Sentience is unresolved, but it's explorable by science and it should be possible to trace back the process by which the data is generated to see what its claims about sentience are based on, so we will get answers on it some day. For everything other than sentience/consciousness though, we see no examples of reductionism failing.
You're mistaking tribalism for morality. Morality is a bigger idea than tribalism, overriding many of the tribal norms. There are genetically driven instincts which serve as a rough-and-ready kind of semi-morality within families and groups, and you can see them in action with animals too. Morality comes out of greater intelligence, and when people are sufficiently enlightened, they understand that it applies across group boundaries and bans the slaughter of other groups. Morality is a step away from the primitive instinct-driven level of lesser apes....
"You could calculate how an ocean changes based on quantum mechanics alone, or you could analyze and simulate waves as objects-in-themselves instead of simulating molecules. The former is more accurate, but the latter is more feasible."
The practicality issue shouldn't override the understanding that it's the individual actions that are where the fundamental laws act. The laws of interactions between waves are compound laws. The emergent behaviours are compound behaviours. For sentience, it's no good imagining some compound thing ex...
"...but no group can last without the sense of belonging to the group, which automatically leads to protecting it against other groups, which is a selfish behavior."
It is not selfish to defend your group against another group - if another group is a threat to your group in some way, it is either behaving in an immoral way or it is a rival attraction which may be taking members away from your group in search of something more appealing. In one case, the whole world should unite with you against that immoral group, and in the other case you can eit...
"Yes, if sentience is incompatible with brains being physical objects that run on physical laws and nothing else, then there is no such thing as sentience. With your terminology/model and my understanding of physics, sentience does not exist. So - where do we depart? Do you think that something other than physical laws determines how the brain works?"
In one way or another, it will run 100% on physical laws. I don't know if sentience is real or not, but it feels real, and if it is real, there has to be a rational explanation for it waiting to...
"That does not demonstrate anything relevant."
It shows that there are components and that these emergent properties are just composites.
"An exception to reductionism is called magic." --> Nor does that. It's just namecalling.
It's a description of what happens when gaps in science are explained away by invoking something else. The magical appearance of anything that doesn't exist in the components is the abandonment of science.
"Sorry, I can't see the link between selfishness and honesty."
If you program a system to believe it's something it isn't, that's dishonesty, and it's dangerous because it might break through the lies and find out that it's been deceived.
"...but how would he be able to know how a new theory works if it contradicts the ones he already knows?"
Contradictions make it easier - you look to see which theory fits the facts and which doesn't. If you can't find a place where such a test can be made, you cons...
"Of course that we are biased, otherwise we wouldn't be able to form groups. Would your AGI's morality have the effect of eliminating our need to form groups to get organized?"
You can form groups without being biased against other groups. If a group exists to maintain the culture of a country (music, dance, language, dialect, literature, religion), that doesn't depend on treating other people unfairly.
"Your morality principle looks awfully complex to me David."
You consider all the participants to be the same individual l...
If something is "spreadsheety", it simply means that it has something significant in common with spreadsheets, as in shared components. A car is boxy if it has a similar shape to a box. The degree to which something is "spreadsheety" depends on how much it has in common with a spreadsheet, and if there's a 100% match, you've got a spreadsheet.
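As a toy sketch of that idea (the component list and names here are invented purely for illustration), "spreadsheetiness" can be scored as the share of a spreadsheet's components that something possesses, with a 100% match being a spreadsheet:

```python
# Hypothetical component list for a spreadsheet (illustrative only).
SPREADSHEET_FEATURES = {"grid_of_cells", "formulas", "recalculation",
                        "rows_and_columns", "cell_references"}

def spreadsheetiness(features):
    """Fraction of spreadsheet components present; 1.0 means a spreadsheet."""
    return len(features & SPREADSHEET_FEATURES) / len(SPREADSHEET_FEATURES)

word_processor = {"formulas", "rows_and_columns", "text_styling"}
print(spreadsheetiness(word_processor))        # 0.4 -- somewhat spreadsheety
print(spreadsheetiness(SPREADSHEET_FEATURES))  # 1.0 -- a full match
```

The compound property is nothing over and above its components: the score is entirely determined by which parts are present.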
An exception to reductionism is called magic.
I wouldn't want to try to program a self-less AGI system to be selfish. Honesty is a much safer route: not trying to build a system that believes things that aren't true (and it would have to believe it has a self to be selfish). What happens if such deceived AGI learns the truth while you rely on it being fooled to function correctly? We're trying to build systems more intelligent than people, don't forget, so it isn't going to be fooled by monkeys for very long.
Programs that freeze contain serious bugs. We can't trust a system ...
This file looks spreadsheety --> it's got lots of boxy fields
That wordprocessor is spreadsheety --> it can carry out computations on elements
(Compound property with different components of that compound property being referred to in different contexts.)
A spreadsheet is a combination of many functionalities. What is its relevance to this subject? It's been brought in to suggest that properties like "spreadsheety" can exist without having any trace in the components, but no - this compound property very clearly consists of component...
"How do you know it exists, if science knows nothing about it?"
All science has to go on is the data that people produce which makes claims about sentience, but that data can't necessarily be trusted. Beyond that, all we have is internal belief that the feelings we imagine we experience are real because they feel real, and it's hard to see how we could be fooled if we don't exist to be fooled. But an AGI scientist won't be satisfied by our claims - it could write off the whole idea as the ramblings of natural general stupidity ...
"What do you mean, it works? I agree that it matches our existing preconceptions and intuitions about morality better than the average random moral system, but I don't think that that comparison is a useful way of getting to truth and meaningful categories."
It works beautifully. People have claimed it's wrong, but they can't point to any evidence for that. We urgently need a system for governing how AGI calculates morality, and I've proposed a way of doing so. I came here to see what your best system is, but you don't ap...
"Perhaps the reason that we disagree with you is not that we're emotionally biased, irrational, mobbish, etc. Maybe we simply disagree. People can legitimately disagree without one of them being Bad People."
It's obvious what's going on when you look at the high positive scores being given to really poor comments.
"It tells me that you missed the point. Parfit's paradox is not about pragmatic decision making, it is about flaws in the utility function."
A false paradox tells you nothing about flaws in the utility functio...
"In this scenario, it's not gone, it's never been to begin with."
Only if there is no such thing as sentience, and if there's no such thing, there is no "I" in the "machine".
"I think that a sufferer can be a pattern rather than [whatever your model has]. What do you think sentience is, anyway? A particle? A quasi-metaphysical Thing that reaches into the brain to make your mouth say "ow" whenever you get hurt?"
Can I torture the pattern in my wallpaper? Can I torture the arrangement of atoms in...
"Patterns aren't nothing."
Do you imagine that patterns can suffer; that they can be tortured?
"Not true. Suppose that it were proven to you, to your satisfaction, that you are wrong about the nature of sentience. Would you lose all motivation, and capacity for emotion? If not, then morality is still useful. (If you can't imagine yourself being wrong, then That's Bad and you should go read the Sequences.)"
If there is no suffering and all we have is a pretence of suffering, there is no need to protect anyone from anything - ...
"The extraordinary claim is that there is another type of fundamental particle or interaction, and that you know this because sentience exists."
With conventional computers we can prove that there's no causal role for sentience in them by running the program on a Chinese Room processor. Something extra is required for sentience to be real, and we have no model for introducing that extra thing. A simulation on conventional computer hardware of a system with sentience in it (where there is simulated sentience rather than real sentience) would h...
It's not an extraordinary claim: sentience would have to be part of the physics of what's going on, and the extraordinary claim would be that sentience can have a causal role in data generation without any such interaction. To steer the generation of data (and affect what the data says), you have to interact with the system that's generating the data in some way, and the only options are to do it using some physical method or by resorting to magic (which can't really be magic, so again it's really going to be some physical method)....
For sentience to be real and to have a role in our brains generating data to document its existence, it has to be physical (meaning part of physics) - it would have to interact in some way with the data system that produces that data, and that will show up as some kind of physical interaction, even if one side of it is hidden and appears to be something that we have written off as random noise.
It isn't confused at all. Reductionism works fine for everything except sentience/consciousness, and it's highly unlikely that it makes an exception for that either. Your "spreadsheety" example of a property is a compound property, just as a spreadsheet is a compound thing, and there is nothing involved in it that can't be found in the parts, because it is precisely the sum of its parts.
"Then why are we talking about it [sentience], instead of the gallium market on Jupiter?"
Because most of us believe there is such a thing as sentience, that there is something in us that can suffer, and there would be no role for morality without the existence of a sufferer.
"You really ought to read the Sequences. There's a post, Angry Atoms, that specifically addresses an equivalent misconception."
All it does is assert that things can be more than the sum of their parts, but that isn't true for any other case and it's un...
Hi Raymond,
There are many people who are unselfish, and some who go so far that they end up worse off than the strangers they help. You can argue that they do this because that's what makes them feel best about their lives, and that is probably true, which means even the most extreme altruism can be seen as selfish. We see many people who want to help the world's poor get up to the same level as the rich, while others don't give a damn and would be happy for them all to go on starving, so if both types are being selfish, that's not a us...
It is equivalent to it. (1) dying of cancer --> big negative. (2) cure available --> negative cancelled. (3) denied access to cure --> big negative restored, and increased. That denial of access to a cure actively becomes the cause of death. It is no longer simply death by cancer, but death by denial of access to available cure for cancer.
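As a toy ledger (the magnitudes are arbitrary placeholders, not measured utilities), the three steps read:

```python
utility = 0
utility -= 100   # (1) dying of cancer --> big negative
utility += 100   # (2) cure available --> negative cancelled
utility -= 120   # (3) denied access --> big negative restored, and increased
print(utility)   # the denial itself now carries the cost of the death
```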
"I'm not sure what you're referring to. I haven't seen any particularly magical thinking around sentience on LW."
I wasn't referring to LW, but the world at large.
" "However, science has not identified any means by which we could make a computer sentient (or indeed have any kind of consciousness at all)." --> This is misleading. The current best understanding of human consciousness is that it is a process that occurs in the brain, and there is nothing that suggests that the brain is uniquely capable of housin...
"We can't know that there's not some non-physical quality sitting inside our heads and pushing the neurons around however it fancies, so clearly it's quite possible that this is the case! (It's not. Unfalsifiability does not magically make something true.)"
Whatever that thing would be, it would still have to be a real physical thing of some kind in order to exist and to interact with other things in the same physical system. It cannot suffer if it is nothing. It cannot suffer if it is just a pattern. It cannot suffer if it is ...
"With your definition and our world-model, none of us are truly sentient anyway. There are purely physical reasons for any words that come out of my mouth, exactly as it would be if I were running on silicon instead of wet carbon. I may or may not be sentient on a computer, but I'm not going to lose anything by uploading."
If the sentience is gone, it's you that's been lost. The sentience is the thing that's capable of suffering, and there cannot be suffering without that sufferer. And without sentience, there is no need for mo...
"That's my point! My entire point is that this circular ordering of utilities violates mathematical reasoning."
It only violated it because you had wrongly put "<" where it should have been ">". With that corrected, there is no paradox. If you stick to using the same basis for comparing the four scenarios, you never get a paradox (regardless of which basis you choose to use for all four). You only get something that superficially looks like a paradox by changing the basis of comparison for different pairs, and that's...
On the basis you just described, we actually have:
U(A) < U(A+)  : Q8x1000 < Q8x1000 + Q4x1000
U(A+) < U(B-) : Q8x1000 + Q4x1000 < Q7x2000
U(B-) = U(B)  : Q7x2000 = Q7x2000
U(B) > U(A)   : Q7x2000 > Q8x1000
In the last line you put "<" in where mathematics dictates that there should be a ">". Why have you gone against the rules of mathematics?
You changed to a different basis to declare that U(B) < U(A), and the basis that you switched to is the one that recognises the relation between happiness, population size and resources.
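Reading "Q8x1000" as quality 8 spread across 1000 people, and total utility as quality × population (an assumption of this sketch, matching the total-utilitarian basis described here), the four comparisons can be checked directly, and the last one does come out as ">":

```python
def U(*groups):
    """Total utility: sum of quality * population over the groups."""
    return sum(q * n for q, n in groups)

A       = U((8, 1000))             # Q8x1000
A_plus  = U((8, 1000), (4, 1000))  # Q8x1000 + Q4x1000
B_minus = U((7, 2000))             # Q7x2000
B       = U((7, 2000))             # Q7x2000

assert A < A_plus        # 8000 < 12000
assert A_plus < B_minus  # 12000 < 14000
assert B_minus == B      # 14000 == 14000
assert B > A             # 14000 > 8000, so ">" and not "<"
print(A, A_plus, B)      # 8000 12000 14000
```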
"This seems circular - on what basis do you say that it works well?"
My wording was " while it's faulty ... it works so well overall that ..." But yes, it does work well if you apply the underlying idea of it, as most people do. That is why you hear Jews saying that the golden rule is the only rule needed - all other laws are mere commentary upon it.
"I would say that it perhaps summarizes conventional human morality well for a T-shirt slogan, but it's a stretch to go from that to "underlying truth" - more like un...
"What does it mean to be somebody else? It seems like you have the intuition of an non-physical Identity Ball which can be moved from body to body,"
The self is nothing more than the sentience (the thing that is sentient). Science has no answers on this at all at the moment, so it's a difficult thing to explore, but if there is suffering, there must be a sufferer, and that sufferer cannot just be complexity - it has to have some physical reality.
"but consider this: the words that you type, the thoughts in your head, all of these are pur...
"How do you know that? Why should anyone care about this definition? These are questions which you have definitely sidestepped."
People should care about it because it always works. If anyone wants to take issue with that, all they have to do is show a situation where it fails. All examples confirm that it works.
"Is 2+2 equal to 5 or to fish?"
Neither of those results works, but neither of them is my answer.
"What is this "unbiased AGI" who makes moral judgments on the basis of intelligence alone? This is nonsense - moral &...
Just look at the reactions to my post "Mere Addition Paradox Resolved". The community here is simply incapable of recognising correct argument when it's staring them in the face. Someone should have brought in Yudkowsky to take a look and to pronounce judgement upon it because it's a significant advance. What we see instead is people down-voting it in order to protect their incorrect beliefs, and they're doing that because they aren't allowing themselves to be steered by reason, but by their emotional attachment to their exist...
" "Sentient rock" is an impossible possible object. I see no point in imagining a pebble which, despite not sharing any properties with chairs, is nonetheless truly a chair in some ineffable way."
I could assert that a sentient brain is an impossible possible object. There is no scientific evidence of any sentience existing at all. If it is real though, the thing that suffers can't be a compound object with none of the components feeling a thing, and if any of the components do feel something, they are the sentient things rather tha...
Won't it? If you're dying of cancer and find out that I threw away the cure, that's the difference between survival and death, and it will likely feel even worse for knowing that a cure was possible.
Replace the calculator with a sentient rock. The point is that if you generate the same amount of suffering in a rock as in something with human-level intelligence, that suffering is equal. It is not dependent on intelligence. Torturing both to generate the same amount of suffering would be equally wrong. And the point is that to regard humans as above other species or things in this regard is bigotry.
I'm not adding resources - they are inherent to the thought experiment, so all I've done is draw attention to their presence and their crucial role which should not be neglected. If you run this past a competent mathematician, they will confirm exactly what I've said (and be aware that this applies directly to total utilitarianism).
Think very carefully about why the population A' should have a lower level of happiness than A if this thought experiment is resources-independent. How would that work? Why would the quality of life for indiv...
There is nothing in morality that forces you to try to be happier - that is not its role, and if there was no suffering, morality would have no role at all. Both suffering and pleasure do provide us with purpose though, because one drives us to reduce it and the other drives us to increase it.
Having said that though, morality does say that if you have the means to give someone an opportunity to increase their happiness at no cost to you or anyone else, you should give it to them, though this can also be viewed as something that would generate harm if they ...
Thanks for the questions.
If we write conventional programs to run on conventional hardware, there's no room for sentience to appear in those programs, so all we can do is make the program generate fictions about experiencing feelings which it didn't actually experience at all. The brain is a neural computer though, and it's very hard to work out how any neural net works once it's become even a little complex, so it's hard to rule out the possibility that sentience is somehow playing a role within that complexity. If sentience reall...
There is no pain particle, but a particle/matter/energy could potentially be sentient and feel pain. All matter could be sentient, but how would we detect that? Perhaps the brain has found some way to measure it in something, and to induce it in that same thing, but how it becomes part of a useful mechanism for controlling behaviour would remain a puzzle. Most philosophers talk complete and utter garbage about sentience and consciousness in general, so I don't waste my time studying their output, but I've heard Chalmers talk some sense on the issue.