mormon2 (-30)

"How would you act if you were Eliezer?"

If I made claims of having a TDT, I would post the math. I would publish papers. I would make sure I had accomplishments to back up the authority with which I speak. I would not spend a single second blogging about rationality. If I used a blog at all, it would be to discuss the current status of my AI work and to give a select group of intelligent people a place to read and comment on it. If I thought FAI was that important, I would spend as much time as possible finding the best people to work with, and I would never resort to a blog to attract the right sort of people (I cite LW as evidence of the failure of blogging to attract the right people).

Oh, and for the record, I would never start a non-profit to do FAI research. I would also do away with the Singularity Summit and replace it with more AGI conferences, and I would do away with most of SIAI's programs, redirecting them, and the money they cost, toward researchers and scientists along with some devoted angel funders.

mormon2 (20)

I am going to respond to the overall direction of your responses.

That is feeble, and for those who don't understand why, let me explain.

Eliezer works for SIAI, a non-profit where his pay depends on donations. Many people on LW are interested in SIAI, some even donate to it, and others potentially could. When your pay depends on convincing people that your work is worthwhile, it is always worth justifying what you are doing. This becomes even more important when it looks like you're distracted from what you are being paid to do. (If you have ever worked with a VC and their money, you'll know what I mean.)

When it comes to ensuring that SIAI continues to pay you, especially when you are its FAI researcher, justifying why you are writing a book on rationality that in no way solves FAI becomes extremely important.

EY, ask yourself this: what percentage of the people who are interested in SIAI and donate are interested in FAI? Then ask what percentage are interested in rationality, with no clear plan for how that gets to FAI. If the answer to the first is greater than the second, you have a big problem, because one could interpret the time you spend writing this book on rationality as a waste of donated money unless there is a clear reason why rationality books get you to FAI.

P.S. If you want to educate people to help you out, as someone speculated, you'd be better off teaching them computer science and mathematics.

Remember, my post drew no conclusions, so as for Yvain: I have cast no stones; I merely asked questions.

mormon2 (10)

How am I a troll? Did I not make a valid point? Have I not made other valid points? You may disagree with how I say something, but that in no way makes me a troll.

The intention of my comment was to find out what the hope for EY's FAI goals is based on here. I was trying to make the point with the zero, zilch idea... that the faith in EY making FAI is essentially blind faith.

mormon2 (-20)

"As a curiosity, having one defector in a group who is visibly socially penalized is actually a positive influence on those who witness it (as distinct from having a significant minority, which is a negative influence.) I expect this to be particularly the case when the troll is unable to invoke a similarly childish response."

Wow, I say one negative thing and all of a sudden I am a troll.

Let's consider the argument behind my comment:

Premises:

- Has EY ever constructed an AI of any form: FAI, AGI, or narrow AI?
- Does EY have any degrees in fields relevant to FAI?
- Is EY backed by a large, well-funded research organization?
- Could EY get a technical job at such an organization?
- Does EY have a team of respected experts helping him make FAI?
- Does EY have a long list of technical, math- and algorithm-rich publications in any area regarding FAI?
- Has EY ever published a single math paper in a real math journal, for example one of the AMS journals?
- Has he published findings on FAI in a venue like IEEE?

The answer to each of these questions is no.

The final question to consider is this: if EY's primary goal is to create FAI first, why is he spending most of his time blogging and working on a book on rationality (which would never be taken seriously outside of LW)?

Answer: this is counter to his stated goal.

So, with all the answers being in the negative, what hope should anyone here hold for EY making FAI? Answer: zero, zilch, none, zip...

If you have evidence to the contrary, for example proof that not all the answers to the above questions are no, then please present it... otherwise I rest my case. If you come back with this lame "troll" response, I will consider my case proven, closed, and done. Oh, and to be clear: I have no doubt I will fail to sway anyone from the LW/EY worship cult, but the exercise is useful for other reasons.
