mormon2 comments on Less Wrong Q&A with Eliezer Yudkowsky: Ask Your Questions - Less Wrong

16 Post author: MichaelGR 11 November 2009 03:00AM


Comment author: mormon2 15 November 2009 03:50:15AM 8 points [-]

"and then take out whatever time was needed to collect the OB/LW posts in our discussion into a sequence with summaries."

Why? No one in the academic community would spend that much time reading all that blog material for answers that would be better given concisely in a published academic paper. So why not spend the time to write one? Unless you think you are enough of an expert in the field not to need the academic community. If that is the case, where are your publications, where are your credentials, where is the proof of this expertise ("expert" being a term applied on the basis of actual knowledge and accomplishments)?

"Since I don't expect senior traditional-AI-folk to pay me any such attention short of spending a HUGE amount of effort to get it and probably not even then, I haven't, well, expended a huge amount of effort to get it."

Why? If you expect to make FAI you will undoubtedly need the academic community's help, unless you plan to do this whole project by yourself or with purely amateur help. I think you would admit that in its current form SIAI has a 0 probability of creating FAI first. That being the case, your best hope is to convince others that the cause is worthwhile, and for that you should be looking at the professional and academic AI community.

I am sorry, but I prefer to be blunt; that way there is no mistaking my meaning.

Comment author: Alicorn 15 November 2009 02:38:32PM 3 points [-]

I think you would admit that in its current form SIAI has a 0 probability of creating FAI first.

No.

Comment author: wedrifid 15 November 2009 10:50:09AM *  0 points [-]

Since I don't expect senior traditional-AI-folk to pay me any such attention short of spending a HUGE amount of effort to get it and probably not even then, I haven't, well, expended a huge amount of effort to get it.

Why? If you expect to make FAI you will undoubtedly need the academic community's help, unless you plan to do this whole project by yourself or with purely amateur help. ...

That 'probably not even then' part is significant.

That being the case, your best hope is to convince others that the cause is worthwhile, and for that you should be looking at the professional and academic AI community.

Now that is an interesting question. To what extent would Eliezer say that conclusion followed? Certainly less than the implied '1' and probably more than '0' too.

Comment author: mormon2 15 November 2009 04:59:01PM 3 points [-]

"Since I don't expect senior traditional-AI-folk to pay me any such attention short of spending a HUGE amount of effort to get it and probably not even then, I haven't, well, expended a huge amount of effort to get it.

Why? If you expect to make FAI you will undoubtedly need the academic community's help, unless you plan to do this whole project by yourself or with purely amateur help. ..."

"That 'probably not even then' part is significant."

My implication was that the idea that he can create FAI completely outside the academic and professional world is ridiculous coming from an organization like SIAI, which has neither the people nor the money to get the job done. In fact SIAI doesn't have enough money to pay for the computing hardware to make human level AI.

"Now that is an interesting question. To what extent would Eliezer say that conclusion followed? Certainly less than the implied '1' and probably more than '0' too."

If he doesn't agree with it now, I am sure he will when he runs into the problem of not having the money to build his AI, or not having enough time in the day to solve the problems that constructing it will raise. Not to mention that when you close yourself off to outside influence that much, you often end up with ideas riddled with problems that someone on the outside, had they looked at the idea, would have pointed out.

If you have never taken an idea from idea to product this can be hard to understand.

Comment author: Eliezer_Yudkowsky 15 November 2009 08:49:45PM 8 points [-]

In fact SIAI doesn't have enough money to pay for the computing hardware to make human level AI.

And so the utter difference of working assumptions is revealed.

Comment author: CannibalSmith 17 November 2009 12:42:59PM *  1 point [-]

Back-of-the-napkin math:
10^4 neurons per supercomputer
10^11 neurons per brain
10^7 supercomputers per brain
1.3*10^6 dollars per supercomputer
1.3*10^13 dollars per brain
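The estimate above can be checked in a few lines. Note that all three input figures (neurons simulated per supercomputer, neurons per brain, price per supercomputer) are the rough assumptions quoted in the thread, not measured values:

```python
# Back-of-the-napkin cost estimate for neuron-level brain simulation,
# using the figures assumed in the comment above (not measured values).

neurons_per_supercomputer = 10**4   # assumed simulation capacity of one machine
neurons_per_brain = 10**11          # rough neuron count of a human brain
dollars_per_supercomputer = 1.3e6   # assumed price of one supercomputer

supercomputers_per_brain = neurons_per_brain // neurons_per_supercomputer
dollars_per_brain = supercomputers_per_brain * dollars_per_supercomputer

print(supercomputers_per_brain)              # 10000000, i.e. 10^7 machines
print(f"{dollars_per_brain:.1e} dollars")    # 1.3e+13 dollars
```

The conclusion scales linearly with each assumption, so a 100x improvement in neurons simulated per machine would cut the price tag by the same factor.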

Edit: Disclaimer: Edit: NOT!

Comment author: wedrifid 17 November 2009 02:19:22PM *  2 points [-]

10^4 neurons per supercomputer
10^11 neurons per brain

Another difference in working assumptions.

Comment author: CannibalSmith 17 November 2009 04:43:45PM *  0 points [-]

It's a fact stated by the guy in the video, not an assumption.

Comment author: wedrifid 17 November 2009 06:47:29PM 0 points [-]

No need to disclaim; your figures are sound enough, and I took them as a demonstration of another rather significant difference between the assumptions of Eliezer and mormon2 (or mormon2's sources).

Comment author: wedrifid 15 November 2009 08:06:11PM *  0 points [-]

If you have never taken an idea from idea to product this can be hard to understand.

I have. I've also failed to take other ideas to products, so I agree with that part of your position, just not with the argument as it applies in this context.