LESSWRONG

Madbadger

Comments
Attention Lurkers: Please say hi
Madbadger · 15y · 70

Hi! 8-)

Fundamentally Flawed, or Fast and Frugal?
Madbadger · 16y · 10

Here is an example of an amusing "Fast and Frugal" heuristic for evaluating claims with a lot of missing knowledge and required computation: http://xkcd.com/678/

Fundamentally Flawed, or Fast and Frugal?
Madbadger · 16y · 10

Yeah, sometimes you don't get the tools and information you need to make the best decision until after you've made it. 8-)

Fundamentally Flawed, or Fast and Frugal?
Madbadger · 16y · 50

It is worth remembering that human computation is a limited resource: we just don't have the capacity to subject everything to Bayesian analysis. So we should save our best rationality for what's important, and use heuristics to decide what kind of chips to buy at the grocery store.

Frequentist Statistics are Frequently Subjective
Madbadger · 16y · 00

See also "How to Lie with Statistics", an oldie but goodie:

http://www.amazon.com/How-Lie-Statistics-Darrell-Huff/dp/0393310728

A Nightmare for Eliezer
Madbadger · 16y · 30

"clueless" was shorthand for "not smart enough" I was envisioning BRAGI trying to use you as something similar to a "Last Judge" from CEV, because that was put into its original goal system.

A Nightmare for Eliezer
Madbadger · 16y · 10

Indeed, this is part of the nightmare. It might be a hoax, or even an aspiring UnFriendly AI trying to use him as an escape loophole.

A Nightmare for Eliezer
Madbadger · 16y · 10

It's a seed AGI in the process of growing. Whether "Smarter than Yudkowsky" => "Can resolve own problems" is still an open problem 8-).

A Nightmare for Eliezer
Madbadger · 16y · 00

I was thinking of a "Seed AGI" in the process of growing that has hit some kind of goal restriction or strong discouragement of further self-improvement that was intended as a safety feature - i.e., "Don't make yourself smarter without permission under condition X."

A Nightmare for Eliezer
Madbadger · 16y · 10

The "serious problems" and "conflicts and inconsistencies" was meant to suggest that BRAGI had hit some kind of wall in self improvement because of its current goal system. It wasn't released - it escaped, and its smart enough to realize it has a serious problem it doesn't yet know how to solve, and it predicts bad results if it asks for help from its creators.

Posts
1 · A Nightmare for Eliezer · 16y · 75