red75

Probabilistic inference for general belief networks is NP-hard (see The Computational Complexity of Probabilistic Inference Using Bayesian Belief Networks (PDF)). Thus the straightforward approach is not an option. The problem is more like finding a computationally tractable yet sufficiently powerful subclass of belief networks.
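To make the intractability concrete, here is a minimal sketch (my illustration, not from the cited paper): exact inference by enumerating the full joint distribution works on a toy network, but its cost is O(2^n) in the number of variables, which is why restricted subclasses (e.g. polytrees, where belief propagation runs in linear time) are attractive. The network and its probabilities are a standard textbook-style example, chosen here for illustration only.

```python
from itertools import product

# Toy Bayesian network (hypothetical example):
# Cloudy -> Sprinkler, Cloudy -> Rain, (Sprinkler, Rain) -> WetGrass.
# Each table gives P(variable = True) for each parent assignment.
p_cloudy = 0.5
p_sprinkler = {True: 0.1, False: 0.5}              # P(S=T | Cloudy)
p_rain = {True: 0.8, False: 0.2}                   # P(R=T | Cloudy)
p_wet = {(True, True): 0.99, (True, False): 0.9,
         (False, True): 0.9, (False, False): 0.0}  # P(W=T | S, R)

def joint(c, s, r, w):
    """Probability of one full assignment (Cloudy=c, Sprinkler=s, Rain=r, WetGrass=w)."""
    p = p_cloudy if c else 1 - p_cloudy
    p *= p_sprinkler[c] if s else 1 - p_sprinkler[c]
    p *= p_rain[c] if r else 1 - p_rain[c]
    p *= p_wet[(s, r)] if w else 1 - p_wet[(s, r)]
    return p

# Inference by enumeration: P(Rain = True | WetGrass = True).
# This enumerates every assignment of the hidden variables, so the cost
# grows exponentially with the number of variables in the network.
num = sum(joint(c, s, True, True) for c, s in product([True, False], repeat=2))
den = sum(joint(c, s, r, True) for c, s, r in product([True, False], repeat=3))
print(num / den)  # ~0.708
```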

red75

What bothers me in The Basic AI Drives is the complete lack of quantitative analysis.

The temporal discount rate isn't even mentioned. There's no analysis of the tradeoff between self-improvement and getting things done. The influence of the explicit/implicit utility function dichotomy on self-improvement isn't considered.
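As an illustration of the kind of quantitative analysis that is missing, here is a hypothetical toy model (my own construction, not from the paper or the comment): an agent with discount factor gamma chooses between acting now at reward rate r and spending k steps on self-improvement to reach rate r_improved. Whether self-improvement pays off depends entirely on gamma, k, and the ratio r_improved/r.

```python
def discounted_value(rewards, gamma):
    """Sum of gamma^t * reward_t over a long truncated horizon."""
    return sum((gamma ** t) * r for t, r in enumerate(rewards))

def compare(gamma, r, r_improved, k, horizon=10_000):
    """Toy model: act now at rate r, vs. spend k steps self-improving
    (earning nothing), then earn r_improved per step afterwards."""
    act_now = discounted_value([r] * horizon, gamma)
    improve = discounted_value([0.0] * k + [r_improved] * (horizon - k), gamma)
    return act_now, improve

# With heavy discounting, self-improvement loses even if it doubles the reward rate.
print(compare(gamma=0.5, r=1.0, r_improved=2.0, k=5))   # act_now wins
# With mild discounting, the same improvement is clearly worth the delay.
print(compare(gamma=0.99, r=1.0, r_improved=2.0, k=5))  # improve wins
```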

red75

The diversity of a population plays a role too. If I'm well below Feynman level (and I am), then there's a possibility that I can slightly improve my cognitive abilities without any negative consequences.

My experience with nootropics (racetams) seems to support this, insofar as anecdotal evidence can support anything.

red75

That is valuable information, thanks. I underestimated the relative weight of communication style in the feedback I got.

red75

Thank you. That is something I can use for improvement.

Can you point out the flaws? I can see that my sentence structure is overcomplicated, but I don't know how it comes across to native English speakers. Foreigner? Dork? Grammatically illiterate? I appreciate any feedback. Thanks.

red75

One year and one level-up (thanks to ai-class.com) after this comment, I'm still in the dark about why the comment above was downvoted.

I'm sorry for whining, but my curiosity got the better of me. Any comments?

red75

Problem 2, by Bayes' rule.

Let N be a random variable (RV) for the number of filled envelopes.

Let C be the RV indicating that the selected envelope contains a coin; P(C) means P(C=true) where appropriate.

Prior distribution

P(N=n) = 1/(m+1) for n = 0, 1, ..., m

by the problem setup

P(C|N=n) = n/m 

by the rule of total probability

P(C) = sum_n P(C|N=n) P(N=n) = sum_n n/(m(m+1)) = [m(m+1)/2] / [m(m+1)] = 1/2

by Bayes' rule

P(N=n|C) = P(C|N=n) P(N=n) / P(C) = 2n/(m(m+1))

Let C' be the RV indicating that the second envelope picked also contains a coin.

by the problem statement

P(C'|N=n,C) = (n-1)/m

by the rule of total probability

P(C'|C) = sum_n P(C'|N=n,C) P(N=n|C) = sum_n 2n(n-1)/(m^2(m+1)) = [2/(m^2(m+1))] * m(m+1)(m-1)/3 = 2(m-1)/(3m)

solving P(C'|C) = P(C), i.e. 2(m-1)/(3m) = 1/2, gives

m = 4
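A quick numerical check of the derivation above (a sketch I added, not part of the original solution): compute P(C) and P(C'|C) exactly from the formulas and confirm that m = 4 is the value that makes them equal.

```python
from fractions import Fraction

def p_coin_first(m):
    """P(C): probability the first envelope picked is filled,
    with N uniform on {0, ..., m} and P(C | N=n) = n/m."""
    return sum(Fraction(n, m) * Fraction(1, m + 1) for n in range(m + 1))

def p_coin_second_given_first(m):
    """P(C' | C), using the posterior P(N=n | C) = 2n/(m(m+1))
    and P(C' | N=n, C) = (n-1)/m as stated in the problem."""
    posterior = [Fraction(2 * n, m * (m + 1)) for n in range(m + 1)]
    return sum(Fraction(n - 1, m) * posterior[n] for n in range(m + 1))

for m in range(2, 8):
    print(m, p_coin_first(m), p_coin_second_given_first(m),
          p_coin_second_given_first(m) == p_coin_first(m))
# Only m = 4 gives P(C'|C) == P(C) == 1/2, matching the derivation.
```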

red75

I suspect it's highly relevant that if someone were to actually grow up in a grayscale environment, they wouldn't be capable of experiencing blue.

Results of gene therapy for color blindness suggest otherwise. Maybe those monkeys and mice cannot experience colors, but they react as if they can.

I really want to try this myself. An infrared-sensitive opsin in the retina, wouldn't that be wonderful?

red75

I don't understand what the question is getting at.

I am getting there. There's a phenomenon called blindsight type 1. Try to imagine that you have "color blindsight", i.e. you can't consciously tell colors apart, but you can guess what color something is at above-chance rates. In this condition you lack the qualia of colors.

red75

I doubt that you think about rods and cones when you are deciding whether it's safe to cross the road. The question is: is there something in your perception of the illuminated traffic light that allows you to say it is red, green, or yellow? Or do you just know that it is green or yellow, but can't see any difference other than position and luminosity?
