polymer · 10y · 20

Perhaps I should've said, hard in the wrong ways. The long-term goal for a good professional programmer seems to be understanding what the client wants. Some math is needed to understand the tools, so you can give some context for the options. But I spend most of my creative energy making sure my programs do what I want them to do, and that is really hard when each language has its own prejudice motivating its design.

I seriously considered looking into real-time, high-risk software applications. But I decided that instead of learning new languages until I ran out of youth, it'd be more fun to learn general relativity, or even measure theory. The ideas in those subjects will probably hold out a lot longer than Python.

polymer · 10y · 20

So, my point regarding the speed.

In the middle of working out a problem, I had to find the limit of

S = 1/e + 2/e^2 + ... + n/e^n + ...

I had never seen this sum before, so now cleverness is required. If I assumed guess C was true, that would imply

e/(e - 1) = (e - 1)S

This claim is much easier to check,

(e - 1)S = 1 + 1/e + 1/e^2 + ... = 1/(1 - 1/e) = e/(e - 1)

We know what S is, and the solution to the problem follows. In retrospect, I understand one method by which I could have found the answer. But during the test, I can't see through the noise fast enough (although I can smell the clue). I could go through each guess one by one, but I'm just too slow. Maybe there's something else I'm missing that would've made the guess simpler, but that's what I'm basing the "slow" opinion on.
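If it helps to see it written out, the same shift trick evaluates the sum directly (this is just the check above, restated as a derivation):

```latex
% Direct evaluation of S = \sum_{n \ge 1} n e^{-n} via the shift trick
\begin{align*}
(e-1)S &= eS - S
        = \sum_{n \ge 1} n\,e^{-(n-1)} - \sum_{n \ge 1} n\,e^{-n} \\
       &= \sum_{m \ge 0} (m+1)\,e^{-m} - \sum_{m \ge 1} m\,e^{-m}
        = \sum_{m \ge 0} e^{-m}
        = \frac{1}{1 - 1/e}
        = \frac{e}{e-1},
\end{align*}
\text{so } S = \frac{e}{(e-1)^2}.
```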

I don't know if being slow at inference in this sense is a barrier to, or indicative of, deeper creativity issues (or if I'm just suffering from the availability heuristic).

Anyway, your questions are all very good. I don't care for academia per se; I care about the questions. If I don't keep doing academic work, I hope I'll have formed enough connections to find some route toward practical problems that still require some creativity.

Your last question is very interesting. I'm not sure how to answer it. My unhealthy worry, I think, is that I really don't like wasting people's time. I suppose I don't mind being "just OK", if "just OK" isn't wasting people's time and I still get to be creative.

I guess I don't want to be a pundit? I mean, I'll teach, but I'd be much happier doing something theoretical. If this is impossible for me, I'd like to know the reasons why, and fail out as soon as possible.

Your questions are very interesting, though; I still need to think about them more. Thank you for your thoughts. They give very good context for thinking about this, and it's clear you've worried about analogous issues.

polymer · 10y · 20

I'm not sure. I'm trying to work toward a career path that uses as much of my ability as possible. The most important job for a professional programmer is understanding what the client wants. This is a fine job, but being good at algorithms isn't necessarily a requirement.

When talking to an engineer at Google, I asked what he thought was a good career choice for working on hard problems. His immediate first thought was graduate school; then he sort of mentioned robotics.

My ideal dream isn't being a professor; it's working on something that needs inference and uses my mathematical abilities. So I'm leaning toward research, but that's the implication, not necessarily the goal.

Teaching isn't the goal; hands-on altruism isn't the goal. Fitting into a place where I'm using as much of my skill set as possible is the goal.

And that is a terminal goal; I can do boring stuff in the meantime. My point in jumping out of programming was exactly that the math wasn't the important part; the picture was. The math is important to someone else. I'd like to be that someone else.

I try to explain this to people, though, and almost all of them think I'm being way too vague (or they don't understand). You go to school because that's the only way you're going to study the distribution of zeroes for the Wronskian of orthogonal polynomials. I've had maybe one professor discourage me from being too picky...

polymer · 10y · 60

I agree that I have a wealth of information to work with right now. I'm just trying to balance it honestly (LW felt like it fit the theme somewhat).

On the one hand, both of those scores were from my first attempt, and they were taken cold. And I could argue that I thought a lot of homework in school was unimportant and unnecessary (because of a poor philosophical attitude).

But of the 26 questions I got wrong or left incomplete on the practice Math subject test, for roughly 16 of them I had sufficient knowledge; I simply wasn't fast enough. And the Algebra class was really hard, and I did do the homework eventually.

It's not like I haven't been very successful in some courses; graduate Complex Analysis and Stochastic Processes come to mind. And the admissions director at my undergrad (University of Oregon) has told me directly that I am ready for graduate school, but that he would prefer I went to a better school.

I'm just lost. It seems in this context, failure speaks louder than success. Even if I am smart enough, perhaps I simply haven't worked hard enough (or on the right things). The practical consequence would be the same. I wish I knew what the admissions officer saw; it's hard to suppress the feeling that he's only saying that because I did well in his courses.

polymer · 10y · 00

I'm not quite sure what the following means:

"'if you add details to a story, it becomes less plausible' is a false statement coming from human interaction."

I don't care whether it's false as a "human interaction". I care whether the idea can be modeled by probabilities.

Is my usage of the word plausible in this way really that confusing? I'd like to know why... Probable, likely, credible, and plausible are all (rough) synonyms to me.

polymer · 10y · 00

So plausibility isn't the only dimension for assessing how "good" a belief is.

A or not A is a certainty. I'm trying to formally understand why that statement tells me nothing about anything.

The motivating practical problem came from this question,

"guess the rule governing the following sequence" 11, 31, 41, 61, 71, 101, 131, ...

I cried, "Ah the sequence is increasing!" With pride I looked into the back of the book and found the answer "primes ending in 1".

I'm trying to zone in on what I did wrong.

If I had said instead, "the sequence is a list of numbers", that would be stupider, but well in line with my previous logic.

My first attempt at explaining my mistake was to argue that "it's an increasing sequence" was actually less plausible than the real answer, since the real answer was making a much riskier claim. I think one can argue this without contradiction (the rule is either vague or specific, not both).

However, it's often easy to show whether some infinite product is analytic. Making the jump that the product evaluates to sin, in particular, requires more evidence. But in some qualitative sense, establishing that latter goal is much better. My guess was that establishing the equivalence is a more specific claim, making it more valuable.

In my attempt to formalize this, I tried to show this was represented by the probabilities. This is clearly false.
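One way the probabilities do represent it is through likelihoods rather than priors. Here is a toy sketch of the sequence puzzle above; the priors and the cutoff of 140 for the hypothesis space are invented purely for illustration:

```python
import math

# Toy Bayesian comparison for the "guess the rule" puzzle.
# The priors and the 1..140 hypothesis space are assumptions for illustration.

data = [11, 31, 41, 61, 71, 101, 131]

# Vague rule: "it's some increasing sequence". It spreads its probability
# over every increasing 7-term sequence drawn from 1..140, so it assigns
# the observed data only a tiny likelihood.
lik_vague = 1 / math.comb(140, 7)

# Specific rule: "primes ending in 1, listed in order". It predicts
# exactly these seven numbers, so it assigns the data likelihood 1.
lik_specific = 1.0

# Even if the vague rule starts out vastly more probable...
prior_vague, prior_specific = 0.999, 0.001

# ...the posterior odds favor the specific rule by orders of magnitude.
posterior_odds = (prior_specific * lik_specific) / (prior_vague * lik_vague)
print(posterior_odds > 1e6)  # True
```

On this picture, "it's an increasing sequence" isn't wrong, just barely informative; the riskier rule pays for its small prior with an enormous likelihood ratio once the data come in.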

What should I read to understand this problem more formally, or more precisely? Should I look up formal definitions of evidence?

polymer · 10y · 30

Can someone link to a discussion, or answer a small misconception for me?

We know P(A & B) ≤ P(A). So if you add details to a story, it becomes less plausible, even though people are more likely to believe it.

However, if I do an experiment and measure something which is implied by A&B, then I would think "A&B becomes more plausible than A", because A is more vague than A&B.

But this seems to be a contradiction.

I suppose, to me, adding more details to a story makes the story more plausible if those details imply the evidence. For example, sin(x) is an analytic function. If I know a complex differentiable function has roots at all multiples of pi, saying the function is sin is more plausible than saying it's some analytic function.

I think I'm screwing up the semantics, since sin is itself an analytic function. But that seems to me to be missing the point.

I read A Technical Explanation of Technical Explanation, so I know specific theories are better than vague theories (provided the evidence is specific). I guess I'm asking for clarification on how this is formally consistent with P(A) ≥ P(A&B).
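One way to square the two (a sketch with made-up numbers): P(A) ≥ P(A&B) survives any update, but evidence implied by A&B shifts probability mass within A, so the specific hypothesis can overtake the vague remainder A&¬B:

```python
# Toy update: all priors and likelihoods are invented for illustration.
# A&B  = "the function is analytic AND it is sin"
# A&~B = "the function is analytic but something other than sin"
prior = {"A&B": 0.1, "A&~B": 0.4, "~A": 0.5}

# Evidence E (roots at every multiple of pi) is implied by A&B,
# and merely possible under the alternatives.
lik = {"A&B": 1.0, "A&~B": 0.2, "~A": 0.05}

p_E = sum(prior[h] * lik[h] for h in prior)
post = {h: prior[h] * lik[h] / p_E for h in prior}

# P(A|E) >= P(A&B|E) still holds, since A = (A&B) or (A&~B)...
assert post["A&B"] + post["A&~B"] >= post["A&B"]
# ...but the specific hypothesis now beats the vague remainder:
assert post["A&B"] > post["A&~B"]
```

So "A&B became more plausible than A" is never literally true; what happens is that A&B becomes more plausible than the competing ways A could hold.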

polymer · 10y · 50

I disagree. I read the Feynman lectures in high school and learned a great deal. His presentation taught me more about how to think about these things than Giancoli did.

Giancoli better prepared me for the standard format of test questions, but it didn't really articulate how I was supposed to use the ideas to generate new ones. Feynman's style of connecting claims with whatever you happened to know is extremely important. Giancoli doesn't demonstrate this style quite as well.

Of course it was my first textbook, so I could go on and on about why I like it...