
Request for Widely Applicable Quantitative Methods

Post author: atucker 20 February 2011 04:08AM

I'm going to be competing in the Moody's Mega Math Challenge, and I was wondering if there was anything in particular I should brush up on.

If you look at previous problems, you'll see that they're pretty varied. I want to know if there's any widely applicable math we could study (in a fairly short amount of time) to maximize the odds that we'll know something useful for the competition.

Our math backgrounds include:

  • Statistics (taught by a frequentist; mostly probability theory and the standard p-, z-, chi-squared, etc. tests)
  • Calculus (single variable and multivariable)
  • Linear Algebra
  • Numerical Methods
We're also pretty competent at programming in various languages, and at LaTeX.

Currently we're looking into Causality by Judea Pearl and linear programming. Should we look at these? Anything else we should know?
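To get a feel for what linear programming involves, here is a minimal sketch using SciPy's `linprog` on a made-up two-variable problem (the numbers are mine, purely for illustration, not from any competition problem):

```python
# Toy LP: maximize 3x + 2y  subject to  x + y <= 4,  x + 3y <= 6,  x, y >= 0.
from scipy.optimize import linprog

# linprog minimizes, so negate the objective coefficients to maximize.
c = [-3, -2]
A_ub = [[1, 1], [1, 3]]   # left-hand sides of the "<=" constraints
b_ub = [4, 6]             # right-hand sides
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])

print(res.x)     # optimal point (x, y) -> approximately (4, 0)
print(-res.fun)  # optimal objective value -> 12
```

The solver walks the vertices of the feasible polygon; here the binding constraint at the optimum is x + y = 4.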

Edit:
I suppose we could also use a genetic algorithm, but those don't seem particularly suited to the competition.


Comments (7)

Comment author: jsalvatier 20 February 2011 05:12:51PM 1 point

If you're not terribly concerned about speed, I would try to understand a bit of optimization in general instead of linear programming (which is a special case).

Comment author: atucker 21 February 2011 02:05:47AM 0 points

I should probably do that sometime in my life, if not for this.

Any suggestions for how? Would the Wikipedia page be enough?

Comment author: jsalvatier 21 February 2011 04:49:11AM 1 point

I'm not sure what the best way is; I do recommend playing around in Excel. It has some pretty decent optimization functionality built in (and it's not hard to use), and it's quite visual. The Wikipedia page is a good start; you probably just need to know how to use some of the tools and have some idea of how they work.

The two most traditional approaches to optimization are to approximate the function of interest locally as (1) a hyperplane or (2) a quadratic function.
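A quick illustration of those two local models, on a toy one-dimensional function of my own choosing: the linear (hyperplane) model leads to gradient descent, and the quadratic model leads to Newton's method.

```python
# Minimize f(x) = (x - 3)^2 + 1, whose minimizer is x = 3.

def f(x):
    return (x - 3) ** 2 + 1

def df(x):    # first derivative
    return 2 * (x - 3)

def d2f(x):   # second derivative (constant here)
    return 2.0

# 1) Linear local model -> gradient descent: step against the slope.
x = 0.0
for _ in range(100):
    x -= 0.1 * df(x)

# 2) Quadratic local model -> Newton's method: jump to the model's minimum.
y = 0.0
for _ in range(5):
    y -= df(y) / d2f(y)

print(x, y)  # both approach the minimizer 3; Newton gets there in one step
```

Because f itself is quadratic, Newton's method is exact after a single iteration, while gradient descent converges geometrically; on general functions both are only as good as the local model.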

Comment author: atucker 21 February 2011 05:14:56AM 0 points

Huh, interesting.

Thanks for the advice.

Comment author: Daniel_Burfoot 20 February 2011 02:44:31PM 1 point

Boosting methods, particularly AdaBoost, are very effective and easy to understand.
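For the curious, here is a compact from-scratch sketch of the AdaBoost loop, using one-dimensional threshold "stumps" on a made-up dataset; it is just meant to show the reweighting idea, not to be a production implementation.

```python
import math

# Toy 1-D data, not separable by a single threshold.
X = [0, 1, 2, 3, 4, 5, 6, 7]
y = [1, 1, 1, -1, -1, -1, 1, 1]

def stump_predict(x, thresh, sign):
    """A decision stump: predict `sign` below the threshold, -sign above."""
    return sign if x < thresh else -sign

def best_stump(weights):
    """Exhaustively pick the stump with lowest weighted error."""
    best = None
    for thresh in range(9):
        for sign in (1, -1):
            err = sum(w for x_i, y_i, w in zip(X, y, weights)
                      if stump_predict(x_i, thresh, sign) != y_i)
            if best is None or err < best[0]:
                best = (err, thresh, sign)
    return best

weights = [1 / len(X)] * len(X)
ensemble = []
for _ in range(5):
    err, thresh, sign = best_stump(weights)
    err = max(err, 1e-10)                      # guard against log(0)
    alpha = 0.5 * math.log((1 - err) / err)    # stump's vote weight
    ensemble.append((alpha, thresh, sign))
    # Reweight: boost the examples this stump got wrong.
    weights = [w * math.exp(-alpha * y_i * stump_predict(x_i, thresh, sign))
               for x_i, y_i, w in zip(X, y, weights)]
    total = sum(weights)
    weights = [w / total for w in weights]

def predict(x):
    """Weighted vote of all the stumps."""
    score = sum(a * stump_predict(x, t, s) for a, t, s in ensemble)
    return 1 if score >= 0 else -1

print([predict(x_i) for x_i in X])  # matches y on this toy set
```

Each round upweights the examples the previous stump misclassified, so later stumps are forced to focus on the hard cases; the final classifier is a weighted vote.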

Comment author: atucker 20 February 2011 04:06:33PM 1 point

Ooh, thanks. I'll look at this some more after I get back from robotics...

Comment author: nerzhin 21 February 2011 04:22:57PM 0 points

Looking at your list of backgrounds, the missing thing that jumps out at me is discrete math. You might also want to think about learning some differential equations, if it wasn't included in your calculus sequence.