Captain_Obvious comments on Beautiful Math - Less Wrong

24 Post author: Eliezer_Yudkowsky 10 January 2008 10:43PM


Comment author: Captain_Obvious 10 January 2008 11:38:00PM 3 points [-]

Back in high school I discovered this by accident (yes, I was really bored!). I suppose it's nothing new, but it turns out that this works for more than simple squares and cubes:

Given any sequence of numbers, keep finding differences of differences until you hit a constant; the number of iterations needed is the maximum exponent in the formula that produced the numbers. That is, this works even if there are other terms, regardless of whether any or all terms have coefficients other than 1.
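[Editor's note: the procedure described above is easy to check numerically. Below is a minimal sketch, not from the original comment; the function name `degree_by_differences` is my own. It repeatedly takes differences of consecutive terms until the sequence is constant and counts the passes.]

```python
def differences(seq):
    """One pass of finite differences: [a1-a0, a2-a1, ...]."""
    return [b - a for a, b in zip(seq, seq[1:])]

def degree_by_differences(seq):
    """Count how many difference passes it takes to reach a constant sequence."""
    count = 0
    while len(set(seq)) > 1:
        seq = differences(seq)
        count += 1
        if len(seq) < 2:
            raise ValueError("ran out of terms before reaching a constant")
    return count

# Cubes with lower-order terms and coefficients other than 1: 2n^3 - 5n + 7
cubes_plus = [2*n**3 - 5*n + 7 for n in range(10)]
print(degree_by_differences(cubes_plus))  # 3
```

As the comment says, the extra terms and coefficients don't matter: the answer is still the maximum exponent, 3.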

Comment author: Technoguyrob 21 December 2011 09:37:56PM *  1 point [-]

This is obvious after you learn calculus. The "nth difference" corresponds to nth derivative (a sequence just looks at integer points of a real-valued function), so clearly a polynomial of degree n has constant nth derivative. It would be even more accurate to say that an nth antiderivative of a constant is precisely a degree n polynomial.

Comment author: [deleted] 21 December 2011 09:41:18PM 0 points [-]

Iterated finite differences correspond to derivatives in some non-obvious way I can't remember (and can't be bothered to find out).

Comment author: Vladimir_Nesov 21 December 2011 10:49:56PM 0 points [-]

Notice that the result doesn't hold if the points aren't evenly spaced, so the solution must use this fact.

Comment author: Sniffnoy 21 December 2011 11:12:57PM *  1 point [-]

Differences and derivatives are not the same, though there is the obvious analogy. If you want to take derivatives and antiderivatives, you want to write in the x^k basis or the x^k/k! basis. If you want to take differences and sums, you want to write in the falling factorial basis or the x choose k basis.
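[Editor's note: a small numeric check of the analogy above, my own illustration rather than Sniffnoy's. The forward difference acts on the falling factorial x^(k) = x(x-1)...(x-k+1) exactly the way d/dx acts on x^k, i.e. Δ x^(k) = k · x^(k-1).]

```python
def falling(x, k):
    """Falling factorial x^(k) = x*(x-1)*...*(x-k+1)."""
    result = 1
    for i in range(k):
        result *= (x - i)
    return result

k = 4
for x in range(10):
    delta = falling(x + 1, k) - falling(x, k)  # forward difference of x^(k)
    assert delta == k * falling(x, k - 1)      # the "power rule" in this basis
```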

Comment author: Technoguyrob 22 December 2011 01:35:44AM 1 point [-]

If you get a non-constant, yes. For a linear function, f(a+1) - f(a) = f'(a). Inductively you can then show that the nth one-step difference of a degree n polynomial f at a point a is f^(n)(a). But this only works for the nth difference; the kth difference for k < n doesn't equal f^(k)(a) in general. Thanks for pointing that out!
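[Editor's note: the point above can be illustrated with f(x) = x^2, my own example: the first one-step difference is 2a + 1, not the derivative f'(a) = 2a, but the second (top-order) difference does equal f''(a) = 2 everywhere.]

```python
def f(x):
    return x**2

a = 5
first_diff = f(a + 1) - f(a)                # 2a + 1 = 11, not f'(a) = 10
second_diff = f(a + 2) - 2*f(a + 1) + f(a)  # equals f''(a) = 2
print(first_diff, second_diff)  # 11 2
```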

Comment author: Sniffnoy 22 December 2011 02:06:16AM 0 points [-]

Ah, yes, that's a good point, because the leading coefficient is the same whether you use the x^k basis or the falling factorial basis.

Comment author: TRManderson 02 August 2013 12:36:44PM 0 points [-]

Neither finite differences nor calculus are new to me, but I didn't pick up the connection between the two until now, and it really is obvious.

This is why I love mathematics - there's always a trick hidden up the sleeve!

Comment author: dlthomas 21 December 2011 09:48:45PM 0 points [-]

Your procedure (though not necessarily your result) breaks for e^x.

Comment author: thomblake 21 December 2011 09:54:30PM 1 point [-]

Really for non-polynomials, and I think that was implied by the phrasing.

Comment author: dlthomas 21 December 2011 10:03:09PM 0 points [-]

I agree that it's implied by working out the logic and finding that it doesn't apply elsewhere. I disagree that it is implied by the phrasing.

Given any sequence of numbers

doesn't seem to restrict it, and though I suppose

the number of iterations needed is the maximum exponent in the formula that produced the numbers

implies that there is a "maximum exponent in the formula" and with slightly more reasoning (a number of iterations isn't going to be fractional) that it must be a formula with a whole number maximum exponent, I don't see anything that precludes, for instance, x^2 + x^(1/2), which would also never go constant.
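[Editor's note: the non-termination for non-polynomials is easy to see numerically. Substituting 2^n for e^x keeps the arithmetic exact; this is my own illustration, not from the thread. Each difference pass reproduces the same sequence (Δ2^n = 2^n), so it can never become constant.]

```python
seq = [2**n for n in range(8)]
for _ in range(5):
    seq = [b - a for a, b in zip(seq, seq[1:])]  # one difference pass
print(seq)  # [1, 2, 4] — still powers of 2, never constant
```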

Comment author: thomblake 22 December 2011 03:17:02PM *  0 points [-]

Sorry, I was using the weak "implies", and probably too much charity.

And I usually only look at this sort of thing in the context of algorithm analysis, so I'm used to thinking that x^2 is pretty much equal to 5x^2 + 2 log x + sqrt(x) + 37.

Comment author: AnthonyC 26 February 2012 08:17:25AM 0 points [-]

So did I! And in general the nth order finite differences of nth powers will be n factorial.
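[Editor's note: a quick check of the claim above that the nth-order finite differences of the nth powers come out to n! (with step size 1); the helper `nth_differences` is my own.]

```python
from math import factorial

def nth_differences(seq, n):
    """Apply n passes of forward differences to seq."""
    for _ in range(n):
        seq = [b - a for a, b in zip(seq, seq[1:])]
    return seq

for n in range(1, 6):
    powers = [k**n for k in range(n + 5)]
    assert set(nth_differences(powers, n)) == {factorial(n)}
```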