DSimon comments on The Power of Noise - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I think an example of what jsteinhardt is referring to would be quicksort. It accepts an arbitrary list as input, but on perversely ordered inputs (e.g. an already-sorted list with a naive pivot choice) it takes Theta(n^2) time. However, it does have an efficient average-case complexity of O(n log n): if the input is sampled from the uniform distribution over permutations, the algorithm finishes in O(n log n) time in expectation (though any individual run can still be quadratic).
Many of the other examples that were considered are similar, in that the algorithm doesn't raise an error when given an input outside the expected distribution, but rather silently degrades in performance.
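A minimal sketch of this point (my own illustration, not from the original post): a deterministic quicksort that always picks the first element as pivot makes on the order of n log n comparisons on a random permutation, but exactly n(n-1)/2 comparisons on an already-sorted input, and it never signals an error either way; it just silently works less effectively.

```python
import random


def quicksort_comparisons(items):
    """Sort a copy of items with first-element-pivot quicksort;
    return the number of comparisons made. Iterative, to avoid
    hitting the recursion limit on sorted (worst-case) inputs."""
    a = list(items)
    comparisons = 0
    stack = [(0, len(a) - 1)]
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        # Lomuto-style partition around the first element.
        pivot = a[lo]
        i = lo
        for j in range(lo + 1, hi + 1):
            comparisons += 1
            if a[j] < pivot:
                i += 1
                a[i], a[j] = a[j], a[i]
        a[lo], a[i] = a[i], a[lo]
        stack.append((lo, i - 1))
        stack.append((i + 1, hi))
    assert a == sorted(items)  # the output is correct in both cases
    return comparisons


n = 1000
random_input = random.sample(range(n), n)
sorted_input = list(range(n))
print(quicksort_comparisons(random_input))  # roughly n log2 n, ~10^4
print(quicksort_comparisons(sorted_input))  # exactly n(n-1)/2 = 499500
```

The sorted input is handled without complaint; the "requirement" of a uniform input distribution shows up only in the running time.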
Quicksort as a piece of software does not require any particular distribution. It is perfectly happy to work with "perversely ordered inputs".
A human running quicksort with certain expectations about its performance might require a particular distribution, but that expectation belongs to the human, not to the software.
Recall that the original claim was that expecting a particular distribution breaks modularity.
I think this may be a distinction without a difference; modularity can also be defined in terms of human expectations about software, namely the expectation that the software will be relatively easy to hook into a larger system.
I don't find this a useful definition, but YMMV, de gustibus, and all that...