
knb comments on Open thread for December 9 - 16, 2013 - Less Wrong Discussion

5 Post author: NancyLebovitz 09 December 2013 04:35PM


Comments (371)


Comment author: knb 09 December 2013 11:06:07PM 5 points

Wirth's Law:

Wirth's law is a computing adage made popular by Niklaus Wirth in 1995. It states that "software is getting slower more rapidly than hardware becomes faster."

Is Wirth's Law still in effect? Most of the examples I've read about are several years old.

ETA: I find it interesting that Wirth's Law was apparently a thing for decades (known since the 1980s, supposedly) but seems to be over. I'm no expert though, I just wonder what changed.

Comment author: passive_fist 10 December 2013 01:02:26AM 10 points

It was my impression that Wirth's law was mostly intended to be tongue-in-cheek, referring to how programs with user interfaces keep getting bloated (which may be true, depending on your point of view).

In terms of software that actually needs speed (numerical simulations, scientific and technical software, games, etc.), the reverse has always been true. New algorithms are usually faster than old ones. A case in point is the trusty old BLAS library, the workhorse of scientific computing: modern BLAS implementations are heavily optimized for current hardware, far more so than older implementations.
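As a rough illustration of the gap (a sketch, assuming NumPy is installed and linked against a BLAS, which is the usual default), compare a textbook triple-loop matrix multiply against the BLAS-backed `@` operator:

```python
import time
import numpy as np

def naive_matmul(a, b):
    """Textbook O(n^3) triple-loop multiply, with poor cache behavior."""
    n, m = a.shape
    _, p = b.shape
    out = np.zeros((n, p))
    for i in range(n):
        for j in range(p):
            s = 0.0
            for k in range(m):
                s += a[i, k] * b[k, j]
            out[i, j] = s
    return out

rng = np.random.default_rng(0)
a = rng.random((100, 100))
b = rng.random((100, 100))

t0 = time.perf_counter()
c_naive = naive_matmul(a, b)
t_naive = time.perf_counter() - t0

t0 = time.perf_counter()
c_blas = a @ b          # dispatches to the linked BLAS routine (dgemm)
t_blas = time.perf_counter() - t0

assert np.allclose(c_naive, c_blas)
print(f"naive: {t_naive:.4f}s  BLAS-backed: {t_blas:.6f}s")
```

On a typical machine the BLAS call wins by several orders of magnitude, mostly from cache blocking and vectorization rather than a better asymptotic algorithm.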

Comment author: Manfred 09 December 2013 11:29:17PM 3 points

It wasn't even true in 1995, I don't think. The first way of evaluating it that comes to mind is the startup times of "equivalent" programs, like MS Windows, Macintosh OS, various Corels, etc.

Comment author: fubarobfusco 10 December 2013 12:52:41AM 4 points

Startup times for desktop operating systems seem to have trended up, then down, between the '80s and today, with the worst performance in the late '90s to 2000 or so, when rebooting any of the major systems could be a several-minutes affair. Today, typical boot times for Mac, Windows, or GNU/Linux systems can be a handful of seconds if no boot-time repairs (that's "fsck" to us Unix nerds) are required.

I know that a few years back there was a big effort in the Linux space to improve startup times, in particular by switching from serial startup routines (only one subsystem starting at a time) to parallel ones, where multiple independent subsystems can start at the same time. I expect the same was true on the other major systems as well.
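The payoff of that switch is easy to see in a toy model (a sketch with made-up subsystem names and init times, using sleeps as stand-ins for real initialization work): serial startup costs the sum of all init times, while parallel startup costs roughly the slowest single subsystem.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical subsystems and per-subsystem init times, in seconds.
SUBSYSTEMS = {"network": 0.2, "sound": 0.1, "display": 0.3, "printing": 0.1}

def start(name, seconds):
    time.sleep(seconds)   # stand-in for real initialization work
    return name

# Serial startup: total time is the sum of all init times.
t0 = time.perf_counter()
for name, s in SUBSYSTEMS.items():
    start(name, s)
serial = time.perf_counter() - t0

# Parallel startup: independent subsystems initialize concurrently,
# so total time approaches the slowest single subsystem.
t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=len(SUBSYSTEMS)) as pool:
    list(pool.map(lambda kv: start(*kv), SUBSYSTEMS.items()))
parallel = time.perf_counter() - t0

print(f"serial: {serial:.2f}s  parallel: {parallel:.2f}s")
```

With these numbers, serial takes about 0.7s against roughly 0.3s for parallel; real init systems additionally have to respect dependencies between subsystems, so the win in practice is smaller but still substantial.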

Comment author: knb 10 December 2013 05:20:24AM 2 points

My experience is that boot time was worst in Windows Vista (released 2007) and improved a great deal in Windows 7 and 8. MS Office was probably at its worst in bloatiness in the 2007 edition as well.

Comment author: mwengler 10 December 2013 04:49:08PM 0 points

It would be interesting to plot the time sequence of major chip upgrades from Intel on the same page as the time sequence of major upgrades of MS Word and/or MS Excel. My vague sense is that the early-to-mid '90s had Word releases I avoided for a year or two, until faster machines came along that made them more usable from my point of view. But the rate of new Word releases seems to have come way down compared to the rate of new chip releases. That is, perhaps hardware is creeping up faster than features in the current epoch?
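A quick back-of-the-envelope version of that comparison (a sketch; the release years below are approximate, from memory, and should be checked before drawing firm conclusions) just counts releases per year before and after 2000:

```python
# Approximate release years -- illustrative, not authoritative.
word_releases = [1989, 1991, 1993, 1995, 1997, 1999, 2001, 2003, 2007, 2010, 2013]
intel_chips   = [1989, 1993, 1995, 1997, 1999, 2000, 2006, 2008, 2011, 2012, 2013]

def cadence(years, start, end):
    """Releases per year within the half-open interval [start, end)."""
    return sum(start <= y < end for y in years) / (end - start)

for label, years in [("Word", word_releases), ("Intel", intel_chips)]:
    early = cadence(years, 1989, 2000)
    late = cadence(years, 2000, 2014)
    print(f"{label}: {early:.2f}/yr before 2000, {late:.2f}/yr after")
```

Even with rough data, Word's cadence drops noticeably after 2000 while Intel's stays closer to steady, which is at least consistent with the impression above.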

Comment author: Waffle_Iron 13 December 2013 05:59:46AM 1 point

This seems to be true for video game consoles. Possibly because good graphics make better ads than short loading times.

Comment author: mwengler 10 December 2013 04:44:01PM 0 points

I find it interesting that Wirth's Law was apparently a thing for decades (known since the 1980s, supposedly) but seems to be over. I'm no expert though, I just wonder what changed.

I think both software and hardware got further out on the learning curve, which means their real rates of innovative development have both slowed down, which means the performance of software has sped up.

I don't get how I get to the last part of that sentence from the first part either, but it almost makes sense.

Comment author: Tenoke 10 December 2013 10:06:26AM -1 points

I mean, this formulation is wrong (software isn't getting slower), except under the tongue-in-cheek original interpretation, I guess. On the other hand, software is getting faster at a slower rate than hardware is, and that is still an important observation.