A nice little article on the fallacy of premature optimization
Randall Hyde writes this very interesting article where he expounds on some notions I've often discussed in my talks. I'd definitely say that our thoughts are one on the subject.
My favorite bit is this little portion:
'Note, however, that [Sir Tony] Hoare did not say, "Forget about small efficiencies all of the time." Instead, he said "about 97% of the time." This means that about 3% of the time we really should worry about small efficiencies. That may not sound like much, but consider that this is 1 line of source code out of every 33. How many programmers worry about the small efficiencies even this often?'
Do you think about small efficiencies about one line out of every 33?
The full version of Sir Tony Hoare's quote is:
"We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil."
Full disclosure: I don't just like Randall Hyde's article because he quoted me a few times, but that didn't hurt. *grin*
Comments
- Anonymous
July 27, 2006
It's not quite as catchy, but instead of "Premature optimization is the root of all evil", I prefer "Structure-obfuscating optimizations performed without profiling are the root of all evil."

- Anonymous
July 27, 2006
The comment has been removed

- Anonymous
July 28, 2006
Rico Mariani's blog has a link today to an article in ACM Ubiquity called The Fallacy of Premature Optimization....

- Anonymous
August 01, 2006
It has been a long time (2001? earlier?) since I have seen a casual application that was CPU limited. Most applications are limited by their I/O (either by their [bad] usage of I/O, or by bad code paths inside the operating system implementation). CPUs getting more powerful over time seems to be the cause.
I follow Gnome blogs a bit and have noticed that most performance trouble comes from very bad I/O usage (like reading the same file again and again!). The only exceptions seem to be the Pango library and maybe some parts of the Gdk library, which may be too CPU intensive.
Now, in the embedded world, the situation is completely different. CPU and RAM do not scale as fast. There is some very big sh*t in this space, like .NET for small devices (sic!), Windows CE/Mobile, and other phone software. You can also include the Nokia 770 in the list [I own one]. It always amazes me that with resources in the range of a 1995/1998 desktop computer, these kinds of devices can be slower and hungrier for resources than Windows 95 or MacOS 7 while doing less! I still wonder if this is due to the requirement of supporting modern usage like the WWW, blogs, or email, or due to sloppy programming. Maybe both? Somebody once told me that 'architects' and bad programmers escaped into the embedded market. What do you think about it?