[Techtalk] Philosophical question: CPU/memory/disk cheaper than efficiency?

Wim De Smet kromagg at gmail.com
Tue Apr 10 19:28:12 UTC 2007


Good post, couldn't agree with you more. On the first point...

On 4/10/07, Akkana Peck <akkana at shallowsky.com> wrote:
> Wim De Smet writes:
> > Eric S. Raymond claims in his book The Art of UNIX Programming[1] that
> > you should only consider recoding the program if you think you'll be
> > able to make it an order of magnitude faster, i.e. 10, 100, or 1000
> > times faster, depending on what you change.
>
> I haven't read ESR's book, but I suspect he's talking about
> refactoring (Joel Spolsky of joelonsoftware.com has also written
> against that) or complete rewrites from scratch.
>
> I doubt ESR or Joel or anyone else would argue against
> performance bugfixes to make a program more efficient.
> Sometimes a program that takes over a machine and "slows other
> processes to a crawl", as the original poster described, does
> that because it has a memory leak or some other inefficiency,
> a bug which could be easily found and fixed. That doesn't involve
> anywhere near the programmer effort that a rewrite implies.
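
That kind of fix can be tiny. A hypothetical Python sketch (the
unbounded cache and expensive_computation() are made up, just to show
the shape of the bug):

    cache = {}
    MAX_ENTRIES = 10000

    def lookup(key):
        """Memoize expensive_computation(), but bound the cache so
        the process can't eat all the machine's memory over time."""
        if key not in cache:
            if len(cache) >= MAX_ENTRIES:
                cache.clear()   # crude eviction, but keeps memory flat
            cache[key] = expensive_computation(key)
        return cache[key]

Without the two eviction lines the dict grows forever -- exactly the
sort of "leak" that slows everything else to a crawl, and that one
small patch cures.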

Yeah, I was thinking more about picking another algorithm, or
rethinking the architecture, that sort of thing. If there is a nasty
bug somewhere you _should_ always fix it. That being said, I think
there are plenty of companies that sometimes decide to leave certain
bugs in because it's just cheaper that way (for them). That leads to
the kind of applications you hear stories about, like Windows server
apps that require you to reboot the server every 24 hours (and run
nothing on it except that app). Obviously this may be good for the
company's bottom line in the short term, but it can really hurt the
company's reputation in the long term.
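
For a made-up example of the algorithm-swap case, in Python: the
first version below is quadratic, because "x in ys" scans the whole
list on every test; building a set once makes each lookup roughly
constant time, which for large inputs is easily the order-of-magnitude
win ESR is talking about.

    # O(len(xs) * len(ys)): each membership test walks the list.
    def common_items(xs, ys):
        return [x for x in xs if x in ys]

    # Roughly O(len(xs) + len(ys)): build the set once, then each
    # membership test is a constant-time hash lookup.
    def common_items_fast(xs, ys):
        yset = set(ys)
        return [x for x in xs if x in yset]

Same observable behaviour, no rewrite, just a better data structure
for the job.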

greets,
Wim

