Climbing out of the Ooze, or is that GNUze?

I've written about Microsoft heading for financial trouble and its mounting internet security problems - but both have an underlying cause.

Both ignore the hard-won Unix principle: Less is More.
[the joke behind the pager, 'less']

This isn't just some arbitrary style choice, it's rooted deep in Computer Science and Software Engineering:
  • smaller, cleaner interfaces - far fewer bugs, since defect counts tend to grow faster than linearly with code size (roughly errors ∝ LOC²)
  • smaller code - easier to understand
  • the Einstein principle: as simple as possible, but no simpler
  • controlling system entropy - less code means less entropy to grow
  • a lower learning curve for new maintainers - the "cognitive span" of individuals isn't exceeded
  • a lower learning curve for programmers generally, and less risk of errors of misunderstanding
More is Worse in Software Engineering - in every technical dimension. Microsoft has gone on to demonstrate that it is also possible to have an excess of 'customer features'.

But this isn't Microsoft-bashing - they are simply the definitive example of poor Software Development choices. Why else would they have thrown away ~25,000 man-years of effort in the Windows Longhorn 'reset'? That pleased neither customers nor shareholders.

GNU, the self-styled leaders of Free Software (vs Open Source), subscribe to the Microsoft model of More is Better & Excessive is Best - sometimes called bloatware.

Proof is in the Software... Huge, slow, complex.

Why this Matters

There's a simple statement I keep hearing from long-term PC users:
"My old 386/486 used to boot and run a word-processor faster than the machine I have to use at work these days."
Not just a little faster, but enough for ordinary users to feel pain.

If base hardware has increased in speed and size by a factor of 25-100 in the last 15 years, why don't the same basic tasks run like the wind? In other words, when you do an apples-and-apples comparison, why are current systems slower? Hal Licino did exactly that, pitting a 1986 Mac Plus against a 2007 AMD dual-core: for the most-used functions, the Mac Plus beat the AMD Athlon 64 X2 4800+, 9 tests to 8!

This isn't just whining or mis-remembering the Good Ol' Days - it's basic Physics and Computer Science.

Power Matters

Looking at Yawarra low-power systems and remembering the US$200 Everex gPC (first released Xmas 2007) put me onto this thought-train.

The fanless AMD Geode processors at 0.5 GHz consume 5 W (Yawarra).
I'm not sure what the fan-cooled 1.5 GHz VIA processor/board in the Everex consumes - but likely 10-20 W.

Intel 3 GHz processors consume 50-60 W (TDP), IIRC.

There's a simple figure of merit - MIPS/Watt. It seems forgotten by system builders and coders alike...
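Running the numbers above through that figure of merit makes the point. The MIPS figures here are rough assumptions (about 1 MIPS per MHz, a crude proxy) - the interest is the order-of-magnitude gap, not precision:

```python
# Back-of-the-envelope MIPS/Watt for the systems mentioned above.
# MIPS values assume roughly 1 MIPS per MHz - a crude proxy, not a
# measured benchmark; wattages are the estimates from the text.

systems = {
    # name: (assumed MIPS, assumed power in watts)
    "AMD Geode @ 0.5 GHz": (500, 5),    # fanless, per Yawarra
    "VIA @ 1.5 GHz":       (1500, 15),  # guessed 10-20 W, take 15
    "Intel @ 3 GHz":       (3000, 55),  # 50-60 W TDP, take 55
}

for name, (mips, watts) in systems.items():
    print(f"{name}: {mips / watts:.0f} MIPS/Watt")
```

On these rough numbers the 5 W fanless box delivers nearly twice the MIPS/Watt of the 3 GHz desktop part - the "slow" chip is the efficient one.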

The More is Better philosophy has relied on the increasing power of common user hardware to fuel its excesses.

Herb Sutter's seminal 2005 piece in DDJ, "The Free Lunch is Over", shows a number of things - not least that the assumption of free performance increases fueled by Moore's Law ended in 2002.

There's no reason we shouldn't have fast, functional systems running on 500 MHz (0.5 GHz) CPUs. That's so much more power than was available just 15 years ago, it's not funny. Even just before 2000, such systems would've been Y2K-capable and competitive.

As a collective, programmers and Software Engineers need to climb out of the Complexity Ooze promoted and encouraged by Microsoft and GNU. If not, they will find themselves bypassed.

There's no reason most users (email, web-surfing, word-processing) can't get by with 5 W desktops or laptops.

The iPhone and Android already point the way.
Then those 3 GHz fire-breathers might actually be used for tasks needing real speed - such as games.

If the PC systems manufacturers, O/S constructors and application developers don't embrace this change, they will simply become irrelevant as ultra-low-end systems and small appliances replace the behemoths...

Read the Three Dimensions of the changing PC market I've already outlined.
