2009/04/19

Death by Success

Being too successful leads to failure unless you are aware of the problem and carefully monitor and protect against it. Every large company faces this problem - in I.T., Google and Microsoft are examples.

2009/04/18

Microsoft Troubles - V

More on the theme of "Microsoft will experience a Financial Pothole" from a Financial perspective.

For less rigorous commentaries, there's the Motley Fool's piece "The Two Words Bill Gates Doesn't Want You to Hear" (A: Cloud Computing) [The article requires signing up, but can be found via Your Favourite Search Engine]
A commentary on that piece from Carbonite, a 'Cloud' vendor.

Victor Cook at Customers and Capital has a series of pieces on Microsoft and its fundamentals.
His series on 'Brands' compares GOOG & MSFT at times.
The first post in the 'Blue Ocean Strategy' series, the second, third and fourth posts.
His analyses of a number of issues are informed and enlightening - MSFT vs GOOG, the YHOO & DoubleClick acquisitions...

Cook points at a 19-Mar-2009 piece at 'The Wild Investor' called "The State of Microsoft" that starts:
Well we are not in the 1990’s anymore, and unless you plan to hold the stock for 50 years there is really no point to holding shares of Microsoft. Here is why…
and ends with:
The bottom line is that the luster behind Microsot [sic] is no longer there. Sure, there is upside to the stock, but how much. We are in a new time where the old blue chips are no longer blue. Stocks like Cisco (CSCO), IBM (IBM), and General Electric (GE) are no longer fun or smart companies to invest in.

2009/04/17

How "The Internet Changes Everything" for Journalists

Update 1: 26-May-2009
A very kind person @ ABC took the time to read this piece and to give me some valuable comments:
  • We are paid to make editorial decisions ... This is our job.
  • news is about immediate events and happenings and it is short, brief and factual.
  • 'News' is just one kind of ABC broadcast.
    Don't confuse it with our entire output.
  • As to your specific proposals...
    - SPAM is a "dropdead" problem (my words)
    - automatically generated responses are a waste of time... listener input should be read and acknowledged by a real person,
    - but the problem is that we are all under time pressure.

So where to from here?
It appears to be the end of the road for this approach.



How does "The Internet Changes Everything" apply to News and Journalism at the ABC - Australia's national broadcaster?

Contents: Background, How the Internet Changes Everything and Proposal

What are the sources of Good Stories?
Do Journalists have a monopoly on sources and Perfect Judgement on story 'size' & importance?
Obviously not.

But how do folk "out of the loop" gain access to the Gatekeepers of the public media?

Consider two cases and what, if anything, has changed now if they were to be repeated:
  • The "Erin Brokovich effect": through persistent talking & listening to ordinary householders, an apparently minor legal matter became massive. The romantic story of the film has the primary evidence, "the smoking gun", only arriving accidentally, and
  • "Dr. Death" allegations at Rockhampton hospital: a set of nurses at the hospital attempted, for years, to raise their concerns internally & externally without getting any 'traction'. Meanwhile, people were needlessly being harmed, there's no media coverage and hence no political interest in investigating the claims.

Background


The ABC News website is clear, well structured, informative and completely useless and frustrating for someone like me who's not sending Press Releases, submitting news tips/vision, already known to journalists or a 'recognised industry expert'.

I spent a week trying to contact any journalist inside the ABC who might be able to pick up a small but important question that goes to the heart of the National Broadband Network Fibre-to-the-Premises proposal of the Rudd Government: the published pricing ($5,500/house) is 2-5 times higher than both the 1995 Optus/Telstra cable TV roll-out to 2-3M houses and recent FTTP installations.

Something looks very wrong, and asking this and consequential questions of both Politicians and Telco experts/consultants would make good copy and put the ABC at the forefront of this News thread.

But I can't get through... E-mails, web-form emails and phone calls (with follow-ups) have drawn a blank. I'm not some major 'name' or consultancy, so my 30+ years' experience in the field and a bunch of good innovations just don't count... Which is what I have to presume, because I've heard nothing back. Not even an automatic response.

From my position outside, it appears that internally ABC News operates as a set of independent 'silos' - groups that are completely isolated from one another. If you get through to anyone, they might reject your story but offer no assistance on what to try next.

The next step is "go to the top". That would be the newly appointed "Head of News", who shows up in media releases, interviews and the Organisation Chart [PDF], but not yet on the 'Contacts' page - which curiously has only postal and telephone contact information. That implies there isn't an automatic system to update all relevant webpages. Writing to 'webmaster' should work - but from my experience, I'm disinclined to try.

If there are permanent electronic addresses to contact people in senior roles, they are not disclosed.

Ditto for any set of 'Editors', 'News Desks' or targeted 'Correspondents'.
Is it possible that internally ABC News is this chaotic & disorganised?
Is this "Wall of Granite" exposed to outsiders accidental or intentional?


"The Internet Changes Everything"


Radio is a highly personal medium: Philip Adams' insight is in speaking to 'the listener'. He knows he is having a personal conversation with individuals, not a group or an 'audience' as you find in theatres and sports grounds.

People listening to radio are more likely to want to continue the conversation on-line. This is facilitated on the ABC site by web-form email, forums and even a 'Complaints' facility.

The News site even has a 'Contribute' page:
If you witness a news event, the ABC wants to hear from you. We would like you to send us your newsworthy photos, videos, audio clips or even written eyewitness accounts for consideration for use on ABC News.

Can you see the assumption in there? It's insular and iconoclastic.
We find 'the news', you listen.

There's a secondary assumption:
News is only 'events' that can be represented in sound-bites and pictures. There is no allowance for informed contributions and bigger stories.

There is a simple test:
Does the system facilitate or block major public interest stories like "Dr. Death" at Bundaberg Hospital, false 'evidence' claims as in the 'Children Overboard' affair, or problems like gross waste/misuse of public money, dereliction of duty, outrageous behaviour of public officials/politicians or endemic corruption?

It is embarrassingly insular to assume that, as a publisher/broadcaster, you always know better than the entire listening audience what is going on.

That one person, using the technology well and wisely, can access and leverage community knowledge, globally, and affect a major outcome is shown by 'PJ' (Pamela Jones) of Groklaw and her influence on the "SCO case". SCO became the final licensee of the AT&T Unix codebase and sought to leverage this into a 'tax' on Linux, a re-implementation loosely based on Unix.
In the intervening years (2003 - 2007) PJ and Groklaw can be credited with unearthing and exposing many of the flaws in SCO’s case, most notably, obtaining and publishing the 1994 settlement in the USL vs BSDi case, which had been hidden from public view and played a significant role in undermining SCO’s claims to the ownership of Unix.
PJ's efforts and collaboration with the global community were instrumental in SCO losing its case. No single company - not even vendors like IBM, Novell and AT&T - and certainly no consumer, had all the information or all the requisite manpower to definitively dismiss the claims.

Lesson: One person can make a difference, if they apply themselves and the technology appropriately.

The Internet is a new thing - it's not just a faster, cheaper, better way to do the same things.
If you simple-mindedly automate existing practices, you will open the floodgates and drown in electronic verbiage.

When the 'barriers to entry' to communicating - the cost in time, energy & money - are very low, people will bombard you with messages. The sheer volume and the poor Signal-to-Noise ratio make the content worthless: a small army, let alone a single person, can't read everything, and duplicate/irrelevant information will drown out any gems therein.

Computers also hold the key to the problem - they are Cognitive Amplifiers.
They enable one person to do the work of 10, 100 or 1,000. More quickly, more cheaply and often 'better' in important ways.

The ABC News Division employs 700 professionals.
They certainly perform very well, but are they sufficient to find and research all worthy stories locally, let alone all local interest stories occurring world-wide?

What's the price to the Organisation, the Australian Media, Government/Politics and the Australian Public of missing important stories?

There are many problems to be addressed and overcome before a useful system can emerge:
  • SPAM,
  • Security/malware - controlling & avoiding upload/dissemination,
  • Denial-of-Service attacks and webpage and other information hacking,
  • Retaliatory attacks and deliberate misinformation by 'sources',
  • Mischief makers, Gossips and Defamatory statements,
  • 'Personal Agendas' and Vendettas/Disgruntled persons,
  • Copyright violation and Plagiarism,
  • Nuisance, time-wasting 'serial pests' and Vexatious persons.
Ain't no bed of roses...

There are always going to be people who wish to remain anonymous - either completely, or in the usual 'off the record' sense where they do not wish to be publicly quoted but are willing to provide a written statement and, if necessary, to stand by their comments in court.

One key technical method to address many of these problems is strong identification of posters.
This has to be of similar strength to X.509 client certificates for browsers, with the concomitant off-line checks before issuing them.
Off-line confirmation of identity is essential - from checking the White Pages and calling the person, or sending an SMS, up to sighting 'photo ID'.
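As a rough sketch of the server side of that idea (the file names are placeholders, and a real deployment needs a proper CA and off-line issuance workflow), Python's standard ssl module can refuse connections from anyone who doesn't present a client certificate signed by the site's own CA:

    import ssl

    # Minimal sketch: a TLS server context that requires posters to present
    # a client certificate signed by our own CA before they can connect.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(certfile="server.crt", keyfile="server.key")  # placeholder paths
    ctx.load_verify_locations(cafile="site-ca.crt")  # the CA that signed posters' certs
    ctx.verify_mode = ssl.CERT_REQUIRED              # no certificate, no connection

The hard part isn't this configuration - it's the off-line identity checks before the CA signs anything.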

A Proposal:

Immediately, three improvements would help address my frustrations:
  • Publish role-based, not personal, email addresses for both senior positions and the various news desks/editors/specialist correspondents that must exist internally.
  • Add a new contact form for story requests, useful information and leads.
    The sorts of things needed by investigative journalists. It must include topic categorisation for automatic distribution to the news desks/story areas.
  • Automatically acknowledge all contacts - via email, SMS or other simple means.
A backend system is needed to automatically store and distribute input. Something of this sort must already exist - it certainly is there for published stories.
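For illustration, a minimal sketch of such an intake pipeline - the desk names, addresses and the local mail relay are all invented for this example:

    import smtplib
    from email.message import EmailMessage

    DESKS = {            # hypothetical role-based addresses, per the proposal
        "politics": "politics-desk@example.org",
        "business": "business-desk@example.org",
        "other":    "news-intake@example.org",
    }

    def handle_submission(sender: str, topic: str, body: str, ticket_id: str) -> None:
        # every contact is acknowledged automatically...
        ack = EmailMessage()
        ack["From"], ack["To"] = DESKS["other"], sender
        ack["Subject"] = f"Received: your contribution ({ticket_id})"
        ack.set_content("Thank you - your submission has been logged and routed.")

        # ...and routed to a news desk by its topic tag
        routed = EmailMessage()
        routed["From"], routed["To"] = sender, DESKS.get(topic, DESKS["other"])
        routed["Subject"] = f"[{ticket_id}] new {topic} lead"
        routed.set_content(body)

        with smtplib.SMTP("localhost") as smtp:   # assumes a local mail relay
            smtp.send_message(ack)
            smtp.send_message(routed)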

Writing is a skill that must be practised.

Members of the public do not have journalistic skills, nor a sense of what journalists consider 'newsworthy', nor what is needed to successfully pitch a story - even if they have found the right person. The limited feedback I've received amounts to "just write clearly". I know how to do that in a number of domains, but have no idea what journalists want and need.

This can be addressed at 3 levels:
  • An on-line tutorial and example system.
    Including some template questions and suggestions for ways to both condense/summarise your information and to self-assess its 'importance' and 'newsworthiness'.
  • A limited (5-minute) response by a journalist to any specific questions or advice.
  • The ability to pay for editorial help in constructing a pitch and even a story.
    The rate would have to be $50-$75/hour for cost recovery - more, to act as a brake on overuse.
One very strong asset of the ABC is its 'Friends'.

With the increasing numbers of retired Baby Boomers - including ex-journalists - there should be a large pool of free labour available to review, categorise and respond to the information fire hose that would be unleashed.

The Internet means people can volunteer for an organisation without the classic problems of desk-space, real-time supervision, insurance and other entitlements. Volunteers work from home, when and for as long as they like. They could even be self-administering.

At the end of the day, it comes down to just one question:
What does the ABC consider its own and the community's on-going roles, and
how will it stay current with sociological, cultural & technological changes?

2009/04/05

Alan Kay - History and Revolution in I.T.

Alan Kay invented Object Oriented Programming around 40 years ago with Smalltalk.
Kay not only has a lot to say, his accomplishments lend him credibility. In 2003 he said:
"our field is a field that's the next great 500-year idea after the printing press"
The ACM awarded him its highest honour, The Turing Award, in 2003. The short Citation:
For pioneering many of the ideas at the root of contemporary object-oriented programming languages, leading the team that developed Smalltalk, and for fundamental contributions to personal computing.
Video of his 60-minute talk is on the ACM site, and a transcript is available elsewhere. The slides & demo used are not available.

His 1982 talk for "Creative Think" drew this reaction from one reporter (the link is worth reading for the list of one-liners alone):
Alan's speech was revelatory and was perhaps the most inspiring talk that I ever attended.
This is my current favourite quote of Alan Kay's from Wikiquote:
"Point of view is worth 80 IQ point"
A 2004 conversation with Kay on the (deep) History of Computing Languages is well worth reading. Here are two interesting remarks:
One could actually argue—as I sometimes do—that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects. (starting circa 1984)
and
Just as an aside, to give you an interesting benchmark—on roughly the same system, roughly optimized the same way, a benchmark from 1979 at Xerox PARC runs only 50 times faster today.
Moore’s law has given us somewhere between 40,000 and 60,000 times improvement in that time.
So there’s approximately a factor of 1,000 in efficiency that has been lost by bad CPU architectures.
Kay, in this piece, also mentions a theme - the central problem of writing large systems is Scaling: the Design & Architecture of systems. Anybody can take lumber, hammer, saw and nails and produce some version of a dog-house. To scale up to something very large requires skill, discipline and insight - Architecture is literally "the science of arches", the difference between Chartres Cathedral and the Parthenon. Both contain around the same amount of material, but the cathedral encloses ~20 times the volume and its towers are 10+ times higher.
AK Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves.

SF The analogy is even better because there are the hidden chambers that nobody can understand.

AK I would compare the Smalltalk stuff that we did in the ’70s with something like a Gothic cathedral. We had two ideas, really. One of them we got from Lisp: late binding. The other one was the idea of objects. Those gave us something a little bit like the arch, so we were able to make complex, seemingly large structures out of very little material, but I wouldn’t put us much past the engineering of 1,000 years ago.

If you look at [Doug] Engelbart’s demo (wikipedia) [a live online hypermedia demonstration of the pioneering work that Engelbart’s group had been doing at Stanford Research Institute, presented at the 1968 Fall Joint Computer Conference], then you see many more ideas about how to boost the collective IQ of groups and help them to work together than you see in the commercial systems today.


Relevance to Open Source and Paradigm shifts

Kay claims that 95% of people are 'instrumental reasoners' and the remaining 5% 'are interested in ideas'.
an instrumental reasoner is a person who judges any new tool or idea by how well that tool or idea contributes to his or her current goal.
He goes on to talk about reward/motivation and says that 85% of people are 'outer motivated' versus 15% 'inner motivated'.

Most people (~80%) fall into the 'outer motivated instrumental reasoners' group.
These people won't pick up an idea if other people aren't doing it. Which seems like a very wise evolutionary group tactic - if a little safe.

Kay, in his ACM talk, uses a contagion or forest-fire model to demonstrate/claim that around 66% of a population is needed to achieve 'ignition' - to hit the tipping point where 'everyone is doing it' and the new idea takes over.
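Out of curiosity, here's a toy Granovetter-style threshold model of that 'ignition' claim (my own construction, not Kay's actual model): 5% adopt on ideas alone, the rest wait until the adopted fraction around them passes a personal threshold. Below a critical seeding the idea stalls at the seed level; above it, adoption runs away:

    import random

    random.seed(42)
    N = 100_000
    # 5% 'interested in ideas' (threshold 0); the rest are 'outer motivated'
    # and adopt only when enough others already have (thresholds 0.3-0.7).
    thresholds = [0.0 if random.random() < 0.05 else random.uniform(0.3, 0.7)
                  for _ in range(N)]

    def final_adoption(seed_fraction: float) -> float:
        adopted = seed_fraction
        while True:
            # everyone whose threshold is already met adopts; seeds never defect
            new = max(seed_fraction, sum(t <= adopted for t in thresholds) / N)
            if new - adopted < 1e-9:
                return new
            adopted = new

    for seed in (0.2, 0.4, 0.5, 0.6):
        print(f"seed {seed:.0%} -> final adoption {final_adoption(seed):.0%}")

With these (arbitrary) thresholds the tipping point sits near 50%; Kay's 66% figure would correspond to a different threshold mix, but the all-or-nothing behaviour is the same.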

Kay also makes the observation that Big Ideas (he cites Unix) often take around 30 years to hit the streets - to become normally used.
He also wondered why, after ~40 years, Ivan Sutherland's ideas from Sketchpad, his ~1963 thesis project, haven't broken through.

Applying to Open Source Propagation

Putting this 'tipping point' idea, if correct, to work in spreading FOSS:
  • find/choose communities who are high in 'interested in ideas' (artists & creatives?)
  • find a small community and intensively sell/lobby/influence it to get to the tipping point.
  • leverage these communities or even artificial environments by introducing 'outsiders' (high schoolers?) to a converted environment.
But what Kay doesn't go near is that even the best marketers, PR and media people can't predict the next fad/craze/meme - they happen because they happen.

2009/03/29

Reactionary or 'Frothing at the Mouth'?

Is my opinion "Forget the Best, Embrace the Rest" over the top, reactionary and irrelevant nonsense?

The State of Practice is beyond criticism - because there is no useful information on it.

Here are 3 questions to ask of Management theory & thought:
  • Exactly why is Management "Hard"?

  • What are the tasks of Management?
    i.e. a formal & unequivocal model of the dimensions of action and decision, resources & information required and skills/capability required of individuals and teams.

  • How to quantify the performance of individual Managers and the Management Team?
These are fundamental questions and should at least be definitively outlined in any introductory text or course - but aren't.

Which raises the question: why aren't they addressed?

Either I'm completely off-track & uninformed, or I have outlined something of merit.

If this viewpoint has merit, what then?

2009/03/28

Forget the best, embrace the rest

It appears to me that 'Homo Corporatus' (the 'management classes') rejects, seemingly actively, the need for maintaining "Lessons Learned" and adopting in practice the best theories & principles known. The Operant Methodology seems to be:
Forget the best, embrace the rest.
This isn't minor or accidental. It's endemic and universal.

2009/03/01

Inverse Turing - Modelling Human Brain with Universal Computer

Turing showed that all computer programs could be transformed to run on a very simple 'Universal Computer'.

One of the big questions this transformation allows is: Will the program Halt?
[The Halting Problem]
This, in the guise of a program to calculate the result, was used to prove some problems are not computable.
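The flavour of that proof can be sketched in a few lines of Python - assume a perfect halts() oracle exists, then build the program that defeats it:

    # Sketch of Turing's diagonal argument - not a runnable oracle.
    def halts(program, data) -> bool:
        """Hypothetical: returns True iff program(data) eventually halts."""
        raise NotImplementedError("no such total function can exist")

    def paradox(program):
        if halts(program, program):  # if the oracle says we halt...
            while True:              # ...loop forever,
                pass
        # ...and if it says we loop forever, halt immediately.

    # paradox(paradox) halts if and only if it does not halt - so the
    # assumed oracle halts() cannot exist.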

Turing also devised an unrelated idea - the Turing Test - that was at the heart of Artificial Intelligence (AI) research for many years: if you can't see or hear what/whom you're conversing with, can you tell the difference between a real Human and a computer program?

Turning the 'Turing Test' around: Is the Human brain built from computing elements?

This isn't quite "Is it a computer?" - there are two elements: the structure, and the componentry from which it's built.

If the Human brain is a computing device, and if its organisation and 'programming' can be discovered, then it can be represented on an electronic computer. This is an unstated idea underlying the Turing Test.

The act of discovering the organisation and connections/programming could be impossible directly:
  • every human brain is unique in structure and organisation, though built from the same components (neurons) and performing the same tasks (seeing, hearing, recall, ...).
    This can be trivially shown by 'plasticity' - when a person suffers a traumatic brain injury (like a stroke or lump of steel through the head) their brain keeps functioning and often regains lost capability. Tasks normally performed in one area are taken up elsewhere.
  • Because of the microscopic nature of neurons and their axon connections, and the sheer number of them, it's unlikely a non-destructive 'read' of the whole brain will be possible.
But the idea is still very useful, even if we are unlikely to ever be able to clone our brains.

Exact computer models of brain functions/sub-systems can be built, examined and experimented with.
This is not useful in AI for recreating functions like speech recognition - electronic computers work so differently to neurons/axons that it doesn't yield insight or help for a 'reimplementation'.

For disciplines like Psychology and Sociology, creating models of brain function and interactions takes study to a new level - it allows precise and repeatable initial conditions, and otherwise-impossible experiments to be performed.

Most importantly, it removes 'hand-waving' explanations and forces exactitude and direct evaluation of theoretical models.

Computer animation tools, like MASSIVE, already implement 'autonomous agents' programmed with simple human behaviour/responses. This is a perfect platform to model and test Sociology models.


2009/01/23

Microsoft Troubles - Starting signs

Previously I've written that I expect Microsoft to experience a "Financial Pothole" around 2010. This is not it, only a portent of things to come.

Microsoft reported a profit slump and job layoffs today (AP, Reuters), as did many others - Intel and eBay are cited.

2008/12/06

Microsoft Troubles - IV

Previously I've posted on my conjecture that Microsoft will hit turbulent financial times in 2010: Microsoft Troubles III, Microsoft Troubles II and Microsoft Financial woes in 2010.

This article in CNN Money/Dow Jones Newswire cites data on some of the early effects becoming apparent.
Sales of Windows grew just 2% in the first quarter of fiscal 2009, which ended Sept. 30, 2008. In most years, Windows posts double-digit revenue growth, according to company data.

2008/11/30

Finance, FMAA & ANAO and Good Management: Never any excuse for repeating known errors

In light of Sir Peter Gershon's Review of the Australian Government's use of Information and Communication Technology, here's an email I sent to Lindsay Tanner (Finance Minister) prior to the 24-Nov-07 election of the Rudd ALP government. Edited lightly, formatting only.

Date: Sun, 11 Nov 2007 15:02:40 +1100
From: steve jenkin 
To:  lindsay.tanner.mp@aph.gov.au
Subject: Finance, FMAA & ANAO - Good Management: Never any excuse for repeating known errors

Here is something very powerful, but simple to implement & run, that will amplify your proposed review of government operations and can be used to gain a real advantage over the conservative parties. On 8-Nov I wrote a version via the ALP website.


Headline:
The Libs talk about being Good Managers, but they have been asleep at the wheel for the last 10+ years.

It's not "efficient, effective or ethical" to allow public money to be wasted by repeating known mistakes.

Nothing new needs to be enacted - only the political will to demand Good Governance from bureaucrats and the 'ticker' to follow through.


2008/11/29

Gershon Report - Review of Australian FedGovt ICT

The Gershon Review is good solid stuff that doesn't rock the boat, doesn't challenge current methods & thinking, nor show deep understanding of the field.

It has a major omission - it addresses ICT inputs only.
ICT is useful only in what it enables others to do or improve - measuring & improving ICT outputs is completely missing from 'Gershon'.

It doesn't examine the fundamentals of ICT work:
  • What is it that we do?
    How is Computing/IT special or different to anything else?

  • Why do we do it?
    Who benefits from our outputs and How?
Here are my partial answers to these questions:
  1. Computing is a "Cognitive Amplifier" allowing tasks to be done {Cheaper, Better, Quicker, More/Bigger}.

  2. IT is done for a Business Benefit.
    Like Marketing, defining how outputs & outcomes are measured and assessed - both in the macro and micro - is one of the most important initial tasks.

Gershon doesn't address outstanding issues of the IT Profession:
  • improving individual, organisational and general professional competence and performance.
  • Reducing preventable failures, incompetence/ignorance and under-performance.
  • Deliberate, directed & focussed effort is required to institute and maintain real Improvement of the Profession. (vs 'profession-al improvement' of practitioners)
After ~60 years of Commercial Computing:
  • Are there any new ways to stuff things up?
  • Is it "efficient, effective, ethical" to allow known Errors, Mistakes, Failures to recur without consequences? [see FMAA s44]
It isn't like the Government isn't aware of the processes and instruments needed to avoid repeating Known Errors, nor the benefits of doing so.

Aviation is controlled by the ATSB (Australian Transport Safety Bureau, previously the Bureau of Air Safety Investigation [BASI]) and CASA (Civil Aviation Safety Authority). The USA's FAA publishes hard data on all aspects of Aviation - and mostly they improve on every measure every year. This isn't just due to the march of technology - the figures for 'General Aviation' (as opposed to Regular Passenger Transport) plateaued decades ago... This is solid evidence that Aviation as a Profession takes itself seriously - and that commercial operators in one of the most competitive and cut-throat industries understand the commercial imperative of reducing Known Errors.

Aviation shows that profession-wide attention to Learning and Improvement isn't just about Soft benefits, but translates into solid business fundamentals. You make more money if you don't repeat Known Errors/Mistakes.

ATSB investigates incidents and looks for Root Causes.
CASA takes these reports and turns them into enforceable guidelines - with direct penalties for individuals, groups and organisations. CASA is also responsible for the continual testing and certification of all licensed persons - pilots, Aircraft Engineers, ...

There are 5 specific areas Gershon could've included to cause real change in the IT Profession - to start the enculturation of Learning & Improvement and the flow-on business gains.
Federal Government accounts for 20% of total Australian IT expenditure. It is the single largest user and purchaser of IT - and uniquely positioned to redefine and change the entire IT profession in Australia.
  • Lessons Learned - Root Cause Analysis of Failures/Problems
    Dept. Finance 'Gateway Review Process' on Projects.
    Needs equivalent of CASA - inspection and enforcement of standards plus penalties/sanctions - Not just reviews and suggested guidelines.
    Not just ICT staff, not just FedGovt but their suppliers/vendors/contractors as well.
    Without real & timely (personal and organisational) consequences, nothing changes.

  • Standish 'Chaos Report' equivalent - real stats on IT Projects.
    Without solid numbers, nothing can change.

  • Operational Reviews.
    How well does an IT organisation do its work?
    Critical Self-assessment isn't possible - exactly the reason work needs to be cross-checked for errors/mistakes/omissions/defects.
    C.f. Military Operational Readiness Reviews - done by specialist, impartial experts.

  • Individual Capability Assessment - equivalent of on-going Pilot etc recertification.

  • Research: Quantifying & standardising metrics and models for "Effectiveness".
    DCITA/DBCDE on macro-economic results.


The ACS describes Gershon's recommendations as "all aimed at addressing the efficiency of ICT":
  • governance,
  • capability,
  • ICT spending,
  • skills,
  • data centres,
  • sustainable ICT
Note the issue of Reducing Faults/Failures/Errors/Mistakes doesn't make the list.
Nor does the idea of institutionalising the building/improving the Profession of IT and increasing the Capability/Performance of IT Professionals.

By DCITA/DBCDE's own reports, ICT contributes 75% of productivity improvements: ICT is still the single greatest point of leverage for organisations reducing costs and improving output.

Does getting IT right in Federal Government matter?
Absolutely.

Gershon delivers 'more of the same' and could conceivably achieve its targets of 5% & 10% cost improvement.

2008/07/15

Bad Science or Science Done Badly?

Is 'Science', as practiced by Academic Researchers, executed poorly?

More specifically:
Is the practice of Research, as undertaken by Academics, as effective as it could be?

This posits that an aspect of "Professional Research" is intentionally increasing your capability and effectiveness.

Computing/Information Technology is a Cognitive Amplifier - exactly suited to central parts of "Professional Research" - e.g. learning, recalling and searching published papers and books.

If an individual researcher can increase their "knowledge uptake" by just 7% a year, after a decade they know twice as much, since uptake compounds on existing knowledge.
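The arithmetic behind that claim is plain compound growth:

    # 7% a year, compounding, roughly doubles in a decade.
    rate, years = 0.07, 10
    print((1 + rate) ** years)   # 1.9671... ~ 2x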

What is Research about if not Knowledge: Gathering, Analysis, Representation, Taxonomy/Ontology, Management and Communication?
This field began in 1995 and is broadly known as "Knowledge Management".

2008/05/28

I.T. Strategic Planning Failures

Sue Bushell asked on "LinkedIn": What are the most common failures in strategic IT planning and how are these best avoided? What best practices in strategic planning are most effective?

My answer:

1. There are no I.T. projects - only Business Projects.
Hence changing the premise of your question:
What are the most common business process failures around I.T. solutions?
[A: Make the business run the project and take the rap if it fails.]

2. I.T. is an Industry, not a Profession.
Proof: Professions learn - repeating Known and avoidable Errors/Mistakes brings consequences. Within I.T., it is consequence-free.

3. The complete lack of History in I.T. - both on macro and micro scales.
  • Show me any large organisation that can even list all its current projects, which is a necessary starting point for:

  • Formal "Lessons Learned" from projects and operations - known problems are avoided, known effective practices are used.

  • Jerry Weinberg wrote definitive works on Software Quality Management and 35 years ago proved that focusing on Quality results in better code, written far faster & cheaper. And it is much more reliably and consistently produced!

  • Jim Johnson of Standish Group, nearly 15 years ago started definitive research on what proportion of IT Business Projects fail and the causes of failure. This work is fundamental to advancing the Profession - but nobody else studies this field so his results can't be verified or refuted. Nor have organisations or practitioners, by-and-large, acted on this knowledge. People do argue that his results are suspect because other single-shot reports don't agree. But nothing happens to resolve this fundamental issue!

  • Software ReUse is notable in how little it is practiced. Can it be possible that nearly every problem is completely new? Not in my experience.

4. The fundamental reason IT is used: It's a "cognitive amplifier".
Computing amplifies the effort and output of people, providing results 'Cheaper, Better, Faster'.

On the micro scale, no organisation I've heard of measures this. It's quantitative and should be calculable by any half-reasonable Management Accountant.

On the macro scale, the 'Profession' doesn't have or publish benchmarks on results (i.e. from across many organisations).

5. The 'Profession' doesn't even have a taxonomy of jobs and tasks, let alone any consistent method for evaluating and reporting the competence of, and skill level of, practitioners.
  • In a construction project you wouldn't specify "10 vehicles needed"; you'd say "6 5-tonne trucks, 2 utes, a 20-tonne tip-truck and a bobcat".

  • If the profession can't distinguish between the speciality, competence and skill levels of its practitioners, how can the business folk?

  • If project plans don't identify the precise skills needed - implying some way to assess and rate the 'degree of difficulty' of individual tasks/components - then the right 'resources' can't be applied.

6. The almost complete disconnect between research results and practice. Enough said.

7. [Added]. The capability of the Profession in general, and of young I.T. practitioners in particular, has declined greatly.
Proof: The increasing number of failed projects attempting to replace 'Legacy Systems'.

E.g. the failed A$200M Federal Government ADCNET project. I worked on the original IBM mainframe system, then found myself 15 years later sitting in the same awful basement, not 50 feet away, coding its replacement. The IBM system took 30-35 man-years (in structured assembler); just the second phase of the ADCNET system had a team of 70 for 1-2 years - and was abandoned. The best description of it is the Federal Court Judgment:
GEC Marconi Systems Pty Limited v BHP Information Technology Pty Limited
Federal Court of Australia
12 February 2003 and 14 July 2003
[2003] FCA 50; [2003] FCA 688

8. [Added] Creating Software is a performance discipline.
You have to both know the theory and be able to create good software.
Who are the Great Heroes of Open Source? The guys who demonstrate they can code well.

Like Music, Surgery and Architecture, software requires head and hands to do it well.


9. [Added] Design is Everything.
This is what the Bell Labs Computing Research guys understood and what Microsoft doesn't. They invented the most cloned Operating System in the world - Unix - and then went on to build Plan 9, its replacement, 20 years later, with around 20 man-years. It was created portable and scalable, running on 6 different platforms from day 1. Of course it was incredibly small and blindingly fast. Time has shown it was robust and secure as well.

It's no accident that 15 years later Microsoft spent around 25,000 man-years on 'Longhorn', and then threw it all away! (The infamous 'Longhorn Reset' on 23-Sept-2005 by Jim Allchin)
Then spent the same again to create 'Vista' afresh from the 'Windows Server 2003' codebase.

How could Microsoft not understand what was well known 15 years prior, especially as Microsoft ported Unix to Intel in 1985?


There's more, but that will do for now.


"I.T. Governance" may be part of the Solution, but standards like AS8015 are primarily aimed at allocating blame or pushing all responsibility for failure onto I.T. and abnegating from I.T. any successes.

The 'root cause' of all I.T. failures is trivial to identify, but probably exceedingly hard to fix. These days, almost no projects should fail due to technology limitations - only practitioner and management failures.

The 'root cause' is: Business Management.

Yes, there are many problems with I.T. practitioners, but think about it...

Around 1950, Commercial Computing was born.
Some projects worked, in fact succeeded brilliantly: Man went to the moon on the back of that work just 2 decades later.

And then we have the majority or 'ordinary' projects that fail to deliver, are abandoned or under-deliver...

The first time 'management' commissioned a bunch of 'Bright Young Things' to build The Very Best Computer System Ever, they would naturally believe the nerds and their self-confidence.

After that effort failed, what would the rational approach be to the next project?

Not the usual, "do whatever you want and we'll see", but "you didn't do so well last time, how about we try smaller pieces or doing it differently?"

And when lining up for the third go-round, you'd think competent business managers (the ones writing the cheques) would put the brakes on and say "you haven't shown you can deliver results, we have to manage you closely for your own sakes."

"Fool me once, shame on you. Fool me twice, shame on me."

And who's the cause on the third, fifth, hundredth or thousandth repetition?
The people who keep paying for the same ol', same ol'.




2008/03/09

Videos on Flash Memory Cards - II

My friend Mark expanded on my idea of "HD DVD being irrelevant" - like phone SIMs, video stores can sell/rent videos on flash cards (like SD) sealed in a credit-card carrier.

The issues are more commercial than technical. 8Gb USB flash memory might hit the A$50 price point this year - and A$30 next year. There is a 'base price' for flash memory - around $10-$15.

This inverts the current cost structure of expensive reader/writer and cheap media. Which is perfect for rental/leasing of media - a refundable 'media deposit' works. An added bonus for content owners is a significant "price barrier" for consumers wanting to make a copy. If a 'stack' of 100 SD cards costs $1500 (vs $100 for DVDs), very few people will throw these around 'like candy'.

Mark's comments:

Y'know, the more I think of it, the more the SD-embedded-in-a-credit-card has a lot of appeal when the availability and price point for 8Gb SDs is right. It makes it easy to print a picture, title and credits/notices etc on the 'credit card' - something big enough to be readable and a convenient display format and, as you say, nicely wallet-sized. Snap off the SD and you've agreed to the conditions etc, plus the media is now obviously 'used'.

It's a useful format for other distributions too - games, software, etc (Comes to mind that SAS media still comes on literally dozens of CDs in a cardboard box the size of a couple of shoe boxes).

My complete collection of "Buffy" would come in something the size of a can of SPAM or smaller, rather than something the size of a couple of house bricks for the DVD version, or something still the size of a regular paperback for the Blu-Ray version. For collectors of such things, the difference between many bookshelves taken up by the complete set vs a small box of credit-card (or smaller) sized objects is significant. The ability to legally re-burn or replace and re-burn the media when it fails is critical though.
SJ: Because of the per-copy encoding to a 'key', stealing expensive collections isn't useful, unless the key is also taken. So those 'keys' have to be something you don't leave in the Video player.

You've covered the DRM aspects and better alternatives to DRM - which also means that I can burn and sign the media I might produce and distribute myself without needing to involve the likes of Sony or Verisign - although that is possible also - which protects the little producer. Include content in Chrissy and Birthday cards - you've seen those Birthday cards with a CD of songs from your birth year - why not a sample of the movies from that year, plus newsreels etc. Good for things like audio books - whole collections. And if the content on an SD gets destroyed, as long as the media is OK, it would be possible to re-burn it. Most current DVD players now also have SD readers as standard.

Surely someone has thought of it already! Part of the attraction of DVD over storing your library on a 2TB USB disk from Dick Smith is the problem of backups. DVD is perceived, incorrectly, as permanent storage. Though I notice some external USB drives now have built-in RAID 1 or RAID 5, but Joe public doesn't see the need (how come I bought a 2TB drive and I only get 1TB?).

Yeah, I think the proposition that SD or similar will become the ubiquitous preferred standard portable, point-of-sale, recording and backup storage media for photos, movies and music, has some credence. There is something to be said for - "you pick it up in your hand; you buy it; it's yours" - over - "downloading and buying some limited 'right to use' ".

2008/03/07

Service Desk and Politician e-mail

Over the last year I've penned 6+ e-mails to various Labor Party politicians - including one of my local representatives, whom I've dealt with for ~10 years.

And not one reply. Zero, Zip, Nada...

Rang the Good Person's electoral office today - and got various run-around responses. "Oh, I've been on holiday", "Oh, can they call you" and "they are booked solid for a month".

Yeah, right.

I first contacted my rep. last December saying "this can wait until after the School Holidays". January came and went, no reply... A follow-up email yielded nothing... A note to the support staff was replied to: "I've moved. XXX is responsible".

What I originally wanted to talk about was 3 emails I'd sent various members without even getting acknowledged. Which is strange, because in the media I've seen reports that Political Parties are now tracking every contact from a voter. Putting together, apparently, impressive profiles - and all completely legit under the Privacy Laws.

For a new Government this seems a pretty poor response, doubly so for one that prides itself on 'listening'.

The solution that I wanted to put forward to my Rep:

Use HelpDesk Software to manage constituent contacts.
Not just piecemeal, but an integrated system for all participating elected members.

Not all that hard.
It scales. It goes across the whole Party. It covers contacts via 'aph.gov.au' and other email addresses. It copes with email, phone, fax, mail and personal contacts - and the worst of all, "voice prompt systems".

The software is well known, there are many vendors and trained consultants, and the marketplace is competitive. As consumers and office workers, most of us are used to the concepts and how these systems work.

It creates a definite process - with self-imposed rules & priorities that are checked and enforced.

AND it ensures that little people like me don't just fall between the cracks.
Or if some 'critical person' falls down, work queues can be given to those best able to deal with them.

Imagine getting a tracking number back from your local Pollie, and being able to automatically check where it is up to - and just when you should expect an answer. Wow! Just like they worked for us and were trying to use the technology responsibly...
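For the flavour of it, here is a minimal sketch of that tracking idea (all names invented - any real helpdesk package already does this and much more):

    import itertools
    from dataclasses import dataclass, field

    _numbers = itertools.count(1)

    @dataclass
    class Ticket:
        constituent: str
        channel: str                 # email, phone, fax, mail, in person
        summary: str
        status: str = "received"
        number: int = field(default_factory=lambda: next(_numbers))

    _queue: dict[int, Ticket] = {}

    def log_contact(constituent: str, channel: str, summary: str) -> int:
        ticket = Ticket(constituent, channel, summary)
        _queue[ticket.number] = ticket
        return ticket.number         # quoted straight back to the voter

    def check_status(number: int) -> str:
        return _queue[number].status # "where is it up to?"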

It would do a service for our erstwhile representatives - you know, the ones we pay to work for us:
  • They could become more efficient - by delegating work, not needing to deal with "whatever happened to" requests, and identifying common themes and selecting the most efficient way to respond.
  • They could make a very exact case for additional clerical support from the Parliament - or even have a pool of paid staff doing the grunt work.
So I'm not holding my breath while waiting for anything different to happen.

The Internet Changes Everything - except Politicians and their ways.

2008/03/06

Who cares about HD DVD?

Talking to a friend at lunch today, the topic of "Blu-Ray" vs "HD DVD" formats came up...

I think "Blu-Ray" may take the market, but it won't be much of a market.
There are just too many competitors for moving around video files:
  • DVD format disks - still good for 8Gb (dual layer). Drives & media are cheap.
  • flash memory - 2008 sees A$50 for 8Gb on USB (less on SD card)
  • A$300 for 750-1,000Gb USB hard-drives. Under $1 per DVD's worth.
  • Internet download. With ADSL 2+ giving 5-10Mbps for many.
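A quick sanity check on that last option - an 8Gb movie at the quoted ADSL2+ rates (minding the bits-vs-bytes distinction):

    # Transfer time for an 8-gigabyte file over the quoted ADSL2+ rates.
    size_bits = 8 * 8 * 10**9            # 8 GB expressed in bits
    for mbps in (5, 10):
        hours = size_bits / (mbps * 10**6) / 3600
        print(f"{mbps} Mbps -> {hours:.1f} hours")   # ~3.6h and ~1.8h

Hours per movie - which is part of why the physical options above stay competitive.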
My thesis to my friend was "Video stores may well go for SD cards". Pay a refundable deposit for the flash card, and a fee (rental or ownership) for the content. Video stores can pre-burn large numbers of movies - and if you want a 'special' - they can make one for you in 20 minutes.

His response: "they could package them like SIMs - in a snap-off credit card-sized holder". Which is better than any idea I've had on packaging.
And it fulfils the most important criterion:
fits comfortably in a pocket (now a wallet)


Practical problems:
  • How to stop people copying the flash and resealing it?
  • Some sort of effective copy-protection system would be good.
  • Flagging 'ownership' or usage conditions of a movie. Not so much DRM, but 'this is property of XXX'
These problems can be nicely solved by users having their own "Key Card" with a digital identity and an encryption key.

The flash needs a 'fuse' that is broken when the card is freed. Preferably an on-chip use counter that can only be factory reset.

To issue a movie to a customer, the encoding key of the video (if present) would be combined with the user's key - and the resulting unique key written on the card. Players need both the card and the user key to decode and play the movie.
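A sketch of that keying scheme (illustrative only - the XOR wrap stands in for a vetted KDF and authenticated encryption, and all the identifiers are invented):

    import hashlib, secrets

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def issue_copy(content_key: bytes, user_key: bytes, user_id: str):
        # unique per-copy key, derived from the movie's key and the buyer's id
        copy_key = hashlib.sha256(content_key + user_id.encode()).digest()
        # written to the card wrapped under the owner's 'key card', so a
        # player needs BOTH the card and the key card to recover it
        wrapped = xor(copy_key, hashlib.sha256(user_key).digest())
        return wrapped, user_id          # the card is also tagged with its owner

    def player_unwrap(wrapped: bytes, user_key: bytes) -> bytes:
        return xor(wrapped, hashlib.sha256(user_key).digest())

    user_key = secrets.token_bytes(32)       # lives on the owner's key card
    content_key = secrets.token_bytes(32)    # held by the studio/video store
    card_blob, owner = issue_copy(content_key, user_key, "example-customer-42")
    copy_key = player_unwrap(card_blob, user_key)   # only this owner can do this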

That same process also tags the card with the current owner.
You lose it, it can come home to you.

Because the content can be locked to a particular ID, the raw content can be stored on disk without the movie studios giving away their birthright.

Summary:
I think 120mm disks are going to follow the floppy disk into the technology graveyard.
They will have certain uses - like posting something on cheap, robust media.

With the convergence of PC displays and Home Theater, the whole "Hi-Def TV" problem is morphing. Blu-Ray - can't wait to not buy one.

2008/02/08

The Open Source Business Model

This post by Dana Blankenhorn on ZDnet is the best answer I've seen to the question "Why Open Source?".

He says 'plumbing', I'd say '(Unix|Open Source) is the Universal Glue'.
And the on-going Open Source Business Model is "support" for those that need/want 'certainty'.

Which, if you are the CIO (read: 'my arse is on the line') somewhere with a high dependence on I.T., is only Good Governance (or "common sense"). You can't make key staff stay, nor mandate that they never get sick or burn out and "go sit on a beach" - and after '9/11', all Business Continuity plans have to account for covering people as well as systems and networks.

That's it - Business I.T. is all about the Data (or "all about XXX, stupid" to be Clintonesque).
Open Source tools are usually about manipulating data or providing services - like Apache, e-mail, DNS, firewalls and IDS, ...

Open Source is here to stay: use it, don't deny or fight it.

This Business Model, 'support for essential tools', is robust and on-going.
Whatever systems you use in the Data Center, you'll always have the need to provide many services and interface disparate systems and data formats.

The model also applies to embedded 'Appliances' and dedicated devices, like firewalls - or commercial web-hosting services. They are based in whole or part on Open Source.

You'll note this model has very limited application to the client-side - the 'Desktop' or End-User compute platform.

"Free Software" from GNU et al is about an ideological stance and subsumes all other goals to this.

"Open Source" is pragmatic and about getting on with the job. It makes sense for large vendors, like IBM and HP, to support it. Customers can feel confident and secure - because the source and tool-chain are freely available from multiple sites, they cannot be held to ransom or 'orphaned' by unpredictable events or capricious decisions.

"Open Source" starts from the premise that "IT is done for a Business Benefit" - that you build software, systems and services for the use of others, not your own amusement and benefit.

Business supporting software has to meet Professional standards/criteria - good design, clear documentation, reliability, robustness and very few errors/defects - with the unstated driver of Continuous Improvement.

Never new features for their own sake or to create 'forced upgrades', always making the code more stable, usable and useful.
Commercial considerations, by definition, are always subsidiary to technical ones. If the user community doesn't like changes, they aren't forced to upgrade and, in an extreme case, can 'fork' the code, internally or publicly: just do it how they want.

2008/01/19

Human Response to Cognitive Workload

Context: This piece started as a question to a researcher in Psychology.

There's a topic I've been trying to find research results on for some time.
I call it "(Human) Cognitive Response to Workload".

There is a bunch of qualitative data for "Physiological response to Workload" available - e.g. US Navy for "stokers" working in different levels of heat.

I found Prof. Lisanne Bainbridge in the UK. She's retired now. Her field is 'mental load', but she couldn't point me at research in the area or help me properly phrase my question.
She pointed me at Penny Sanderson, now a Prof. at the University of Queensland.

What I'm interested in is any information to apply to Software Developers and other Knowledge workers:
  • In the short, medium & longer term (day, week, year) how do you maximise cognitive output?
  • What roles do sleep, recreation & holidays play in 'recharging' cognitive abilities?
  • For different levels (degrees of difficulty) of cognitive task (or skilled manual task) what are the optimum work rates and duty cycles? (ratio of work/rest)
Related areas are Rates of Error, and the effect of 'tiredness' on maximum cognitive capacity.
[James T. Reason has very good work on "Human Error" and "Organisational Error". His work is used extensively in Aviation and Nuclear safety. He originated "the swiss-cheese" model of accidents.]

2008/01/01

Solving 'Spam'

It never ceases to amaze me, the Politicians' attitude to Porn and 'Spam' & its friend, malware.

Porn is "bad, bad, bad" and Pollies show very high interest - including policy & legislation.

Lots of angst & thrashing around about eradicating something that 2,000+ years of writing/publishing shows can't be controlled/legislated away. The physical publishing world & (cable) TV show that the only effective means of control is to allow-but-license.

Same as tobacco. Never going to eradicate it, only control it.

'Restricted Content' access can be controlled iff:
  • every page is 'classified' at source (meta-tags),
  • an unforgeable Internet 'proof-of-age' card/system is created,
  • there are criminal penalties for subverting the system, forging identities or misclassifying pages,
  • there are no legal jurisdictions outside 'the system' [e.g. on the high-seas],
  • all browsers enforce 'the rules',
  • and browsers can't be built/written to ignore 'the rules'.
i.e. It is impossible to eliminate 'restricted content', and possibly provably so...