Compare the local IT failures with these comments from InfoQ (author's site).
This is best illustrated by the findings from the US Department of Defense (the DoD).
The DoD analysed the results of its software spending, totalling an eye-watering $35.7 billion, during 1995.
They found that only 2 per cent of the software could be used as delivered.
The vast majority, 75 per cent, of the software was either never used or was cancelled prior to delivery.
The remaining 23 per cent of the software was only used following modification.
That would suggest the DoD actually received business value from only about $0.71 billion of its expenditure – nearly $35 billion of it did not result in software that delivered any immediate business value.
The results of the study were presented at the 5th Annual Joint Aerospace Weapons Systems Support, Sensors, and Simulation Symposium (JAWS S3) in 1999.
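The arithmetic behind those figures can be checked directly. This is a minimal sketch: the percentages are as quoted from the study above, and the dollar splits are simply derived from them.

```python
# Rough arithmetic behind the DoD figures quoted above.
# Percentages are from the 1995 study as cited; dollar amounts are derived.
total = 35.7  # billion USD, 1995 DoD software spend

shares = {
    "used as delivered": 0.02,
    "used after modification": 0.23,
    "never used or cancelled": 0.75,
}

for outcome, share in shares.items():
    print(f"{outcome}: ${total * share:.2f}B")

# Immediate business value = the 2% usable as delivered
print(f"immediate business value: ${total * 0.02:.2f}B")  # ≈ $0.71B
```

Note that even adding the 23% that was usable after modification, roughly three-quarters of the spend produced nothing usable at all.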
- systemic problems with Project Management: projects are much more than "mere technical problems".
- not just the Public sector
- no difference between outcomes & work in Public and Private Sector
- just not visible in Private sector.
There were also a few wonderful comments along the lines of spin & misdirection:
"there are no problems, you're just missing the central point, it's all wondrous"
Is current technology not able to meet the specs?
If not, then:
- where are the Audit Office investigations, like Bureau of Air Safety investigations?
- where are the *penalties* for individuals, managers and companies for repeating Known Errors, Faults, Failures?
- where are the consequences for Public Servants, PMs and line Managers, for allowing stuff-ups?
After 60+ years of IT foul-ups, why aren't those responsible for the "Checks and Balances" _preventing_ them or ensuring "It won't happen again"???
Not just an IT and management failure, but systemic failure of auditors, investigators, political process and compliance enforcers:
FAILS the Govt's own rules for bureaucrats - "efficient, effective, ethical" use of Govt money and resources.
Software and Systems are "hard".
But that's NO reason to repeat or allow stuff-ups that are predictable and preventable.
Can good software/systems be written? Ask NASA, ask Airbus & Boeing.
You can *bet your life* it's not just possible, but practicable.
Where is the outrage _and_ action (investigation, censure, retraining) by the Professional Bodies responsible and by those responsible for training Computing, Software and IT professionals?
Could you imagine _any_ of the Engineering Disciplines NOT reacting to on-going Professional Failures of this magnitude and importance? I can't.
Can you imagine _any_ Government Agency running without IT systems these days?
They are _critically_ dependent on their IT systems, running and being correct, for daily operations.
What bit of "we can allow IT failures" is Professional or Ethical?
@Glen (speaks of "fatcats" and other sources of waste and over-runs)
Not fatcat, demonstrably _incompetent_ and negligent.
Here are the basic management and project questions:
- If we are doing this project "for a business benefit" in the same way we invest in marketing, how will we measure its _specific_ business benefits over the _full_ life of the project?
- How hard is the project, how long will it take _our_ people in _our_ environment to do it?
- this presumes a database of completed local IT projects, with estimates and actuals.
- and a database of the _productivity_ of each individual, team and Project Manager.
- What are the specific capabilities, both task area and degree-of-difficulty, of each IT worker that will be involved in the project?
- Do we have critical shortfalls on particular task areas or in high-degree-of-difficulty tasks?
(Do we have the folks we need to do the hardest bits properly?)
- Have we done any projects before with similar or overlapping specs?
- what do we need to repeat prior SUCCESSes?
- what do we need to do to avoid previous FAILURES?
These seem to me basic and necessary questions that should be answerable in any business whose operations are dependent on its IT & Systems and that has been doing the work for more than a decade.
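The database of completed projects with estimates and actuals that these questions presume need not be elaborate. The sketch below shows one possible shape for such records; all field names and figures are illustrative assumptions, not from any real project data.

```python
# A minimal sketch of the estimates-vs-actuals project records the
# questions above presume. Field names and sample values are invented.
from dataclasses import dataclass

@dataclass
class ProjectRecord:
    name: str
    estimated_effort_days: float
    actual_effort_days: float
    estimated_cost: float
    actual_cost: float
    delivered: bool  # was it actually used, in the DoD-study sense?

    @property
    def effort_overrun(self) -> float:
        """Ratio of actual to estimated effort; >1.0 means overrun."""
        return self.actual_effort_days / self.estimated_effort_days

# Hypothetical history for illustration only
history = [
    ProjectRecord("billing rewrite", 200, 340, 1.5e6, 2.6e6, True),
    ProjectRecord("CRM migration", 120, 110, 0.8e6, 0.7e6, True),
]

# "How long will it take _our_ people?" needs exactly this data:
mean_overrun = sum(p.effort_overrun for p in history) / len(history)
print(f"mean effort overrun: {mean_overrun:.2f}x")
```

With even this much recorded per project, the "why will it be different this time?" question can be answered with the firm's own numbers rather than wishful estimates.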
You've accurately stated most of the problems with IT projects and their Project Management.
These have all been researched and solved many times over.
At least once in each decade from 1970 onwards.
Agile, XP and "Design Thinking" are just the latest go-round.
The problem isn't finding solutions, but remembering them and getting them used by this generation and the next.
This is a failure of IT management, the Profession/Professional Bodies and Education.
It is also a gross failure of line management - allowing IT projects to keep stuffing up.
Everyone is allowed a first mistake, but after the first big or small IT disaster, an engaged and competent manager would ask two questions:
- why won't it be different _this_ time?
- what have you changed so that it _will_ be different?
Line-management and senior mgrs love to assign blame and berate IT as incompetent and worse.
But they are the people who've abrogated their responsibility and not just allowed, but encouraged, on-going IT disasters.
Ask the parent of any 3-yo: what matters most, what you _say_ or what you _do_?
At some point, management have to take responsibility for their lack of oversight and process improvement.
NO differences between Public & Private sector.
same Project Managers
same technical people.
‘Same’ doesn’t mean ‘like’, it means ‘identical’ as in the same individuals cycle through all the big firms and work for public/private.
I’ve travelled enough and set out to meet IT practitioners from all over.
My evidence is this failure of IT as a Profession is the norm across _all_ the developed world.
- Y2K had universal impact.
- as does malware.
Did Westpac get roasted for throwing away $500MM-$1000MM on "CS90"?
Do the shareholders _know_ anything changed?
Did any of the people responsible suffer consequences?
Is CS90 used as a local example in Uni courses on Project Management?
*No* to all...
On Standish and their CHAOS report.
- created a lot of interest and controversy when it came out
- while they skipped one early year, Jim Johnson and his team have continued to release new survey data every year since, allowing change tracking.
- despite the apparently calamitous state of the Industry _nobody_ has replicated the series
- even so, the veracity & methodology of CHAOS has been roundly criticised and attacked in the literature.
What's wrong with this picture? Everything...
Those who say "it couldn't be _that_ bad" are either uninformed or disingenuous.
The thing is, there is credible, tested research in IT, going back to the early _1970's_ that says, not only is it _less_ work to "Do it Once, Do it Right", it's also quicker and cheaper.
I cannot understand why the _best_ work isn't picked up, while the industry still endlessly chases the newest fad & fashion - with unproven or questionable results. It's upside-down.
[responding to "you're talking about Lean and "validated learning" "]
You're exactly right, but you don't show good knowledge of historical efforts.
This stuff is a) well known, b) shown to work, and c) it's always been _easier_ and _cheaper_ to do it right.
Von Neumann got it right and created a high-performance, high-quality culture, but this was lost. Why?
Your current understanding and language are _correct_ and I applaud you & your teams if you use them.
But they are restatement and rediscovery of previous work going back at least 4 decades.
My on-going points are twofold:
- since the early 1970s, how Software can be done "better/cheaper/faster" has been studied and researched, and good techniques discovered and widely disseminated,
- BUT the known, effective approaches are not widely adopted nor do known "best practices" get taken up AND made stick.
What's going on is NOT a Profession in action, only a disorganised "Industry".
And the deeper questions:
- Since the 1968 NATO Conference on Software Engineering, is there _any_ reason for Universities NOT to turn out graduates trained and experienced in _known Best Practices_?
- For the last 40 years, is there _any_ reason that Management has allowed I.T. projects to fail from Known Causes? It's OK for Professionals to make mistakes, but _never_ to repeat them.
- As Government and Large Business have become increasingly dependent on Computing & ICT, is there _any_ reason that their "Checks and Balances" processes & groups _haven't_ started collecting basic project & outcome data, or started investigating failures and imposing "consequences" on individuals and organisations repeating, or allowing, known errors, faults and failures?
I appreciate your Professionalism, insight and understanding and that you've dedicated yourself to Software Improvement - you should be applauded & recognised for that.
I'm trying to draw attention to something much larger that's going on, something that's been noticed and _documented_ as a massive problem for around 50 years...
As Larry Constantine observed in 1994:
There is NO Software Crisis. Crises are time-limited; this has been going on for _decades_, it's a slow train-wreck.
There is a systemic problem that keeps recreating the same poor cultures, same worst-possible management and same predictable, preventable and entirely avoidable failures.
This has all been said before:
so why aren't the people who can _fix_ things, the business owners and govt ministers, taking notice?