Because what is outlined is incomplete:
They have failed to address the root cause of cyber-attacks: vulnerable, error-filled Operating Systems and poor Application Software. Fix the weakness, stop the compromises before they happen, and spend the money where it can do good, not on supporting "Business As Usual". Cleaning up the mess and containing damage after the fact is exactly wrong: it's attempting to catch the horse after it has bolted.
The massive operational workload, complex network designs and configurations, and sophisticated tools and devices for Intrusion Detection, Prevention and Response are all based on the same flawed concept and reasoning that led to the "Blue Screen of Death" being blithely accepted on desktops and servers for a decade or more. Not on the computers I've run for the last four decades...
This is now a flawed, even counter-productive, attitude:
Software is hard and naturally error-prone. Vendors "do their best", but "nobody is perfect" and "that's what happens"... Faulty Software can never, ever be Secure Software. Operating Systems and Software are the foundations on which everything else is built. Systems, like buildings, cannot be safe or dependable if built on weak or soft foundations.
The Leaning Tower of Pisa is the shining example: nothing can be done to fix the building or prevent its inevitable fall unless and until strong, stable foundations are in place.
The minimum standard for Reliability, Error rates, Security capabilities and Functionality that all commercial Operating Systems and Software should now meet is:
They have to equal or better what can be acquired freely. If free source code is available and is better on important metrics, such as security and reliability, then vendors can simply use it and improve it. If they don't, it can be argued they fail one of the most important tests of Consumer Law: commercial products must be of merchantable quality.

Here's the current reality:
Software is hard, but it can be, and is, designed and built to be robust and secure. That is the minimum acceptable standard today.
The proof is the many different mobile computing platforms that have made mobile devices the most prolific computers around: iOS from Apple, Android from Google and the plethora of "appliances" for storage, networking and communications.
What they have in common: they are POSIX-compatible and their source code is available, either fully or partially.

The many free Operating Systems available boast two important security properties:
- they are provided as source code, allowing concerned users to look for themselves, and
- their designs are solid and robust: vulnerabilities are limited and quickly fixed and distributed.
So if it's out there, free, fully documented and supported, why hasn't every major desktop and server vendor selling to Government and critical industries, such as Law Enforcement, Healthcare and Infrastructure, implemented it?
Why, after a decade, is the NSA's "Security Enhanced" environment (SELinux) not standard on every server and critical desktop? It's not because it's hidden, unavailable or expensive...
So why haven't all Australian Governments set a date for compliance based, as a minimum, on what is freely available? We now know the Federal Government is hugely concerned about cyber-security for itself and the private sector... So what's going on? Ignorance, Trepidation at "Being First", a Failure of Imagination, supporting Vested Interests or Complacency by the relevant bureaucrats are just some of the possibilities that spring to mind.
There is no excuse for any Government Agency to insist on "Industry Standard" systems, tools and applications when cyber-security is now their highest priority and these products are known to be defective and security "sieves", not just slightly flawed.
The security experts within DSD know all this, so why doesn't the National Security Advisor promote and advise "Prevention and Strengthening Systems and Software" as part of the cyber-security response?
It's cheaper, surer, faster and more reliable than first allowing vulnerable software onto servers and desktops, then trying to detect and contain preventable intrusions.
How will this attitude/approach translate to the new world of Cloud Computing? Very, very badly.
Here are some of the other issues I think should be mandated:
- No "binary only" software allowed on Government servers, desktops or devices. This fails the ironclad rule of Trust: if you didn't build it from the ground up, you cannot trust it.
- Ironically, we should consider US-based network suppliers as untrusted as Huawei and other Chinese national vendors if they don't supply the full source code and tool-chains for their systems.
- If hardware vendors don't supply the source code for drivers, they don't get to sell their kit to the government.
- If the complete software for network connected devices, like Printers, Faxes, Copiers, isn't supplied, tested and built by the DSD or the like, they should not be connected to any government network.
- The minimum standard for Operating Systems on business servers and Cloud servers is the NSA's "Security Enhanced" (SELinux) system.
- The minimum acceptable level of errors and vulnerabilities on Desktop Operating Systems and Software is that achieved by free Operating Systems and Software.
- Documents are Read-Only with internal integrity and validation checks.
- Word-processing files are not documents, they are only for the preparation of documents.
- They don't come inherently tamper-proof or tamper-evident.
- Unrelated features like 'macros' are known, massive security holes.
- Complex and security-riddled file formats, like MS-Word, should be banned from all government networks and uses, because a complete, scalable, secure typesetting language is both free and available: LaTeX.
- If you separate Documents and their Preparation, then the internal file-format is irrelevant.
- Version Control of Documents and their preparatory files is critical, just as it is for source code. LaTeX is 100% ASCII text and unable to carry malware. It can also be trivially stored and managed in Version Control systems.
- If authors need to collaborate on documents, this requires a different tool/methodology. Not unlike Google Docs.
- Ban attachments from internal Government email, allow only simple text, no HTML.
- Email is for business communication; potentially each message is read by thousands.
- Most of all, ban large Word "documents" that contain only trivial text, like notices.
- Links to pages/files/documents, not the items themselves, need to be communicated.
- Email systems are good at sending messages, not at storing many copies and variants of files: that belongs on File Servers.
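The "Read-Only documents with internal integrity and validation checks" point above can be sketched with a standard cryptographic hash. A minimal illustration, assuming a detached-digest convention (the file name and digest-beside-document scheme here are my own invented example, not a mandated format):

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify(path: Path, recorded_digest: str) -> bool:
    """Tamper-evidence check: True only while the file still
    matches the digest recorded when it was published."""
    return fingerprint(path) == recorded_digest

# Illustrative use: publish a document together with its digest,
# so any later reader can detect modification.
doc = Path("notice.txt")
doc.write_text("All staff: the car park closes at 6pm.\n")
digest = fingerprint(doc)       # recorded alongside the read-only document
print(verify(doc, digest))      # untouched: check passes

doc.write_text("All staff: the car park closes at 9pm.\n")
print(verify(doc, digest))      # altered: check fails, tampering is evident
```

This gives tamper-evidence, not tamper-proofing: it cannot stop a change, but it makes any change detectable, which is exactly what word-processing files lack out of the box.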
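On the Version Control point: because LaTeX sources are plain ASCII text, an ordinary line diff, the same mechanism every Version Control system is built on, shows exactly what changed between two drafts. A small sketch using Python's standard difflib; the two draft strings and the file name "report.tex" are invented for illustration:

```python
import difflib

# Two drafts of the same plain-text LaTeX source (invented example).
draft_v1 = """\\documentclass{article}
\\begin{document}
The audit found 3 vulnerabilities.
\\end{document}""".splitlines()

draft_v2 = """\\documentclass{article}
\\begin{document}
The audit found 17 vulnerabilities.
\\end{document}""".splitlines()

# A unified diff: the human-readable change record a
# version-control system stores and displays.
diff = list(difflib.unified_diff(draft_v1, draft_v2,
                                 fromfile="report.tex@v1",
                                 tofile="report.tex@v2",
                                 lineterm=""))
for line in diff:
    print(line)
```

Only the changed line appears, prefixed "-" (old) and "+" (new). Try doing that with an opaque binary word-processing file: the diff is meaningless, and so is the audit trail.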
Not only are Australians competent in developing Free and Open Software, what we do is arguably amongst the best in the world:
- Samba, the most secure and robust implementation of the Microsoft file and print protocols, was initially developed in Canberra 20+ years ago by a (then) PhD student, who now leads the worldwide team supporting and developing it.
- IBM has just a few "Linux Technology Centres" in the world. One of them, "OzLabs" to its friends, is in Canberra, staffed by a group including a number of recognised world leaders.
- The first and only Operating System kernel formally verified at the code level, not just the specification, was developed at NICTA/UNSW. Australia is unarguably the leader at the highest level of Software Security possible.
There will be more that I don't know of.

What Australia has, then, is:
- the need for Secure Systems and Software,
- demonstrated capability, resources and achievement,
- widespread talent, training and teaching facilities and a plethora of world-class innovations,
- some of the best people in the world at Software and Security, and
- the robust attitude to go its own way and not be dictated to by foreign powers, especially their commercial arms.
A prevention-first approach built on these strengths would be much stronger and more robust, stopping compromises before they happen.
And it's a technology and process we could sell to other mid-tier Countries around the world.
So why is it missing from the National Security Advisor's agenda and the Gillard Government's National Strategic Plan?