Thirty Years in I.T. Theories, Ideas, Opinions.... Leveraging knowledge of the past to understand now. @SteveJCbr & stevej.cbr@gmail.com

2012/06/23
Horace Dediu of asymco has written two recent posts on Microsoft:
I find Dediu's work outstanding: he provides hard data for the points he argues...

2012/06/20
Cyberwar: paper-tiger or real threat?
Marcus Ranum, renowned IT Security expert, has interesting views on Cyberwar.
It's a lot more nuanced and subtle than "One Big Attack".
Where I diverge: a Big Event is a great distraction for the really 'interesting', subtle actions - and it can be as simple as the first worm, by Morris. Oops, it wasn't meant to do that...
Things you should read or view:
- RSA Conference, 2012, On Cyberwarfare
- Fabius Maximus posts on cyberwar
- You must Be >this< Tall To Play Cyberwar (has DoD grown enough yet?)
- Cyberwar: The Pentagon Cyberstrategy
- Cyberwar: About Stuxnet, the next generation of warfare?
- Congress Authorizes Pentagon to Wage Internet War [Wired. Ryan Singel]
- M.R. on rearguard-security: Cyberwar
NBN, stuxnet and Security: It's worse than you can believe
What did US Intelligence tell the Australian Government about Real Network Security when a Chinese vendor was vetoed as supplier of NBN (central?) switches?
Now that we have Obama admitting "we did Stuxnet, with a little help", we know that they aren't just capable and active, but aware of higher-level attacks and defences: you never admit to your highest-level capability.
Yesterday I read two pieces that gave me pause: the first, the US Navy replacing Windows with Linux for an armed drone, was hopeful; the other should frighten anyone who understands Security: there is now a market in zero-day vulnerabilities.
The things the new world of the NBN has to protect us against just got a lot worse than you can imagine.
2012/06/18
NBN: Will Apple's Next Big Thing "Break the Internet" as we know it?
Will Apple, in 2013, release its next Game Changer for Television following on from the iPod, iPhone, and iPad?
If they do, will that break the Internet as we know it when 50-250 million people try to stream a World Cup final?
Nobody can supply Terabit server links, let alone afford them. To reinvent watching TV, Apple has to reinvent its distribution over the Internet.
The surprising thing is that we were first on the cusp of wide-scale "Video-on-Demand" in 1993.
Can we, twenty years later, get there this time?
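A rough back-of-envelope sketch of why naive unicast streaming breaks: the viewer count and per-stream bitrate below are assumed figures for illustration, not sourced numbers.

```python
# Back-of-envelope: aggregate bandwidth for naive unicast streaming.
# Both figures below are assumptions for illustration only.
viewers = 100_000_000        # 100 million simultaneous streams (assumed)
bitrate_mbps = 5             # ~5 Mbit/s per HD stream (assumed)

total_mbps = viewers * bitrate_mbps
total_tbps = total_mbps / 1_000_000     # Mbit/s -> Tbit/s

print(f"Aggregate demand: {total_tbps:,.0f} Tbit/s")
# -> Aggregate demand: 500 Tbit/s
# Even a "Terabit" uplink is hundreds of times too small, which is why any
# reinvention of TV distribution needs multicast, CDNs or peer-assisted
# delivery rather than a single origin server.
```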
2012/06/17
NBN: Needed for "Smart Grid" and other New Century Industries
With the release by IBM of "Australia's Digital Future to 2050" by Phil Ruthven of IBISWorld, there is now some very good modelling to support the claim that "The Internet Changes Everything", with some industry bulwarks of the past set to disappear or radically shrink and others, "New Century Industries" (my words) that don't yet exist at scale, coming to the fore of our economy.
Previous pieces that link the NBN/Smart-Internet with "Negawatt" programs are now more relevant:
test of gdocs
Groklaw's "Curing the Problem of Software Patents"
Link on Sites... "Curing the Problem of Software Patents"
2012/06/13
What is Software? Why does it matter?
Software is the stuff that runs on computers.
It's invisible, intangible and isn't even 'isomorphic': exactly one form for one thing. There's an infinite number of representations of a single function, program or thing, producing the identical result.
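A minimal sketch of that point (the function is my own illustrative choice): three visibly different pieces of code, one identical result.

```python
# Three different representations of the same function: the sum of 1..n.
def sum_loop(n):
    total = 0
    for i in range(1, n + 1):   # iterative form
        total += i
    return total

def sum_recursive(n):
    return 0 if n == 0 else n + sum_recursive(n - 1)   # recursive form

def sum_formula(n):
    return n * (n + 1) // 2     # closed-form arithmetic

# All three produce identical results for every input tried:
assert all(sum_loop(n) == sum_recursive(n) == sum_formula(n) for n in range(200))
```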
Computers are active cognitive processing engines.
They can be electronic, mechanical, relays or "wetware" like brains.
If you don't know what you're working on and with, then:
- You can't answer: What is it that we Do?
- Nor can you talk about: What's the best way to do what we do? How do we assess results?
- Nor: What is unique to what we do Professionally? What is the stuff that is ours alone?
The wonderful Pamela Jones of Groklaw persuaded a very insightful Intellectual Property (I.P.) expert, Michael Risch, to contribute a piece: "Curing the Problem of Software Patents".
I attempted a contribution countering the notion that "Software is Mathematics" and it was lost in the noise.
I can't demand respect for having practiced I.T. for longer than 50-75% of the respondents have been alive, nor quote my "hours of experience" like pilots or enumerate the really big, really tough systems I've built, saved or worked on.
Nor can I meaningfully demonstrate the breadth and depth of my Professional Library,
nor, like professional pilots, show my type-certifications and specialist training and ratings.
On the Wild Wild Web, my experience and insight are lost because the I.T. Profession has no insight into itself, nor does it engage in any systemic introspection. If you've written a 100-line Visual Basic or C programme, you're qualified to espouse your opinion on all things Software...
Discussing "Software Patents", or the Intellectual Property rights over Software, is a deep and complex issue. One that engaged the courts for over two decades of its 50 year history. Around 1970, Thompson and Ritchie of Bell Labs patented the "Set UID bit" of the first Unix system [and placed the patent in the public domain when issued.] This was before there was a clear acceptance of Software Patents.
But why does nobody comment that the very stuff in question, "Software", isn't just not well-defined, there is no commonly accepted definition and description.
My thesis is that Software cannot be adequately covered by existing I.P. regimes, because it is something completely new and different.
Computers were described in the early days as "Thinking Machines", capitalised.
They do execute the cognitive steps encoded in software; they don't yet "Think" as people might, but they do small-t 'think'.
This is my hierarchy of the I.P. protections we have now:
- Things we make, artefacts. [Trade Marks and Registered Designs]
- a Creative Expression, potentially copyable, more than simple artefacts. [Copyright]
- Processes, Designs and Methods for creating 'stuff'. [Patents]
Software has elements of each one of these:
- it appears as an artefact, such as on an optical disk or USB drive, as a string of bits that can be executed directly on a real or virtual piece of hardware. The binary form.
- It appears as copyrightable source code, a creative expression of a person or team.
- The essential ideas and innovations can be described and enumerated in a Patent Specification.
But Software is only software when it is running, when it is a series of cognitive steps being executed on a processing engine.
Stored on disk, it is only latent: the string of bits could represent anything, could be interpreted or executed in many distinct ways.
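A small illustration of that latency (the byte string is an invented example): the same bits mean entirely different things depending on how an engine chooses to interpret them.

```python
# The same string of bits, interpreted three different ways.
raw = bytes([72, 101, 108, 108, 111, 33, 0, 1])

as_text    = raw[:6].decode("ascii")                         # "Hello!"
as_integer = int.from_bytes(raw, byteorder="big")            # one large number
as_pairs   = [(raw[i], raw[i + 1]) for i in range(0, 8, 2)]  # could be opcode/operand pairs

print(as_text, as_integer, as_pairs)
# Stored bits are only latent: nothing in the bytes themselves says which of
# these interpretations (or which machine's instruction set) applies.
```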
Software does not arrive out of the blue: it results from conscious, directed cognitive effort.
The classic Software Engineering process includes meta-layers and processes:
- Business Case or System Description
- Requirements
- Functional Specification
- Detailed Software Design
- the source code, libraries and tool-chains to create it,
- the functional test suites
- Debugging tools and test/acceptance teams
- Version Control, Release Management and Deployment systems.
- Defect, request and upgrade tracking systems
- Project Management systems
- Quality and Performance management support.
One of the essential characteristics of modern computers is they are General Purpose Computers.
They are empty vessels capable of solving an infinite array of Problems, waiting for specific Software to be loaded and run, making them for that time perform defined cognitive tasks.
The major advance, circa the 1940s, was the "Stored Program".
The CPU (Central Processing Unit) was capable of executing 'instructions', or micro cognitive steps, but prior to the "Stored Program", only what was fixed at construction time.
Storing "executable code" in memory, treating it like loadable and modifiable data, created a completely New Thing on the planet:
a constructed machine that was a Programmable General Purpose Computer.
"software" can't be disassociated from the platform that it runs on.
And importantly, it only exists when running, the dynamic execution of cognitive steps by an engine marks what it is, nothing else.
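A toy sketch of the Stored Program idea (the miniature instruction set is invented purely for illustration): the "program" is ordinary data held in memory, and the same engine executes whatever is loaded.

```python
# A toy stored-program machine: the "program" is ordinary data in memory.
def run(program, x):
    """Execute a list of (opcode, operand) pairs against an accumulator."""
    acc = x
    for op, arg in program:      # the fetch-decode-execute loop
        if op == "ADD":
            acc += arg
        elif op == "MUL":
            acc *= arg
        else:
            raise ValueError(f"unknown opcode {op!r}")
    return acc

double_then_add_three = [("MUL", 2), ("ADD", 3)]     # stored program #1
scale_by_five_less_five = [("MUL", 5), ("ADD", -5)]  # stored program #2

print(run(double_then_add_three, 10))    # 23
print(run(scale_by_five_less_five, 10))  # 45
# Swap the data held in memory and the same general-purpose engine performs a
# completely different cognitive task -- nothing is fixed at construction time.
```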
The 3 basic building blocks of Software written in imperative (procedural), not declarative, languages are:
- sequential statements [mathematical calculations and other actions]
- repetitions and loops [actions]
- conditionals [predicate logic]
These are constructs unique to Software; they are not Logic or Mathematics.
Alan Kay described writing software as "building a convincing proof".
But it is more than just proving a hypothesis; it's a result that goes further:
It combines Logic, Actions and Algorithms.
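A compact sketch combining all three building blocks in one small unit of software (the task, counting primes, is my own choice of example):

```python
# Sequence, iteration and conditionals combined in one small unit of software.
def count_primes_below(limit):
    count = 0                                 # sequential statement
    for candidate in range(2, limit):         # repetition / loop
        is_prime = True
        for divisor in range(2, int(candidate ** 0.5) + 1):   # nested loop
            if candidate % divisor == 0:      # conditional (predicate logic)
                is_prime = False
                break
        if is_prime:                          # conditional acting on the result
            count += 1
    return count

print(count_primes_below(30))   # 10 primes: 2, 3, 5, 7, 11, 13, 17, 19, 23, 29
```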
The next breakthrough was I/O (Input/Output).
Whilst a CPU could process cognitive steps stored in memory and solve not just an individual problem but all problems of a class, this was of no use without being able to communicate the results, or to have the problem input communicated to it.
The four elements described, CPU, Memory (RAM), Persistent Storage and I/O, are common and necessary characteristics of the platform on which Software may run.
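A trivial sketch that exercises all four elements (the log file name is an arbitrary placeholder):

```python
# A trivial program that touches all four platform elements:
# CPU (the computation), Memory/RAM (in-memory variables),
# Persistent Storage (a file on disk) and I/O (reading input, printing output).
# The file name below is an arbitrary placeholder.

line = input("Enter some numbers separated by spaces: ")    # I/O: input
numbers = [float(token) for token in line.split()]          # RAM: data held in memory
average = sum(numbers) / len(numbers) if numbers else 0.0   # CPU: the computation

with open("averages.log", "a") as log:                      # Persistent Storage
    log.write(f"{average}\n")

print(f"Average: {average}")                                # I/O: output
```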
Software has the potential to embody any and all cognitive tasks of a human mind that we can precisely describe. Is there any reason it can't Learn and Adapt, fully emulating a human brain?
Already the best chess players are embodied in software. There are symbolic solvers for maths and logic problems, and automated reasoning solving Formal Proofs of problems (like Software itself).
There is a confusion between using Mathematical/Logical proofs to show that a specific piece of Software (often the source code) on a specific platform meets a Formal Specification, and concluding from this that Software must therefore be a Mathematical construct.
I posit that Software is composed of cognitive actions and a mathematical/logical proof can be derived to show it conforms to a specification of actions.
Software can be self-modifying, the trivial current example is "on-the-fly compilation".
Mathematical Algorithms are not.
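A minimal sketch of on-the-fly compilation (a crude stand-in for real JIT compilers, which work at the machine-code level): the program builds new source as data, compiles it and runs it within itself.

```python
# Software generating, compiling and running new software at run time.
# A crude stand-in for "on-the-fly compilation"; real JITs work at the
# machine-code level, but the principle is the same.

def make_scaler(factor):
    source = f"def scale(x):\n    return x * {factor}\n"   # new source, built as data
    code = compile(source, "<generated>", "exec")          # compiled at run time
    namespace = {}
    exec(code, namespace)                                   # loaded into the running program
    return namespace["scale"]

triple = make_scaler(3)      # a function that did not exist when the program started
print(triple(14))            # 42
```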
Software can Learn and Adapt in exactly the same way that the human mind does.
This is not Consciousness. That is a whole other discussion.
Once someone has a precise, testable definition of that, perhaps it can be embodied in Software.
The Turing Test addresses this problem, roughly as "If it walks like a duck, quacks like a duck, ..., it's a duck" [or person].
This is an extremely powerful and deep point. Nothing similar applies to any branch of Mathematics, Logic or reasoning:
The Turing Test explicitly says we cannot tell the difference between Software executing on a machine platform and a human mind. All we can evaluate is responses and executing Software can be indistinguishable from a living person.
Importantly, a piece of Software cannot directly determine what platform it is being run upon: whether it is being run directly on hardware or emulated, being interpreted, or whether a human is carefully and slowly stepping through the code, examining and understanding it.
The field of Artificial Intelligence (AI) has for over four decades attempted to understand the human brain in finer and finer detail and to build Software for electronic compute platforms that precisely emulates those human behaviours, including Learning and Adaptation.
Any theory of Software must allow for self-modifying and evolving code.
Software isn't a branch of Mathematics and Logic. It is far, far more.
- It's dynamic, not a static proof or algorithm.
- It's invisible and intangible, and infinite representations exist for the same functionality.
- It performs I/O, stores results and changes its behaviour.
- It interacts with the Real World and suffers 'bit rot'.
- Software is a sequence of cognitive actions, not a proof or algorithm on a page.
My objection to Software Patents is two-fold:
- Software is enumerated cognitive actions. If you can't Patent Ideas, why can you patent thought processes that lead to ideas or solve problems?
- It's at a level above the Designs etc. embodied in Patents. It's a completely different entity.
More to follow...
2012/06/05
Cyberwar: Bush/Obama authorised Stuxnet
We've crossed an Internet Security Rubicon: the USA admits to combined cyber-attack operations with Israel against Iran's nuclear enrichment program. [NY Times]
The Washington Post's "Zero Day" series says a lot more.
It's a very important event when a government goes public with its most secret security or intelligence programs: it took over four decades after WWII (and the 'Spycatcher' court case) for news of just part of the Allied SIGINT activities to become public.
The work of Bletchley Park, the home of Alan Turing's biggest contribution, was kept secret to the point of allowing mass casualties rather than giving it away.
The only reason I can think of for Obama to publicise the USA's active, and successful, practice of cyber-attack is that they think they've developed protections against it.