Murdoch, Google and Microsoft

Rupert Murdoch's anti-Google strategy seems to be becoming clear. Previously, I was puzzled at his public stance; it seemed to make no sense.

This Register piece, and there are many other sources, starts with:
Rupert Murdoch is in talks with Microsoft over his plans to delist his newspaper websites from Google.
Classic "My enemy's enemy is my Friend" thinking.
It's not really so: they are at best temporary allies with a fragile common cause.

The danger lies in your "friend" deciding they don't need you any more or, worse, turning on you once the main enemy is gone or the battle is lost.

I still think Murdoch's strategy is seriously off the mark.


I.T. Failure == Corporate Failure

Stephen Bartholomeusz writing in Business Spectator, 18 Nov 2009, on the ASIC court case over the collapse of One.Tel.

Bartholomeusz neatly summarises the root cause of the failure:
Unhappily, its billing systems didn’t work, so it piled up debtors, while its competitors responded to the cut-price strategies.
He goes on to say:
While professing publicly that the group was on-track to be cash-positive..., internally One.Tel appears to have had little control or understanding of its cash flows or the mounting issues created by its billing system
and finishes:
Whatever Rich might claim, One.Tel wasn't a successful company, unless success is measured by revenue, not cash flows or execution.

This is the first case I've noticed where the immediate cause of failure of a large, public company has been its I.T. systems. The root cause is poor management with an inability to execute - or to understand and control its I.T.

The field of "Software Engineering" is 40 years old now.
How could this foreseeable and preventable failure have happened with competent professionals, especially if Software Engineering had achieved its aims?

There is a multiple tragedy hidden here:
  • Software Engineering has failed to impress its primary market: Business Management.
  • Educators and Researchers are not, as a matter of course, going to analyse this failure and use it as a case study. Compare the 1974 explosion at Flixborough or the 1970 collapse of the Westgate bridge during construction.
  • IT practitioners aren't going to be informed by their Professional Societies of the causes, or of how to prevent a recurrence.
  • Business Management and I.T. practice remains "Consequence Free".
If a Billion Dollar Failure isn't a notable event and worthy of preventing recurrence, then what is?
Why are ASIC, the ASX and the Federal and State Governments silent on this point?
If not their job, then whose?

Imagine if QANTAS had a fire at a maintenance facility and lost $1B of buildings, plant and equipment. You know absolutely the company, multiple regulators and all the professional bodies would actively investigate the matter.

They would be looking for "root causes" of this event, other problems, ways to fix the system, and processes & procedures to prevent or early-detect this class of problem - and, coincidentally, whether any individuals were responsible. Not just front-line grunts, but whether anyone in management (up to the CEO) was culpable, negligent, incompetent or asleep-at-the-wheel.

The absolute tragedy here is not the loss to the investors (employees, vendors, customers, ...) but that nothing is going to change - that this massive loss bought nothing.

What is more galling to me is that nobody in the Press, Government, ASX, Investment bodies, Judiciary or Regulators thinks anything more could or should be done...

Microsoft Troubles - VI, First Words

"On the latter, Microsoft is hoping Windows 7 will pull it out of a financial hole" by Charles Arthur, Sydney Morning Herald, Nov 5, 2009. First time I've seen the actual words "financial hole" used of Microsoft in the popular press.

Previous piece - "Microsoft Troubles V"


Why Yet Another ReOrganisation won't improve the Public Service

The Rt. Hon. Kevin Rudd PM has suggested on the News that he'll be seeking to improve the Federal Public Service. There's talk of a special Centre at the ANU to train people up too.

Rudd might end up with a bunch of tests, metrics and new programs & processes, but I can guarantee it won't amount to a hill o' beans. The one thing known about Bureaucracies is their ability to Resist Change.

Read C. N. Parkinson ("Parkinson's Law" etc.) for a view from the 1950s and some definitive economic analysis of the ultimate Bureaucracy: the British Admiralty. After WWI, ships and fighting men - the essence of the Navy - declined dramatically, while the Bureaucracy 'running' them grew overwhelmingly...

Why? Because the primary purpose of Bureaucracies is themselves, not producing outcomes.

The 38 episodes of the BBC's "Yes Minister" from the 1980's are a timeless tutorial for budding bureaucrats, Public or Private sector.

Wind the clock back 100 years to when Gilbreth and Taylor pioneered "Scientific Management" - in today's language, "Evidence Based Management": sound, practical and carefully researched approaches that produced outstanding results while treating employees as Real People. It is notable both for not being embraced and for being mischaracterised as 'evil'. Taylor's book is on-line and a revelation to anyone willing to read a little.

Parkinson stated his First Law in 1957 as:
Work expands so as to fill the time available for its completion.

His 1960 Second Law stated:
Expenditure rises to meet income.
That might be an interesting topic for Dr Ken Henry, Secretary of the Treasury, to address in his Review of the Taxation System...

Have either of these hypotheses been disproved in the last 5 decades?

In 1970, Parkinson published a 10-year follow-up on his work. The book had sold impressively, but had anything changed? Not discernibly.

Has Rudd proposed anything that recognises or addresses the central problems?
Not in public...

The people who rise to the top of Bureaucracies, Public or Private sector, are very good at what they do. This is exactly why we had the Enron, Global Crossing, etc. collapses around 2000 and, within the decade, the Sub-Prime Meltdown and GFC. Senior Bureaucrats overwhelmingly look after themselves and their own positions, not their stakeholders. It's the inverse of Fiduciary Duty.

The trouble is what they do, not finding & training highly competent people.
There's over a century of ignored Management Science to prove the point.

What would work?

Consequences, direct and personal, for poor performance.
This needs two separate parts: an expert, independent investigator and a powerful compliance & enforcement body. Behind this needs to be a repository of Known Faults, Failures and Errors. Repeating a Known Problem without Consequence is the antithesis of Good Governance.

The Government knows very well how to do this and exactly what would be required: it already does precisely this for Aviation, via the ATSB (formerly BASI) and CASA (formerly the CAA).

It'd cost about the same to run as the National Audit Office; indeed, the Investigator would be the ANAO.
The compliance and enforcement powers, and the body to enact them, already exist in the FMAA - Financial Management and Accountability Act. s44 states:
(1) A Chief Executive must manage the affairs of the Agency in a way that promotes proper use of the Commonwealth resources for which the Chief Executive is responsible.

"proper use" means efficient, effective and ethical use that is not inconsistent with the policies of the Commonwealth.
All the machinery is there...

Will politicians actually hold their Senior Bureaucrats to Account?


To do so would deprive pollies of their core modus operandi:
arbitrary and capricious decisions and policy changes.

There cannot be lasting Bureaucratic Reform without Political Reform.

But we'll have a lot of "colour and movement", many press releases & much back-slapping, and spend a lot of money achieving precisely nothing...

Here's something from the end of 2007 which has never been acknowledged...


When 'ping' fails

In Networking, the 'performance objects' are links, usually end-to-end, consisting of many elements - like ethernet segments, switches, routers/firewalls, long-distance circuits and security scanner devices, laid on top of Telco/backbone services that provide dynamic and asymmetrical routes.

The most frequently used measure is 'ping' time - ICMP “echo request” packets, often 64 bytes long.
Firewalls, routers and underlying networks filter/block classes of traffic, implement traffic & flow rules and attempt "Quality of Service". There are many common rulesets in use:
  • blocking 'ping' outright for 'security reasons'. Stops trivial network scanning.
  • QoS settings for different traffic classes. VoIP/SIP gets high priority, FTP traffic is low, your guess on HTTP, DNS, time (NTP) and command line access - SSH, Telnet, ...
  • traffic profiling on IP type (ICMP, UDP, TCP) and packet size (vs link MTU).
  • traffic prioritisation based on source/destination.
Real-time voice needs many small packets, wants low latency and low jitter, and can stand some packet loss or data errors. FTP relies on TCP to detect lost packets & retransmit them. It likes big packets and attempts to increase the bandwidth consumed through TCP 'slow-start' etc.

The only time 'ping' is accurate is within a simple ethernet segment - no rules, no firewalls, no 'traffic engineering', no link losses, no collisions, ...
Otherwise, it's a dangerous and potentially very misleading measure.

'Time of Flight' for UDP & ICMP packets is only measurable when you control both ends of the link and can match headers. Not something most people can or want to do.

TCP packets can be accurately timed - sessions are well identified, packets can be uniquely identified and they are individually acknowledged. It is possible to accurately and definitively monitor & measure round-trip times and effective throughput (raw & corrected) of links and connections at any point TCP packets are inspected - host, router, firewall, MPLS end-point, ...
I'm not aware of this being widely used in practice, but that may just be a gap in my knowledge.
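The core idea can be sketched in a few lines of Python. This is a toy illustration, not a real capture tool: the packet records are invented (timestamp, kind, number) tuples standing in for a real trace, and a production matcher would also have to handle SACK, sequence wraparound and retransmission ambiguity.

```python
# Toy sketch: deriving round-trip times by matching outbound TCP segments
# against the cumulative ACKs that cover them. Packet records are synthetic.

def tcp_rtts(packets):
    """packets: iterable of (time_s, 'seq'|'ack', number).
    'seq' notes the last byte number of an outbound segment;
    'ack' notes the cumulative ACK number seen inbound.
    Returns [(last_byte, rtt_seconds), ...]."""
    in_flight = {}                           # last byte number -> first send time
    rtts = []
    for t, kind, num in sorted(packets):
        if kind == 'seq':
            in_flight.setdefault(num, t)     # retransmits keep the original time
        else:
            # a cumulative ACK covers every outstanding segment ending below it
            for last_byte in sorted(b for b in in_flight if b < num):
                rtts.append((last_byte, round(t - in_flight.pop(last_byte), 3)))
    return rtts

# Two segments, ACKed 0.25s and 0.31s after they were first sent:
trace = [(0.00, 'seq', 1460), (0.05, 'seq', 2920),
         (0.25, 'ack', 1461), (0.36, 'ack', 2921)]
print(tcp_rtts(trace))    # [(1460, 0.25), (2920, 0.31)]
```

Anywhere these pairings can be made - host, router, firewall - the same bookkeeping yields per-connection RTT and throughput without sending a single extra probe packet.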

This is not a tutorial in Networking or TCP/IP.
Neither am I a Networking expert. I'm demonstrating that even with my level of knowledge, "tools ain't tools" (not all tools and methods are equal),
and that using just two metrics, 'bandwidth' & 'latency', to characterise links is simplistic and fraught. As professionals, we have to be able to "dig under the covers" to diagnose & fix subtle faults.

Consider the case of TCP/IP over a noisy satellite phone link, the type you buy from Intelsat for ships or remote areas. The service notionally delivers a data channel of 64kbps, but is optimised for a digital voice circuit, not IP packets. The end-to-end link has low per-bit latency on/off the link, long transmission delays, nil jitter and limited error-correction (forward-error-correction (FEC), no retransmits) - nice for telephony. These links also have buckets of errors - which voice, especially simple PCM, easily tolerates; errors can even be smoothed or interpolated out by simple equipment, which will be there anyway to handle echo cancellation.

Say you're on a ship at the end of one of these links.
People are complaining that email 'takes forever' and they can't download web pages.
You run a ping test out to various points - and everything is Just Fine.

What next?

The most usual response is 'Blame the Victim' and declare the link to be fine and 'the problem' to be too many people sending large messages, 'clogging up the link', and too much web surfing. You might set quotas and delete large emails. That might work, or at least improve things marginally.

Radio links, especially back-to-back ones, each crossing 36,000km to geostationary satellites, are noisy.
If the BER is 1:100,000 (1:10^5, worse than the hoped-for 1:10^6) and you're using the default ethernet MTU of 1500 bytes, you'll get a bit error every 1-2 seconds. No worries, eh?
At 64kbps, 1500 bytes = 12,000 bits gives 5.3 packets/second - so roughly 1 in 10 packets carries an error. Hardly noticeable.
TCP/IP has a minimum overhead of 40 bytes/packet (less with Van Jacobson header compression).
The data payload per packet is 1460 bytes for the default ethernet MTU.

Sending a 1.25MB file (10Mbit), that's ~856 raw packets and ~103 errors, or ~12% retransmissions. Of those 103 resends, ~12 get errors as well and are resent, and ~1.5 of those go on to a 3rd round...
Or ~115 errored packets - ~14% errors on raw packets. Just a minor problem.
(A packet can suffer more than one bit error; the simple percentages above ignore that, slightly overstating the damage.)
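The arithmetic can be checked with a few lines of Python. This is a sketch under the simplest possible model - bit errors independent, every errored packet resent whole, no TCP window effects - not a simulation of any real TCP stack; the 64kbps rate and 40-byte header are the figures from the text.

```python
import math

def link_stats(ber, mtu_bytes, file_bits=10_000_000, rate_bps=64_000, hdr_bytes=40):
    """Expected TCP transfer behaviour over a noisy link, assuming independent
    bit errors and whole-packet retransmission of any errored packet."""
    bits = mtu_bytes * 8
    p_err = 1 - (1 - ber) ** bits                  # P(packet has >= 1 bit error)
    payload_bits = (mtu_bytes - hdr_bytes) * 8
    raw = math.ceil(file_bits / payload_bits)      # packets needed on a clean link
    total = raw / (1 - p_err)                      # geometric series over resend rounds
    secs = total * bits / rate_bps
    return p_err, total - raw, secs, file_bits / secs / 1000   # goodput in kbps

for ber in (1e-5, 2e-5, 4e-5):
    p, resends, secs, kbps = link_stats(ber, 1500)
    print(f"BER {ber:g}: {p:4.0%} per-packet error rate, "
          f"{resends:3.0f} resends, {secs:3.0f}s, {kbps:4.1f}kbps")
```

At BER 1:10^5 this gives ~11% per-packet errors, ~109 resends, ~181 seconds and ~55kbps - in line with the figures above.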

The effective bandwidth (throughput) of the link, using ~970 × 1500-byte packets to move 1.25MB in ~182 seconds, is ~55kbps. Quite acceptable.

What if there's a corroded connector or tired tracking gimbal and you get a 3dB change in SNR and the BER doubles? (That's a guess, not science.)
Doubling the BER takes the per-packet error rate to ~21% (not quite double, since a packet with two errors still only counts once): 856 raw packets, ~183 first-round errors, then ~39, ~8 and ~2 on later rounds - ~232 resends, about 27% more packets in total. Roughly 204 seconds and 49kbps.
Doubling the BER again (4:10^5, or 0.004%) lifts the per-packet error rate to ~38% - ~326 first-round errors and ~528 retransmits over all rounds. About 260 seconds and 38.5kbps: roughly 70% of the clean-link throughput.

Back at the original BER, if you were running 'jumbo frames' (9000 bytes) locally & these went down the link as-is, you'd get 0.9 packets/sec and each frame would have a ~51% chance of containing an error - only about half get through unscathed. 140 'jumbo frames' are sent raw; ~288 transmissions are needed in all.
The file takes ~324 seconds at ~31kbps - a hefty penalty for forgetting to configure a switch.

The problem is that packet size amplifies error rates.
A change in BER from 0.001% to 0.004%, undetectable by the human ear on a voice circuit, cuts TCP/IP throughput by roughly a third.
Using an MTU of 168 bytes (128 data + 40 TCP/IP overhead) still gives reasonable performance at a BER of 1:10^4, trading ~25% protocol overhead for link robustness.
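Under the same simple model (independent bit errors, whole-packet retransmits - my assumption, not a TCP simulation), the MTU trade-off can be swept directly:

```python
def goodput_kbps(ber, mtu_bytes, rate_bps=64_000, hdr_bytes=40):
    """Expected goodput = link rate x payload fraction x P(packet arrives clean),
    assuming independent bit errors and whole-packet retransmission."""
    p_clean = (1 - ber) ** (mtu_bytes * 8)           # no bit errors in the packet
    payload_fraction = (mtu_bytes - hdr_bytes) / mtu_bytes
    return rate_bps * payload_fraction * p_clean / 1000

for mtu in (168, 576, 1500, 9000):
    print(f"MTU {mtu:5d}: {goodput_kbps(1e-4, mtu):5.1f} kbps at BER 1:10^4")
```

At this BER the tiny 168-byte MTU delivers roughly double the goodput of the standard 1500-byte MTU (~43 vs ~19 kbps), while jumbo frames deliver essentially nothing - the 25% header overhead is a bargain.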

'ping', using its default 64-byte (512-bit) packets, will almost never see an error - about one probe in 200 at this BER.
But who'd think the MTU was a problem when standard diagnostics were reporting 'all clear'?

Summary: in difficult conditions, the BER doesn't have to worsen much for link throughput to degrade significantly.

This example is about simple-minded tools and drawing incorrect conclusions.
The Latency of the link was constant, but the Effective Bandwidth, throughput, changed because of noise or link errors.

Surely that proves the Myth: Latency & Bandwidth are unrelated.

Nope, it proves that link speed and throughput bear a complex relationship to each other.

If you had been measuring TCP/IP statistics (throughput & round-trip-time) at the outbound router, or using 'ping' with MTU of 1500, you'd have seen the average latency rising as throughput dropped. All those link-layer errors & subsequent retransmits were causing packets to take longer.

But a simple low-bandwidth radio link isn't the whole story.
It's a "thin, long pipe" in some parlances.
What was special about that link was:
  • no link contention, data rate was guaranteed transfer rate.
  • synchronous data


Rupert Murdoch - Fool or Genius?

Does Rupert Murdoch know something the rest of us don't?
The recent news is that News Ltd would start charging for on-line access to its newspapers.

Not a good idea.
Experienced journalist & commentator Alan Kohler also thinks so...

First, Rupert is not in the business of selling 'news', quality journalism or not.
He sells Advertising.
Just like Google and friends. But apparently not nearly as well as they do on-line.

There are people who sell 'news', and they are going strong.
Organisations like Reuters, Associated Press, Bloomberg, AAP, ... The wire-services.
The same ones that sell to Google, businesses, TV and Mr Murdoch's newspapers.

News Ltd doesn't sell journalistic content (news): like every major newspaper, it has always given away its content. Exactly the same as Free-to-Air radio and TV.

The "value proposition" to most newspaper customers, News & Stories, is a Free Good.
Major papers actually cost their publishers money to sell. Newsagents typically keep the full "cover price" of the local major papers. Perhaps this is why Fairfax Ltd lists "Newsprint and Ink" as its single biggest expense.

Publishers make their money from the advertising they sell (Classified and 'Display' or general).
They set their advertising rates on the estimated number of readers - not copies sold/distributed. (There's a whole industry 'auditing' circulation & readership).

Small advertisers will always be 'price-takers', while the large regular advertisers can negotiate.

On-line breaks many/all the Newspaper assumptions:
  • no intermediaries with good 'passing trade' to find customers
  • exact counts, not estimates, of readership
  • exact counts of advertiser hit-rates (count links followed)
  • targeted/niche audiences, not "broad spectrum" mass market

Browsing the 1200+ entries for 'newspapers' in the Australian Yellow Pages, these groupings seem apparent:
Business, Trade & Industry, Lifestyle, Sports, Political, Special Interest, Community/Local, Regional & Rural, Ethnic, Language and Religious,
and versions of the "Trading Post".
A newspaper without content, pure advertising: the ideal for the business side of newspapers.

Second, a long time ago newspapers were the source for capital-N News - timely, important, factual.
They broke stories, 'scooped' one another, had many editions during the day and dealt in "the facts ma'am, just the facts", as Joe Friday might say.
The sort of thing shown in 1930's Black and White movies.

By the 1970's, newspapers had comprehensively lost the race as the first news source.
"Watergate" showed they could still 'scoop' other media with investigative journalism, but the Vietnam War played out on the nightly TV news.

When Ted Turner started CNN, the game changed - Free-to-Air was usurped.
The 1991 Gulf War had CNN "reporting live from Baghdad", and CNN assumed the mantle of "first news source".

These days it is a tussle between Cable News and on-line services to be "first".
And that race has always led to problems with accuracy and false/fabricated stories.

An editor who is under pressure "to be first" can be manipulated into publishing without good fact-checking. When rolling the presses involved considerable effort & expense, the downside ensured more caution. In the on-line world, nearly all barriers to production are eliminated and publication is "instant". An editorial mistake is much more likely and potentially much more damaging to a large publisher, the Drudge Report notwithstanding.

Newspapers have been providing Opinion & Analysis for a couple of decades.
Any pretence they are cutting edge or breaking stories in real-time is a "fool's paradise" and delusional.

News Ltd has great content produced by many great people and serves a faithful cohort of consumers. It just isn't 'news' they are selling.

Third, there are many good free alternatives for news, on-line and not, to newspapers.

Google pays for wire-services and gives away the content.
Publicly funded media - radio, TV and on-line - have a mandate to provide services with public monies. The BBC and Australian ABC have large news rooms and international reporters.
The ABC alone has 700 people in its News Division providing current content for all its outlets.

How does a newspaper, with at best 300 journalists, compete with a better resourced competitor whose content is free?

Not on news - only with other types of content and other incentives, like DVDs and special offers...

Fourth, there are just 3 workable Revenue Models.
The two revenue sources are:
reader payments (subscription/donations or cover-price) and advertising.
(pre-paid vs 2-part charging)

Each source can be present or absent, giving 4 possible Revenue Models.
Fully free - neither source - can't support itself, so only 3 models are workable.

Murdoch is complaining the Revenue Model that has worked well in the physical world for approaching a century doesn't work on the Internet. Who'd have thought?!

The wire-services thrive and on-line advertising is booming.
The only people out of step are the firms running newspapers.

They could have acted in 1995 to move their advertising on-line, but didn't.
Were they blinkered or lacked 'vision'?
Was it a sound business decision based in part on not cannibalising their main cash flows?

Things are how they are...
There seems little to be gained from now analysing the reasons for non-action.

There are other issues that have to be resolved when moving to on-line services.
  • Paper is simple and always "Just Works", modulo getting wet.
    Attempts to stream printed news electronically have not been widely successful outside of offices. Radio serves the travelling public well. Printed media is cheap, available and can be forgotten without dire consequences. Some section of the population may read the news on their Kindle or iPhone on the morning commute, but it won't be a large audience.
    Neither will there be much call for $10 newspapers...

  • Serving "The Diaspora": An important function of newspapers is allowing non-resident locals to "keep in touch with home". Australia shows that people may permanently emigrate and never return home, but still identify strongly with their country of origin. This fuels our strong ethnic newspapers. For people who've only moved towns, a daily or weekly "fix" of their hometown newspapers fills a strong need. They even pay a premium.

  • Niche buyers. Most buyers throw most of a newspaper away. They are very specific & selective in their needs and uses of the massive content provided. There are better ways to serve many of those niches on-line: classified advertising, for example, is better served by eBay and the 'Trading Post'. It's fast, current and cheap - plus very efficient for the reader. The service does the searching, and the reader can be contacting a seller within minutes of loading the site.

  • Network effects and the tipping point. When a product has reached around 40% market penetration, it 'suddenly' becomes popular and quickly saturates the market. This happened in 1984 with Group-3 fax and then around 1996 with The Internet/World Wide Web. Newspapers need to be keenly aware of their competitors - when the end comes, it may be frighteningly fast.

  • Copyright and Libel Laws. The journalists' union has spent a very long time negotiating what rights the publisher & content-creator have. This all has to be done again in an on-line world. The other side of the coin is commercial protection of journalists against Libel or defamation actions. The publisher wears the risk once the editor decides to print. Those named know that a newspaper can afford to, and will, defend itself. If journalists are personally exposed to litigation, justified or not, they will sensibly withhold contentious pieces. Why wreck your life for a decade or more, as happened to Chris Masters over the "Moonlight State" and other pieces? For many, the price is too high.

Lastly, What would work?

This is an argument in three parts: as a society we need 'quality journalism', news rooms aren't cheap, and are there models we could follow?

The media as "The Fourth Estate" is an important and necessary part of any Democracy. A Free Press is a necessary part of Open and Transparent government.

But whither Investigative Journalism? There is a lot of TV reportage of politicians doing 'door stops' or appearing at stage-managed events, and a lot of 'tabloid journalism' on TV.

Would Woodward and Bernstein now be funded for their lengthy Watergate investigations?
Would any editor allow it to be published these days?
I think Watergate is less likely to be reported these days for many reasons and the Drudge Report and other gossip sources do not fill the gap.

"Quality Journalism" has to be nurtured & supported for us to have stable, prosperous societies.

The ABC states it has 700 people in its News Division.
On-line sources suggest major newspapers have ~300 journalists in their newsrooms. [This information isn't in the Annual Reports I scanned.]

What would it cost to run such a news room?
$30M a year in wages, $10M in wire-services, $10-15M for buildings and systems.
Marketing & Sales probably $30M. Accounting and collecting subscriptions: $5-10M.
Publishing on-line would add another $20-30M, with an overall 30% Gross Margin required to fund upgrades, depreciation and dividends.

Perhaps $150M/year in revenue, or $3M/week.
The Sydney Morning Herald has an audited circulation of 210-360,000 and readership from 850,000-1,100,000 [without SunHerald, $1.80 and 480,000/1.25M]
Previous comments: "Internet Changes Everything: Newspapers".

Even if you achieved 500,000 individual subscriptions, you'd need a weekly price of $5+, versus the $1.40/weekday and $2.40 Saturday for the paper version.
I believe that's far beyond the consumer 'price point' for a single publication.
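The rough arithmetic can be laid out explicitly. The cost figures are the midpoints of the ranges quoted above (the post's own estimates, not audited numbers), and reading "30% Gross Margin" as 30% on top of costs is my assumption:

```python
# Midpoints of the cost ranges quoted above, in $M/year. These are the
# post's back-of-envelope estimates, not audited figures.
costs = {"wages": 30, "wire_services": 10, "buildings_systems": 12.5,
         "marketing_sales": 30, "subscriptions_admin": 7.5,
         "online_publishing": 25}

total = sum(costs.values())                 # total costs, $M/year
revenue = total * 1.30                      # 30% on top of costs
weekly_m = revenue / 52                     # $M/week
per_sub = weekly_m * 1_000_000 / 500_000    # $/week across 500,000 subscribers

print(f"costs ${total:.0f}M/yr, revenue ${revenue:.0f}M/yr, "
      f"${weekly_m:.1f}M/week, ${per_sub:.2f}/subscriber/week")
```

That lands on roughly $115M of costs, ~$150M revenue, ~$3M/week, and $5.75/subscriber/week over 500,000 subscribers - the "$5+" figure above.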

So what models are out there that might work?
'Cable TV' provides content aggregation, common marketing services & subscription and billing.

It has severely impacted Free-to-Air TV over the last 4 decades, for many reasons.
One of the big factors I believe is allowing content providers to focus on their strengths and the Cable Service Provider (Foxtel in Australia) to focus on the technical and retail/customer relations and support business.

The public are offered content aggregated into affordable and desirable 'packages'.
They can decide the utility to them of each package and compare the cost to other forms of entertainment. A$30-$50 seems to be the price point.

Cable TV for content providers removes barriers to entry and avoids competition between technical delivery methods. The customer wants the service and isn't interested in the technology per se. This model allows & promotes small, new entrants serving specialist or highly targeted niches.

The revenue returned to content providers is unknown to me. Large studios are not 'price takers' and have significant negotiating power as 'headline products'.
Presumably small niche providers get a return based on consumer views.

This shared infrastructure and 'content packages' seems ideally suited to on-line delivery of paid content - which doesn't have to be limited to news or 'quality journalism', but certainly includes them. Plus we have the natural providers already operating with large, high-quality customer lists: Cable Service Providers.

The technical implementation for an "on-line Channel Service" is simple. Though the ABC iView experience suggests that collaborating with ISPs and allowing unmetered content is necessary. [Australian ISPs impose download quotas on broadband.]

Customers already have some sort of PC (Windows, Mac, Linux, ...) and look after their own broadband connection.
A controlled, universal 'player' is required - happily, companies like VMware already provide, free, a basic product that works across all platforms: the "VMware Player", which can run pre-built systems with embedded applications ("Virtual Appliances").

The only necessary work is tailoring a VPN or similar and distributing the required registration/connection keys. Foxtel already has the infrastructure in place to source, distribute, service and support hardware & devices.

That's not a big leap...
And one where services can be offered in a series of bundles at many different price-points.
All those free Community papers, plus a Major Metro Papers: $5/month?
Add a speciality or trade paper, all the Major Metros, a financial services 'feed' and an alert service like 'Media Monitors' for a company of 75 people: $lots.

How does this proposal sit with the Newspaper assumptions?
  • intermediaries with good 'passing trade'
    Exactly what the Cable TV companies do.

  • exact counts, not estimates, of readership
    Page counts, with the precise subscriber unequivocally identified & grouped.
    Near-perfect marketing information, and a perfect, undisputed source of revenue figures for a 'page hits' revenue scheme.

  • exact counts of advertiser hit-rates (count links followed)
    You don't have to sell advertising, and many content providers would not,
    but if, like the Trading Post, you did... Trivial.

  • targeted/niche audiences, not "broad spectrum" mass market
    Tailored content per source, niche & specialist sources, remembered preferences and interests... Near perfect for subscribers and providers alike.

  • Normal print-media space restrictions are lifted: Content providers can provide additional "in-depth" material easily & cheaply.

  • Individual content providers can use the Channel Service to provide archive, search and print-on-demand services. Leaving each party doing what they do best.

  • 'Leakage' of content can be controlled with the Virtual Appliance.
    It may be configured to only allow 10 pages a day to be printed... With extras purchased.

  • Anyone interested in expensive periodicals, Academic Journals or hard-to-find books?
    With controlled access and clear charging regimes in place, there is no issue about denying or destroying copyright.
    In another day, this might have been called "Your Local Library".

On top of this, additional options could allow subscribers to pre-pay to view or print normally inaccessible content. The Channel Service Provider doesn't become a credit provider - in fact it gains by holding the prepaid money, which it never need return and which might even expire, like pre-paid mobiles.

Importantly, 'micro-payments' are avoided. They are very, very hard to get right and consequently expensive. That's why we've never seen Visa and Mastercard move on this market.

But moving 1cent 'funny money' from your pre-paid balance to a vendor - very cheap.
It's the basis of prepaid mobiles.
With the business-friendly upside of all the unused payments that expire - a tidy 5-10% profit.

In summary: Do I think Murdoch is wrong-headed in charging for access to his newspapers?

Do I think an on-line service offering this facility effectively, efficiently and profitably can be constructed?

Will anyone read & respond to this piece and the proposal?
Who Knows :-)


Internet Changes Everything: Newspapers

The News Broadsheet was a pivotal element of the 1776 American Revolution, eventually becoming enshrined in the First Amendment to the US Constitution.

Newspapers were integral to the Twentieth Century rise and evolution of Western Democracies. Without "frank and fearless" reporting (and an engaged electorate), governments can quickly spiral out of control.

We're now 15+ years into the "Internet Revolution", so where are Newspapers in their journey on-line? Significantly, nobody seems to have discovered a "secret sauce" to generally monetise News and the work of Journalists in the way that Amazon and Google etc have monetised books and on-line ads. Those who make money from writing seem to do so from direct subscriptions - the on-line form of "newsletters".

In my first job, there was a direct and obvious connection between the 5g +/- 0.01g of sugar being analysed and 'the business': Our analyses determined payments for 500 or 1,000 tonne lots. Getting it wrong wasn't an option. My place in the scheme of things was self evident.

The world of Software and I.T., even after 60 years, still doesn't have that direct & obvious link.
I.T. shares many traits with Journalism:
  • Both are "Performance Disciplines", like Music, Art, Surgery and Gymnastics.
  • "Effortless Performances" (as in 'making it look easy') take a lot of skill and experience. The public and more often now, management, have little appreciation of the process & skills.
  • Input Effort and Results bear no discernible relationship.
  • Quality is Everything, but seems impossible to measure.
  • Although both are central and necessary to the businesses they support, they are managed as "Cost Centres", with seemingly no attempts at connecting outputs with Profit.
  • Both deal in intangible and invisible "stuff" - information. Often with tight deadlines and very fast decay in product "usefulness".
  • Both share a central problem: Effort/Inputs are decoupled from Income/Results.
To research this, I chose "Fairfax Media" (ASX:FXJ), a major Australian player, because some of their "Mastheads" are over 100 years old and "The Sydney Morning Herald" used to be legendary - known worldwide for its "Rivers of Gold".

Getting even rough numbers to judge demand and price-points seems very hard.
For any on-line business to succeed, revenues have to support costs. While serving bit-streams may be considerably cheaper than printing & shipping paper, what will people pay for it and how do you get money off them? How do you draw in more subscribers - what are your Marketing & Sales channels?

Newsagents keep the full "cover price" of most newspapers. For magazines and most other products, the publisher gets half, with the distributor & newsagent splitting the rest.

The content of a newspaper, News etc, is why people buy newspapers.
The price people are willing to pay for the whole paper indicates the "utility" they get.
But how do you arrive at a value-in-use of just the News component?
Are there synergistic effects in operation?

The output of Journalists, "News", is essentially a "Free Good" to both consumers and the business. To the publisher, each newspaper printed is only a cost - which they actively attempt to minimise, whilst revenues are from advertising and independent of pages printed. The connection is the "rate card" - people will pay more in advertising for wider circulation & readership. The connection is anything but direct and immediate. Meanwhile, "content" (& "classified" adverts) brings in readers, but the supply-demand curve is generally unknown.

This makes the economics of "free" community publications more obvious. The distribution costs are only slightly more than to newsagents and the claimed readership is maximised.

It also says why businesses are so very happy with "advertising only" publications, like "Trading Post". Pure profit and no content producers to wrangle! These businesses translate on-line very well - but lose all "display advertising" to corporates like retailers looking for mass market advertising.

Audited Circulation figures aren't too hard to come by and "The Press Council" publish a good snapshot of the whole Print Media scene, but there are only tangential references to journalist "head counts". Fairfax journalists feel pressured by staff cuts and are taking action.

To come up with viable business models for on-line News, especially when competing with "Free" on-line services like Google or Free-To-Air broadcasts (radio & TV), you need both sides of the Accounting Equation:

Profit = Revenues - Expenses.

Other marketing data is needed to determine whether Size Matters (only one global Google and Amazon) or Little & Local works or some combination in between...

Whatever the result, there is one guaranteed loser: newsagents.
They are the traditional Marketing & Sales Channel for newspapers.
What becomes of them in an on-line world?
Can their access to "passing trade" and existing business relationships be leveraged?

What we do know is that people are very happy with "Free" on-line content: using search engines to point to unbilled on-line newspaper articles.
Newspapers initially tried to force reader registration; even though the content was "free", people went elsewhere. Fees were charged to access "archives", or additional material was made available to paid subscribers.
News Ltd's upcoming experiment in charging for on-line content will be watched very carefully throughout the industry.

What would it cost to run an adequate newsroom? Do the 15 journos of brisbanetimes.com.au reported by the Press Council provide adequate local coverage? Various sources suggest major metropolitan newspapers have 200-300 journalists, while the national broadcaster, the ABC, has 700 people in its "News Division" (which is how many journalists?).

Any 24/7 operation requires shift-work. Providing a minimum staffing of 2 people needs a team of ~12. Numbers build very quickly as more sections & coverage are needed.
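The roster arithmetic behind that "~12" can be checked directly. A minimal sketch; the 38-hour week and 25% leave/training loading are my illustrative assumptions, not figures from the text:

```python
# Staff needed to keep 2 people on duty 24/7, allowing for leave & training.
HOURS_TO_COVER = 24 * 7      # 168 hours in a week
MIN_ON_DUTY = 2              # the minimum staffing above
WORK_WEEK = 38               # assumed full-time hours per week
LOADING = 1.25               # assumed allowance for leave, sickness, training

raw_ftes = MIN_ON_DUTY * HOURS_TO_COVER / WORK_WEEK   # ~8.8 people
team_size = raw_ftes * LOADING                        # ~11 people
print(round(team_size))
```

Any plausible loading lands the answer at 11-12 people, i.e. the "team of ~12" above.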

Fairfax's Annual Report (2007) says they made a profit of ~A$500M ($447M EBITDA) on A$2.3B sales.
Around one third of sales were from New Zealand. Overseas and non-print revenues were unclear.
The Australian Digital operation made A$37M on sales of $137M.

Their ~10,000 employees were their largest single expense: A$700M.
Paper and ink came in second at A$270M.
Sales and Promotions were A$88M.
Communications: A$17.5M
I.T.: A$15M and
News services: A$12M

Fairfax reports $8B of Assets - $6B of which is "intangibles" - "goodwill" and "value of mastheads". They've around A$850M in Property, Plant & Equipment - the physical assets needed to produce newspapers.

But how many journalists were there and what does a single "high-quality" newsroom cost to run? That's not going to change for an on-line News service.

What will an average subscriber pay for an on-line News service?
What additional content is needed, and what synergies exist?
How many on-line subscribers will sign-up for each different offering?
Can the existing subscribers be converted to on-line subscribers? What would help in the transition?
What are the different attributes that subscribers value and what premiums will they pay?

All this says that monetising on-line News services may be hard, and the general scarcity of such services says solutions are still not obvious.

Other comments, observations and relevant factoids:

Newspapers serve important subsidiary functions, like being "the paper of record" for Births, Deaths and Marriages as well as public events, political speeches and disasters, crises and more.
"Public Notices" of many types are published - from bankrupts to probate on wills to personals.

Newspapers have come to be the definitive textual mass-communication device.
There appears, as yet, to be no on-line equivalent.
In the USA, recent failures of long-running mastheads say there isn't much time left to find a good answer.

The SMH sells ~212,000 copies Mon-Fri (@ $1.40), 364,000 on Sat ($2.40) and [Sun-Herald] 505,000 on Sun. ($1.80). It has increased circulation in recent years. No figures available for pages of advertising sold.
The Melbourne Age has comparable sales figures, though Sunday sales are roughly half the SMH.

'Readership' of the SMH is estimated at 853,000 Mon-Fri and 1,116,000 Sat.
Papers are shared around.

These figures suggest that newsagents & supermarkets make around $1.5M/week for Mon-Fri sales and the same again on weekends. Removing that income stream would leave a very unhappy sales channel...
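That estimate can be checked against the circulation and cover-price figures quoted above, remembering the earlier point that newsagents keep the full cover price of newspapers:

```python
# SMH weekly retail take, from the circulation figures quoted in the text.
mon_fri = 212_000 * 1.40 * 5                 # Mon-Fri sales: ~$1.48M
weekend = 364_000 * 2.40 + 505_000 * 1.80    # Sat + Sun-Herald: ~$1.78M
print(f"Mon-Fri ${mon_fri/1e6:.2f}M, weekend ${weekend/1e6:.2f}M")
```

So "the same again on weekends" is, if anything, slightly conservative.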

Payments and Subscriptions types:
  • Pre-paid or billed subscriptions. Like Home Delivery
  • On-demand use: Ad-hoc or occasional purchase
  • Prepaid access to articles.
  • Business or family sharing. Limited copies shared between many.
  • Public access - libraries.
  • "Media Monitors" - clipping services across many sources relevant to a business.

What do Newspapers do?
  • Inform
  • Educate
  • Entertain, and
  • Surprise and Delight.
Newspapers tell you things you need to know, especially things you didn't realise you needed to know.

What comprises 'stories'?
  • facts
  • expert analysis
  • commentary
  • opinion, and
  • interviews
Professional journalism brings brevity and precision. Stories are {complete, correct, concise} and hopefully {consistent}.

The value to the reader is collection and pre-screening: reducing a mountain of data and facts to quickly and easily accessible information. Journalists classify and prioritise stories, letting the reader minimise time used and maximise information found.

This "Usability" principle extends to presentation, layout and story structure.
The fonts and column widths are chosen to best suit human factors.
Modern printing added pictures to text - an important aid to readers and writers alike.
The use of white space, headlines and graphics/pictures increases readability & accessibility.
Stories follow the "inverted pyramid" - the most important facts first, tailing off. It means readers can quickly scan articles and find the most relevant/useful to them, and sub-editors can easily trim articles to fit by removing trailing paragraphs.
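That editing rule is mechanical enough to express as code. A toy sketch (`trim_to_fit` is hypothetical, not any real CMS function): cut from the bottom, never touch the lead.

```python
def trim_to_fit(paragraphs, max_chars):
    """Trim an inverted-pyramid story to length by dropping trailing paragraphs."""
    kept, used = [], 0
    for para in paragraphs:
        if kept and used + len(para) > max_chars:
            break                 # drop this and every later paragraph
        kept.append(para)         # the lead is always kept
        used += len(para)
    return kept

story = ["Lead: the key facts.", "Supporting detail.", "Background colour."]
print(trim_to_fit(story, 40))     # drops only the trailing paragraph
```

Because importance strictly decreases down the story, a cut anywhere from the bottom still leaves a complete, publishable article.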

It is instructive to read a 100-yo paper without these modern features - they look dense and impenetrable. The small number of pages would've been a blessing.

Professional Journalists bring special factors to collecting stories:
  • access to people & organisations, like police, politicians, CEO's, ...
  • funding for travel, communication and fees - like FOI requests or corporate data.
  • Researchers, archives and access to expensive subscription services.

The users of News services look for many different benefits and uses:
  • text, images, audio and video
  • Mobile access
  • Alerts and "Instantaneous news items" (the latest stories)
  • Distraction, Relaxation and passing-time (as in commuting)
  • Social settings - Cafe and Brunch
  • Search and information: Browsing classified and focused searches
On-line challenges:
  • distribution format. PDF's or HTML?
    How do I read it on the train?
    Does it read well on a laptop or iPhone?
  • enforcing DRM. How to limit copying? What to do if copied illegally?
  • on-going subscription rights:
    If I've bought a "paper" once, do I have permanent access to it, even after my subscription ends?
  • Competing with Free-To-Net services. Seems hard, have to provide additional value.
  • Catering for local community news
  • Accepting input from the general public.
  • The Blogosphere and non-professional writers. They can't be held to journalistic standards and ethics, can't be reprimanded or censured and will publish/repeat unsubstantiated gossip and rumour.

"non-News" Newspaper functions that Online services may need to duplicate:
  • "paper of record" function: Public Notices, Births-Deaths-Marriages
  • Mass-market display advertising.
  • Marketing and Sales channels. How to grow new business?
  • Access to Printed copies - eg. weekly summaries.
  • Searchable service provider directories and classified adverts.
    Implies RSS-style alerts & monitoring.
On-line service challenges:
  • Monetisation. Advertising, subscription or sponsor based? Other?
  • Niche marketing. By location, interest, community, employment sector.
  • Tiered Subscriptions: free, basic, premium, target area, search tools, ...
    Profits can be maximised by segmenting services with multiple price-points.
  • Organisational access.


Telco pricing and market 'price elasticity'

There's a counter-intuitive effect with the marginal cost of Production Factors, like energy (and Telco services): using the factor more efficiently consumes more of the resource. Because you make more profit, you lower prices, produce more, and demand for the resource increases. The Khazzoom-Brookes Postulate/Jevons Paradox:
"energy efficiency improvements that, on the broadest considerations, are economically justified at the microlevel, lead to higher levels of energy consumption at the macrolevel."
The structural reason is simple: the market is highly price elastic, so decreasing prices a little lifts total sales considerably. In economics, this is a well-solved problem for non-monopoly markets: "Profit Maximisation" occurs when MR = MC (Marginal Revenue equals Marginal Cost). [Monopolies also set MR = MC; the difference is that their price then sits well above marginal cost.]

In the 70's & 80's at O.T.C., we made record profits each and every year - by exceptional marketing and sales strategy, which included dropping prices every year. [The series of TV adverts, like the 'Memories' campaign, won many awards.]

This was driven by the technology: Moore's Law drove the per unit cost of services in both cable and satellite down exponentially.

Profit margins increased because the full cost reductions weren't passed on. By the mid-80's it was cheaper from the East Coast to call London or New York than to use Telstra to call Perth ($1/min).

This is a lesson Telstra Management never learnt, and one obviously lost when O.T.C. was subsumed into 'the Borg' in 1992: by passing on part of the Moore's Law savings, Revenue and Profits both increase.
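A toy linear-demand model shows the shape of the O.T.C. lesson. All the numbers are invented for illustration; the structure (profit maximised where MR = MC) is standard microeconomics:

```python
# Linear demand Q = A - B*P; profit is maximised where MR = MC,
# which for linear demand gives P* = (A/B + MC) / 2.
A, B = 1000, 10   # illustrative demand parameters

def optimum(mc):
    price = (A / B + mc) / 2
    qty = A - B * price
    return price, price * qty, (price - mc) * qty   # price, revenue, profit

before = optimum(20)   # marginal cost before the technology improves
after = optimum(10)    # Moore's Law halves the marginal cost
print(before, after)
```

Price falls from $60 to $55 - only half the $10 cost saving is passed on - yet revenue rises from $24,000 to $24,750 and profit from $16,000 to $20,250. Drop prices, and Revenue and Profits both increase.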

Economic theory is very clear on this point:
Traditional (Premium) Telco Pricing isn't just 'bastardry', it kills your profits which will eventually kill your company.


Microsoft can't write O/S code

This cartoon lampoons Windows 7.0 Beta. Eerie.

It underlines for me that Microsoft is crap at writing Operating Systems code.
O/S's need to be correct, secure and robust (resilient to internal errors & hardware faults) first and foremost. Only after that should they look to features and 'performance'.


Costs of NBN - National Broadband Network, Fibre-to-the-Home

Connecting Australia with broadband/fibre systems is closer to implementing Cable TV than a new Phone network. The $43B estimate is 3-5 times too high based on Cable TV rollouts.

That difference completely changes the economics of the project.


Death by Success

Being too successful leads to failure unless you are aware of the problem and carefully monitor and protect against it. Every large company faces this problem - in I.T. for example, Google and Microsoft.


Microsoft Troubles - V

More on the theme of "Microsoft will experience a Financial Pothole" from a Financial perspective.

For less rigorous commentaries, there's Motley Fool's piece "The Two Words Bill Gates Doesn't Want You to Hear" (A: Cloud Computing) [Article requires signing-up, but can be found from Your Favourite Search Engine]
A commentary on that piece from Carbonite, a 'Cloud' vendor.

Victor Cook at Customers and Capital has a series of pieces on Microsoft and its fundamentals.
His series on 'Brands' compares GOOG & MSFT at times.
The first post in the 'Blue Ocean Strategy' series, the second, third and fourth posts.
His analyses of a number of issues are informed and enlightening - MSFT vs GOOG, the YHOO & Double-Click acquisitions...

Cook points at a March 19, 2009 piece at 'The Wild Investor' called "The State of Microsoft" that starts:
Well we are not in the 1990’s anymore, and unless you plan to hold the stock for 50 years there is really no point to holding shares of Microsoft. Here is why…
and ends with:
The bottom line is that the luster behind Microsot is no longer there. Sure, there is upside to the stock, but how much. We are in a new time where the old blue chips are no longer blue. Stocks like Cisco (CSCO), IBM (IBM), and General Electric (GE) are no longer fun or smart companies to invest in.


How "The Internet Changes Everything" for Journalists

Update 1: 26-May-2009
A very kind person @ ABC took the time to read this piece and to give me some valuable comments:
  • We are paid to make editorial decisions ... This is our job.
  • news is about immediate events and happenings and it is short, brief and factual.
  • 'News' is just one kind of ABC broadcast.
    Don't confuse it with our entire output.
  • As to your specific proposals...
    - SPAM is a "dropdead" problem (my words)
    - automatically generated responses are a waste of time... listener input should be read and acknowledged by a real person,
    - but the problem is that we are all under time pressure.

So where to from here?
Appears to be end-of-the-road for this approach.

How does "The Internet Changes Everything" apply to News and Journalism at the ABC - Australia's national broadcaster?

Contents: Background, How the Internet Changes Everything and Proposal

What are the sources of Good Stories?
Do Journalists have a monopoly on sources and Perfect Judgement on story 'size' & importance?
Obviously not.

But how do folk "out of the loop" gain access to the Gatekeepers of the public media?

Consider two cases and what, if anything, has changed now if they were to be repeated:
  • The "Erin Brockovich effect": through persistent talking & listening to ordinary householders, an apparently minor legal matter became massive. The romantic story of the film has the primary evidence, "the smoking gun", only arriving accidentally, and
  • "Dr. Death" allegations at Bundaberg hospital: a set of nurses at the hospital attempted, for years, to raise their concerns internally & externally without gaining any 'traction'. Meanwhile, people were needlessly being harmed, there was no media coverage and hence no political interest in investigating the claims.


The ABC News website is clear, well structured, informative and completely useless and frustrating for someone like me who's not sending Press Releases, submitting news tips/vision, already known to journalists or a 'recognised industry expert'.

I spent a week trying to contact any journalist inside the ABC who might be able to pick up a small but important question that goes to the heart of the National Broadband Network Fibre-to-the-Premises proposal of the Rudd Government: the published pricing ($5,500/house) is 2-5 times higher than both the 1995 Optus/Telstra cable TV roll-out to 2-3M houses and recent FTTP installations.

Something looks very wrong, and asking this and consequential questions of both Politicians and Telco experts/consultants would make good copy and put the ABC at the forefront of this News thread.

But I can't get through... E-mails, web-form emails and phone calls (with follow-ups) have drawn a blank. I'm not some major 'name' or consultancy so my 30+ years experience in the field and a bunch of good innovations just don't count... Which is what I have to presume, because I've heard nothing back. Not even an automatic response.

From my position outside, it appears that internally ABC News operates as a set of independent 'silos' - groups that are completely isolated from one another. If you get through to anyone, they might reject it but offer no assistance in what to try next.

The next step is "go to the top". Which would be the newly appointed "Head of News", who shows up in media releases, interviews, in the Organisation Chart [PDF], but not yet in the 'Contacts' page - which curiously has only postal and telephone contact information. That implies there isn't an automatic system to update all relevant webpages. Writing to 'webmaster' should work - but from my experience, I'm disinclined to try.

If there are permanent electronic addresses to contact people in senior roles, they are not disclosed.

Ditto for any set of 'Editors', 'News Desks' or targeted 'Correspondents'.
Is it possible that, internally, ABC News is this chaotic & disorganised?
Is this "Wall of Granite" exposed to outsiders accidental or intentional?

"The Internet Changes Everything"

Radio is a highly personal medium: Philip Adams' insight is in speaking to 'the listener'. He knows he is having a personal conversation with individuals, not a group or an 'audience' as you find in theatres and sports grounds.

People listening to radio are more likely to want to continue the conversation on-line. This is facilitated on the ABC site by web-form email, forums and even a 'Complaints' facility.

The News site even has a 'Contribute' page:
If you witness a news event, the ABC wants to hear from you. We would like you to send us your newsworthy photos, videos, audio clips or even written eyewitness accounts for consideration for use on ABC News.

Can you see the assumption in there? It's insular and iconoclastic.
We find 'the news', you listen.

There's a secondary assumption:
News is only 'events' that can be represented in sound-bites and pictures. There is no allowance for informed contributions and bigger stories.

There is a simple test:
Does the system facilitate or block major public interest stories like "Dr. Death" at Bundaberg Hospital, false 'evidence' claims as in the 'Children Overboard' affair, or problems like gross waste/misuse of public money, dereliction of duty, outrageous behaviour of public officials/politicians or endemic corruption?

It is embarrassingly insular to assume that, as a publisher/broadcaster, you always know better than the entire listening audience what is going on.

That one person, using the technology well and wisely, can access and leverage community knowledge, globally, and affect a major outcome is shown by 'PJ' (Pamela Jones) of Groklaw and her influence on the "SCO case". SCO became the final licensee of the AT&T Unix codebase and sought to leverage this into a 'tax' on Linux, a re-implementation loosely based on Unix.
In the intervening years (2003 - 2007) PJ and Groklaw can be credited with unearthing and exposing many of the flaws in SCO’s case, most notably, obtaining and publishing the 1994 settlement in the USL vs BSDi case, which had been hidden from public view and played a significant role in undermining SCO’s claims to the ownership of Unix.
PJ's efforts and collaboration with the global community were instrumental in SCO losing its case. No single company, even vendors like IBM, Novell and AT&T, and certainly no consumer, had all the information or all the requisite manpower to definitively dismiss the claims.

Lesson: One person can make a difference, if they apply themselves and the technology appropriately.

The Internet is a new thing - it's not just a faster, cheaper, better way to do the same things.
If you simple-mindedly automate existing practices, you will open the floodgates and will drown in electronic verbiage.

When the 'barriers to entry' - the cost in time, energy & money of communicating - are very low, people will bombard you with messages. The sheer volume and the Signal-to-Noise ratio mean the content is worthless: a small army, let alone a single person, won't be able to read everything, and duplicate/irrelevant information will drown out any gems therein.

Computers also hold the key to the problem - they are Cognitive Amplifiers.
They enable one person to do the work of 10, 100 or 1,000. More quickly, more cheaply and often 'better' in important ways.

The ABC News Division employs 700 professionals.
They certainly perform very well, but are they sufficient to find and research all worthy stories locally, let alone all local interest stories occurring world-wide?

What's the price to the Organisation, the Australian Media, Government/Politics and the Australian Public of missing important stories??

There are many problems to be addressed and overcome before a useful system can emerge:
  • SPAM
  • Security/malware - controlling & avoiding upload/dissemination
  • Denial-of-Service attacks and webpage and other information hacking,
  • Retaliatory attacks and deliberate mis-information by 'sources'.
  • Mischief makers, Gossips and Defamatory statements.
  • 'Personal Agendas' and Vendettas/Disgruntled persons,
  • Copyright violation and Plagiarism,
  • Nuisance, time-wasting, 'serial pests' and Vexatious persons
Ain't no bed of roses...

There are always going to be people who wish to remain anonymous - either completely, or in the usual 'off the record' sense where they do not wish to be publicly quoted but are willing to provide a written statement and, if necessary, to stand by their comments in court.

One key technical method to address many of these problems is strong identification of posters.
This has to be of similar strength to X.509 client certificates for browsers, with the concomitant off-line checks when issuing them.
Off-line confirmation of identity is essential - from checking the White Pages and calling the person, or sending an SMS, up to sighting 'photo id'.

A Proposal:

Immediately, three improvements would help address my frustrations:
  • publish role-based, not personal, email addresses for both senior positions and the various news desks/editors/specialist correspondents that must exist internally.
  • Add a new contact form for story requests, useful information and leads.
    The sorts of things needed for investigative journalists. It must include topic categorisation for automatic distribution to the news desks/story areas.
  • Automatically acknowledge all contacts - via email, SMS or other simple means.
A backend system is needed to automatically store and distribute input. Something of this sort must already exist - it certainly is there for published stories.

Writing is a skill that must be practised.

Members of the public do not have journalistic skills, nor a sense of what journalists consider 'newsworthy', nor what is needed to successfully pitch a story - even if they have found the right person. The limited feedback I've received amounts to "just write clearly". I know how to do that in a number of domains, but have no idea what journalists want and need.

This can be addressed at 3 levels:
  • An on-line tutorial and example system.
    Including some template questions and suggestions for ways to both condense/summarise your information and to self-assess its 'importance' and 'newsworthiness'.
  • A limited (5-minute) response by a journalist to any specific questions or advice.
  • The ability to pay for editorial help in constructing a pitch and even a story.
    The rate would have to be $50-$75/hour for cost recovery, more to act as a brake on overuse.
One very strong asset of the ABC is its 'Friends'.

With the increasing numbers of retired Baby Boomers - including ex-journalists, there should be a large pool of free labour available to review, categorise and respond to the information fire hose that would be unleashed.

The Internet means people can volunteer for an organisation without the classic problems of desk-space, real-time supervision, insurance and other entitlements. Volunteers work from home, when and for as long as they like. They could even be self-administering.

At the end of the day, it comes down to just one question:
What does the ABC consider its own and the community's on-going roles, and
how will it stay current with sociological, cultural & technological changes?


Alan Kay - History and Revolution in I.T.

Alan Kay invented Object Oriented Programming around 40 years ago with Smalltalk.
Kay not only has a lot to say, his accomplishments lend him credibility. In 2003 he said:
"our field is a field that's the next great 500-year idea after the printing press"
The ACM awarded him its highest honour, The Turing Award, in 2003. The short Citation:
For pioneering many of the ideas at the root of contemporary object-oriented programming languages, leading the team that developed Smalltalk, and for fundamental contributions to personal computing.
Video of his 60min talk on the ACM site, and elsewhere, a transcript. The slides & demo used are not available.

This 1982 talk for "Creative Think" brought this reporter's reaction (the link is worth reading for the list of one-liners alone):
Alan's speech was revelatory and was perhaps the most inspiring talk that I ever attended.
This is my current favourite quote of Alan Kay's from Wikiquote:
"Point of view is worth 80 IQ points"
A 2004 conversation with Kay on the (deep) History of Computing Languages is well worth reading. Here are two interesting remarks:
One could actually argue—as I sometimes do—that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects. (starting circa 1984)
Just as an aside, to give you an interesting benchmark—on roughly the same system, roughly optimized the same way, a benchmark from 1979 at Xerox PARC runs only 50 times faster today.
Moore’s law has given us somewhere between 40,000 and 60,000 times improvement in that time.
So there’s approximately a factor of 1,000 in efficiency that has been lost by bad CPU architectures.
Kay, in this piece, also mentions a recurring theme - the central problem of writing large systems is Scaling: the Design & Architecture of systems. Anybody can take lumber, hammer, saw and nails and produce some version of a dog-house. To scale up to something very large requires skill, discipline and insight - Architecture is literally "the science of arches", the difference between Chartres Cathedral and the Parthenon. Both contain around the same amount of material, but the cathedral encloses ~20 times the volume and its towers are 10+ times higher.
AK Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves.

SF The analogy is even better because there are the hidden chambers that nobody can understand.

AK I would compare the Smalltalk stuff that we did in the ’70s with something like a Gothic cathedral. We had two ideas, really. One of them we got from Lisp: late binding. The other one was the idea of objects. Those gave us something a little bit like the arch, so we were able to make complex, seemingly large structures out of very little material, but I wouldn’t put us much past the engineering of 1,000 years ago.

If you look at [Doug] Engelbart’s demo (wikipedia) [a live online hypermedia demonstration of the pioneering work that Engelbart’s group had been doing at Stanford Research Institute, presented at the 1968 Fall Joint Computer Conference], then you see many more ideas about how to boost the collective IQ of groups and help them to work together than you see in the commercial systems today.

Other interesting pages on Alan Kay talks:

Relevance to Open Source and Paradigm shifts

Kay claims that 95% of people are 'instrumental reasoners' and the remaining 5% 'are interested in ideas'.
an instrumental reasoner is a person who judges any new tool or idea by how well that tool or idea contributes to his or her current goal.
He goes on to talk about reward/motivation and says that 85% of people are 'outer motivated' versus 15% 'inner motivated'.

Most people (~80%) fall into the 'outer motivated instrumental reasoners' group.
These people won't pick up an idea if other people aren't doing it. Which seems like a very wise evolutionary group tactic - if a little safe.

Kay, in his ACM talk, uses a contagion or forest fire model to demonstrate/claim that around 66% of a population is needed to achieve 'ignition'. To hit the tipping point where 'everyone is doing it' and the new idea takes over.
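The 'ignition' behaviour is easy to see in a toy percolation model - a stand-in for Kay's forest-fire demonstration, not a reconstruction of it. The precise threshold depends on the model (about 59% for site percolation on a square grid, versus Kay's ~66%), but the sharp tipping point is the same:

```python
import random

def burnt_fraction(density, n=60, seed=1):
    """Fraction of 'adopters' reached when a fire starts along the left edge."""
    rng = random.Random(seed)
    grid = [[rng.random() < density for _ in range(n)] for _ in range(n)]
    stack = [(r, 0) for r in range(n) if grid[r][0]]   # ignite the left edge
    seen = set(stack)
    while stack:                                       # spread to 4-neighbours
        r, c = stack.pop()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < n and 0 <= nc < n and grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                stack.append((nr, nc))
    occupied = sum(map(sum, grid))
    return len(seen) / occupied if occupied else 0.0

low, high = burnt_fraction(0.40), burnt_fraction(0.80)   # fizzles vs sweeps
```

Below the threshold the fire stalls in small pockets; above it, nearly every occupied cell burns - "everyone is doing it".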

Kay also makes the observation that Big Ideas (he cites Unix) often take around 30 years to hit the streets - to become normally used.
And also wondered why, after ~40 years, Ivan Sutherland's ideas from his 'Sketchpad' program (his 1963 thesis) haven't broken through.

Applying to Open Source Propagation

Putting this idea of a 'tipping point', if correct, to work in spreading FOSS:
  • find/choose communities who are high in 'interested in ideas' (artists & creatives?)
  • find a small community and intensively sell/lobby/influence it to get to the tipping point.
  • leverage these communities or even artificial environments by introducing 'outsiders' (high schoolers?) to a converted environment.
But what Kay doesn't go near is that even the best marketers, PR and media people can't predict the next fad/craze/meme - they happen because they happen.


Reactionary or 'Frothing at the Mouth'?

Is my opinion "Forget the Best, Embrace the Rest" over the top, reactionary and irrelevant nonsense?

The State of Practice is beyond criticism - because there is no useful information on it.

Here are 3 questions to ask of Management theory & thought:
  • Exactly why is Management "Hard"?

  • What are the tasks of Management?
    i.e. a formal & unequivocal model of the dimensions of action and decision, resources & information required and skills/capability required of individuals and teams.

  • How to quantify the performance of individual Managers and the Management Team?
These are fundamental questions and should at least be definitively outlined in any introductory text or course - but aren't.

Which raises the question: why aren't they addressed?

Either I'm completely off-track & uninformed, or I have outlined something of merit.

If this viewpoint is of merit, what then?


Forget the best, embrace the rest

It appears to me that 'Homo Corporatus' (the 'management classes') actively rejects the need to maintain "Lessons Learned" and to adopt in practice the best theories & principles known. The Operant Methodology seems to be:
Forget the best, embrace the rest.
This isn't minor or accidental. It's endemic and universal.


Inverse Turing - Modelling Human Brain with Universal Computer

Turing showed that all computer programs could be transformed to run on a very simple 'Universal Computer'.

One of the big questions this transformation lets us ask is: will a given program halt?
[The Halting Problem]
This question, posed as a program to calculate the answer, was used to prove that some problems are not computable.
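The core of that proof - the diagonal argument - can be sketched concretely. The names `halts`, `trouble` and `g` below are my own illustrative choices: assume someone hands us a perfect halting-decider for zero-argument functions, and build a function it must misjudge.

```python
def trouble(halts):
    """Given a claimed halting-decider halts(f) -> bool for
    zero-argument functions, build a function g that the decider
    must get wrong: g loops forever exactly when halts(g) says it
    halts, and halts immediately when halts(g) says it loops."""
    def g():
        if halts(g):
            while True:      # decider said "g halts" -> loop forever
                pass
        return "halted"      # decider said "g loops" -> halt at once
    return g

# Any concrete decider is refuted; e.g. one that always answers "loops":
g = trouble(lambda f: False)
print(g())  # prints "halted" -- the decider was wrong about g
```

Since g contradicts whatever `halts` predicts about it, no such perfect decider can exist. That is the Halting Problem.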

Turing also devised an unrelated idea - the Turing Test - that was at the heart of Artificial Intelligence (AI) research for many years: If you can't see or hear what/whom you're conversing with, can you tell the difference between a real Human and a computer program?

Turning the 'Turing Test' around: Is the Human brain built from computing elements?

This isn't quite "Is it a computer?" - there are two elements: the structure and the componentry from which it's built.

If the Human brain is a computing device, and if its organisation and 'programming' can be discovered, then it can be represented on an electronic computer. This is an unstated idea underlying the Turing Test.

The act of discovering the organisation and connections/programming could be impossible directly:
  • every human brain is unique in structure and organisation, though built from the same components (neurons) and performing the same tasks (seeing, hearing, recall, ...).
    This can be trivially shown by 'plasticity' - when a person suffers a traumatic brain injury (like a stroke or lump of steel through the head) their brain keeps functioning and often regains lost capability. Tasks normally performed in one area are taken up elsewhere.
  • Because of the microscopic nature of neurons and their axon connections and the sheer volume of them, it's unlikely a non-destructive 'read' of the whole brain will be possible.
But the idea is still very useful, even if we are unlikely to ever be able to clone our brains.

Exact computer models of brain functions/sub-systems can be built, examined and experimented with.
This is not useful in AI for recreating functions like speech recognition - electronic computers work so differently to neurons/axons that it doesn't yield insight or help for a 'reimplementation'.

For disciplines like Psychology and Sociology, creating models of brain function and interactions takes study to a new level - it allows precise and repeatable initial conditions, and 'impossible' experiments to be performed.

Most importantly, it removes 'hand-waving' explanations and forces exactitude and direct evaluation of theoretical models.

Computer animation tools, like MASSIVE, already implement 'autonomous agents' programmed with simple human behaviours/responses. This is a perfect platform for building and testing sociological models.
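MASSIVE itself is proprietary, but the underlying idea (simple per-agent rules producing group-level behaviour, from precise and repeatable initial conditions) is easy to sketch. The rule and numbers below are my own toy example, not MASSIVE's API: each agent drifts toward the crowd's centre, and the population clusters.

```python
import random

class Agent:
    """Toy autonomous agent on a line: each step it moves a little
    toward the average position of the crowd (a crude 'stay with
    the group' rule)."""
    def __init__(self, pos):
        self.pos = pos

    def step(self, crowd, pull=0.2):
        centre = sum(a.pos for a in crowd) / len(crowd)
        self.pos += pull * (centre - self.pos)

def spread(crowd):
    xs = [a.pos for a in crowd]
    return max(xs) - min(xs)

rng = random.Random(0)          # seeded: the run is exactly repeatable
crowd = [Agent(rng.uniform(0.0, 100.0)) for _ in range(50)]
before = spread(crowd)
for _ in range(30):
    for a in crowd:
        a.step(crowd)
after = spread(crowd)
# A purely local rule yields a global behaviour: the crowd converges.
print(round(before, 1), round(after, 4))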


Microsoft Troubles - Starting signs

Previously I've written that I expect Microsoft to experience a "Financial Pothole" around 2010. This is not it, only a portent of things to come.

Microsoft reported a profit slump and job layoffs today (AP, Reuters), as did many others - Intel and eBay are cited.