Is the 80-20 Law of Probability Still Relevant?

In the foreground of this picture, you may notice that two of the three artists are left-handed. That is not surprising: history shows a substantially higher percentage of left-handed people are creative compared with the rest of us. However, with only 10% of the world being left-handed, and many of tomorrow's recognized creators likely to be in their ranks, should we dismiss the right-handed 90% of our communities as mere followers?

It is also reported that 20% of road accidents are caused by drivers with excess blood alcohol levels. Given that the remaining 80% of accidents are caused by sober drivers, it could be argued that it is safer to be drunk.
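The joke trades on a base rate, of course: that 20% of accidents comes from a far smaller share of drivers. A minimal sketch makes the point, assuming (purely hypothetically) that 5% of drivers on the road are over the limit:

```python
# Base-rate sketch. The driver shares are assumed figures for illustration;
# the accident shares are the ones quoted in the article.
drunk_driver_share = 0.05      # hypothetical: 5% of drivers are over the limit
sober_driver_share = 0.95

drunk_accident_share = 0.20    # from the article: 20% of accidents
sober_accident_share = 0.80

# Accidents produced per unit share of drivers in each group.
drunk_rate = drunk_accident_share / drunk_driver_share
sober_rate = sober_accident_share / sober_driver_share

risk_ratio = drunk_rate / sober_rate
print(f"Per driver, drunk driving is about {risk_ratio:.2f}x as accident-prone")
```

With these assumed numbers the ratio comes out at 4.75, i.e. the "safer to be drunk" reading collapses as soon as group sizes are taken into account.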

With the creative paradox and the humorous chestnut both close to reality, is it fair to suggest that the disciples of the 80-20 rule may be wrong? When it comes to macroeconomics and efficiency concepts for handling revolutionary change, we may need to flip the focus to the 80% and leave the already proven 20% alone.
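The 80-20 split itself is easy to test against any revenue or workload data. A minimal sketch, using made-up per-customer revenue figures, that measures what share of the total the top 20% of items account for:

```python
def top_share(values, frac=0.2):
    """Share of the total contributed by the top `frac` of items."""
    ranked = sorted(values, reverse=True)
    k = max(1, round(len(ranked) * frac))  # how many items make up the "top" group
    return sum(ranked[:k]) / sum(ranked)

# Hypothetical per-customer revenues, skewed Pareto-style.
revenues = [100, 80, 10, 5, 3, 2, 2, 1, 1, 1]
print(f"Top 20% of customers bring in {top_share(revenues):.0%} of revenue")
```

With these made-up numbers, the top two customers account for about 88% of revenue; running the same check on real ledger data shows how closely a business actually follows Pareto's split.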

With Big Data now ubiquitous and accessible to all, we are seeing this happen more and more. The emerging revolution and re-forming of the Information Technology industry is targeting the 82% of organizations that still lack visibility into line-of-business profitability.

The 18% who do have long since learned the powerhouse value of deep-dive access analysis. Government agencies, tax departments and the like now link our transactions and metadata to make economic decisions and ensure compliance. In the insurance industry, I learned this week from a discussion with Mike Plaxton, CEO at FWD Life, that their track record of achievement, with leap-frog market-share growth over the last three years, has been due to this style of unconventional approach. Adopting a hands-off model on what works, with systems to manage both the big-data parts and the whole, has let them stay in control while they press on with their market competitiveness and growth ambitions.

Businesses and large organizations know the latency of post-qualifying silo processes means getting results that way is far too slow. For example, it is no longer valid to wait for Finance to prepare reports to assure integrity. Even simple pervasive technology on an Android phone can link profit performance directly to product sales right at the cash-register stage. So integrated performance responsiveness is no longer just for elite, big-budget businesses.

However, as big-data business engines are now unstoppable, a paradigm shift is underway to move small-to-medium business thinking onto a big-data operating mode in order to survive. That means getting them to leave behind conventional indirect reporting methods, such as unconnected spreadsheets, that are costing them dearly.

Businesses that continue with these dinosaur methods, which lack direct responsiveness analysis, risk becoming extinct as competitors and markets outdate them.

The landscape of the business software market is also changing. The cash-cow days of “traditional” business-intelligence-based performance management tools are over. These products, with outdated and expensive lock-in licensing models, are now almost commodities; to survive they must merge into higher-level service models with either on-premises or cloud-managed delivery options.

Those that do will tap into the highly intuitive, multifaceted small-to-medium business market, which has completely different needs, including how it sees and manages things like supply-chain-based costing and delivery performance.

The spreadsheet, which for three decades has given power to the user, must also adapt, as it is all too often a limiter to scaling up. The vital business intelligence buried in these complex labyrinths of manipulated data loses its value in unsecured, unstructured reports.

For example, Excel's already powerful but misplaced link-to-data-sources function goes unexploited. As an integrated analytics function, its unstructured flexibility fails because people prefer to keep things simple and so miss the secondary benefit.

Almost everyone wants timely analysis reports delivered directly from their own transaction-data systems. Hence the all-too-dominant and highly inefficient disparate middleware processing, including the reporting role of standalone, hand-massaged Excel, is all but an extinct process. Facebook, Google and other social media companies that have become giants with enormous analytics value have also turned marketing on its ear. The challenge for the business software industry is adapting to these changes, making it all much more seamless and simple to own and use, with the big data able to be tamed and harnessed.

The professional services industry tied to this, which delivers the capability, also needs to change. Traditional slow consulting that “borrows your watch to tell you a new way to read it” has all but changed forever too. Rapid deployment methods with support-based models have been replacing traditional, long-gestation “system life cycle developments”.

The so-called journey to get end users up and running fast now takes days, not months. This previously time-consuming process is happening much faster, and often in a highly responsive live environment.

Business intelligence, so-called, is also no longer just a fancy term only big business understands. As high-quality insights and responsiveness have become the norm in live, interactive data dashboards delivered on hand-held devices, everyone, even the kids, is able to use them and understand how to get information of value.

As these changes give ever more visibility into who and what is delivering the best value, uncovering the secret of the elite mantra, the 19th-century efficiency concept of Vilfredo Pareto, will become part of everyday life.


Updated from a 20/80 hypothesis first published by Gordon Wood on May 4th, 2014
