The business intelligence project assembly line.

Business Intelligence Stack

In business intelligence projects, assembling the data is a large part of the job, whether you build an independent data warehouse or set one up within the business intelligence reporting software itself.

The Pareto principle, where 20% of the effort yields 80% of the value, does not work in business intelligence projects. You need to get 100% of the data right and cleaned up before it can be mapped and reported consistently.

Even cutting corners by building a departmental business intelligence solution may work for a while. But if it is not linked to the published finance results, it will soon be undermined and will eventually end up in the scrap bin of good ideas that did not fly, along with those who attempted to fly them.

Using an enterprise-level business intelligence project as a focus to clean up data is itself a great way to go. The benefits are most often immediate. In fact, such clean-up jobs rarely get done any other way, as the business continues to struggle on with hybrid decision processes that hinder progress.

Building a consistent data warehouse is in fact the real job in setting up business intelligence reporting. The problem is that cleaning up is often costly, and the common perception is that there is no problem, or that cleaning adds no value, so it gets left out of the value discussion.
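To make that cleaning work concrete, here is a minimal profiling sketch of the kind of pass that surfaces problems before anything is loaded into the warehouse. The record layout and field names are purely illustrative, not from any particular system:

```python
from collections import Counter, defaultdict

# A hypothetical extract from a source system; field names are illustrative only.
rows = [
    {"customer_id": "C001", "region": "EMEA", "revenue": "1200"},
    {"customer_id": "C002", "region": "emea", "revenue": ""},      # casing drift + missing value
    {"customer_id": "C001", "region": "EMEA", "revenue": "1200"},  # duplicate key
    {"customer_id": "C003", "region": "APAC", "revenue": "abc"},   # non-numeric amount
]

def profile(rows):
    """Return simple data-quality counts: duplicate keys, blanks, bad numbers, code drift."""
    ids = Counter(r["customer_id"] for r in rows)
    variants = defaultdict(set)
    for r in rows:
        variants[r["region"].upper()].add(r["region"])
    return {
        "duplicate_ids": sorted(k for k, n in ids.items() if n > 1),
        "blank_revenue": sum(1 for r in rows if not r["revenue"].strip()),
        "non_numeric_revenue": sum(
            1 for r in rows
            if r["revenue"].strip() and not r["revenue"].strip().lstrip("-").isdigit()
        ),
        # Region codes that differ only by case are a classic consistency problem.
        "case_variant_regions": {k: sorted(v) for k, v in variants.items() if len(v) > 1},
    }

print(profile(rows))
```

A report like this costs almost nothing to produce, and it turns the vague "our data is fine" conversation into a list of specific, countable defects.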

As a Finance or IT manager, how often have you heard, “We already have a good data warehouse and all we need is a business intelligence tool to do our reporting from it”?

As one who lives in both worlds, and in my case as a consultant too, that statement actually tells me to beware: do much more proving work on the business case, and more project due diligence, before locking in on contracts.

In the case of the CFO or Business Vice President championing the work, this should also be done before seeking the budget and letting procurement loose to find a vendor and a competent consultant. Finding the bad-data story after the fact, when the installed team and software are on deck, is far too late. No one cares by then, as they know you are between a rock and a hard place with little or no way out.

And if the business is performing well and the goal is faster decision information to maintain momentum, then time to delivery is seen as the most critical project driver. Things like consistency are then automatically assumed to be there, or must be dealt with regardless of other change-impact considerations. But of course, in this type of case, the problems that most often show up are in the enterprise’s source systems themselves. The paradigm shift then stalls, and the project, which may get off to a great start, quickly falters as these issues surface.

So it actually does take a great deal of resolve, process change management and teamwork to agree on consistent business rules at all levels to solve the issues. That is at the heart of it all, and it is what can trip you up if you don’t stay focused and get it under control. The rest of the work is really just technical, quite straightforward, and carries limited risk.

Hence in well-thought-through projects, where the value of cleaning is recognized and the correct value focus is brought to bear, it can pay for the project many times over. In such cases this should be recognized as a business benefit to be targeted, not left as a by-the-way, or left out as a buyer-beware tactic to get unknown issues solved and cost savings on the cheap.

And in the end, who cares which BI tool we use? The truth is, if you don’t use one you are foolish, as the disciplines they bring are alone worth the money. I should quickly add that this only applies if they are set up by people who actually know what they are doing. I have seen too many IT shelves with software still in its shrink-wrapped box, never opened. Or, when it is opened, it is badly used by installation novices. As one of my software vendor contemporaries said to me recently:

“If you don’t have a competent data management team included in your BI project, then I hope you don’t choose my software, as I don’t need the reputation.”

Too often we aim for the utopian state, to exploit what comes only after the hidden work is done. Getting there is often actually where the value is, as your conversations across the business sort out the issues in a more natural way. Seeing that work as a burden and a delay to the project is folly; it will invariably cause frustration and loss of focus, and may cause the project to falter or even fail.

More to the point, business leaders who provide budgets for this work, and who may also have been part of the evolution that, unknowingly or otherwise, created the issues, invariably underestimate what it will take to fix them. They must understand that it takes momentum and motivation to get the tough and dirty job of cleaning done, and that business intelligence is about their future, not just some fancy reporting process that sends emails on delinquent performance and helps cut the costs of doing things in spreadsheets.

As sponsors entitled to see visible progress towards the end-game solution they approved the budget for, they should call project managers to account for bringing to attention any value that gets the money back earlier than expected. Simply enforcing standards and making data process improvements before the project is even completed will deliver this.

Hence the value lies in understanding the secondary benefit of cleaning up data, and continually selling the value of the process it takes to get that work done is vital.

It is not all just about setting up dashboards and dials to help focus and understand the data, but also about having consistent data with universal acceptance and integrity. This combination in turn allows business intelligence to be used to create an intelligent business.

For many of us doing this, do we really need to rethink our mission and how we manage?



A related post, Failing-address-data-quality-and-consistency, makes some very key points:

Don’t fall into these traps, and don’t assume anything about the state of the data. These are the areas where data quality and inconsistency problems lurk:

  • Data quality within systems-of-record applications may be “masked” by corrections made within reports or spreadsheets created from this data. The people who told you the data is fine might not even be aware of these “adjustments.”
  • Data does not age well. Although data quality may be fine now, there’s always the chance of problems or inconsistencies with the historical data. Problems can also arise when applications like predictive analytics need to use historical data.
  • Data quality may be fine within each systems-of-record application, but may be very inconsistent across applications. Many companies have master data inconsistency problems with product, customer and other dimensions that will not be apparent until the data is loaded into the enterprise data warehouse.
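That last point, master data drifting apart across applications, can be checked cheaply before anything is loaded. Here is a minimal reconciliation sketch; the two source systems and all keys and names are hypothetical:

```python
# Customer master as held by two hypothetical systems-of-record.
billing = {"C001": "Acme Ltd",  "C002": "Globex Corp", "C004": "Initech"}
crm     = {"C001": "Acme Ltd.", "C002": "Globex Corp"}  # note the trailing dot on Acme

def reconcile(a, b):
    """Report keys missing from either side and keys whose descriptions disagree."""
    return {
        "only_in_first":  sorted(a.keys() - b.keys()),
        "only_in_second": sorted(b.keys() - a.keys()),
        "mismatched":     sorted(k for k in a.keys() & b.keys() if a[k] != b[k]),
    }

print(reconcile(billing, crm))
```

Even a tiny mismatch like “Acme Ltd” versus “Acme Ltd.” will split one customer into two rows in the warehouse, which is exactly the kind of inconsistency that stays invisible until the consolidated reports disagree with finance.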

7 thoughts on “The business intelligence project assembly line”

  1. There are certainly a lot of details like that to take into consideration. That is a great point to bring up. I offer the thoughts above as general inspiration, but clearly there are questions like the one you bring up where the most important thing will be working in honest good faith. I don’t know if best practices have emerged around things like that, but I am sure your point is fair game.

  2. Interesting discussion of the quality of data and the need to clean up the data before employing any business intelligence. Thanks for touching on some of the managerial aspects of it.

    Indeed, cleaning data to obtain quality data is important, as performing high-dimensional analysis requires consistency, availability and relevancy of data across multiple dimensions (the curse of dimensionality).

    Yet these are not all the considerations in data clean-up. One might also expect to be frustrated when integrating data from different sources (flat files, Excel, SQL) or with data in formats requiring different regular expressions.

    The best idea, nonetheless, is to be intimately familiar with one’s data, and always to plan ahead before deciding on a BI project.
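    [Editor's note: the commenter's point about differing source formats can be illustrated with a small sketch. The date layouts and the normalising regexes below are assumed examples, not from any specific project:]

```python
import re

# Hypothetical date strings as they might arrive from SQL dumps, Excel exports and flat files.
raw = ["2024-03-01", "01/03/2024", "1 Mar 2024"]

MONTHS = {"jan": 1, "feb": 2, "mar": 3, "apr": 4, "may": 5, "jun": 6,
          "jul": 7, "aug": 8, "sep": 9, "oct": 10, "nov": 11, "dec": 12}

def normalise(s):
    """Coerce a few common date layouts to ISO yyyy-mm-dd; return None if unrecognised."""
    if re.fullmatch(r"\d{4}-\d{2}-\d{2}", s):                # already ISO (SQL style)
        return s
    m = re.fullmatch(r"(\d{1,2})/(\d{1,2})/(\d{4})", s)      # dd/mm/yyyy (Excel style)
    if m:
        d, mo, y = m.groups()
        return f"{y}-{int(mo):02d}-{int(d):02d}"
    m = re.fullmatch(r"(\d{1,2}) ([A-Za-z]{3}) (\d{4})", s)  # 1 Mar 2024 (flat-file style)
    if m:
        d, mon, y = m.groups()
        return f"{y}-{MONTHS[mon.lower()]:02d}-{int(d):02d}"
    return None

print([normalise(s) for s in raw])
```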


    1. Hi Timothy

      Thanks for your very insightful comment. Yes, I agree it is not all beer and skittles, especially in the areas you allude to when considering varied and often cobbled-together sources.

      One thing that occurs to me as I think about your comment is adding new sites, new channels and so on. This can be a bit of a nightmare if you don’t have a good base with a multilevel design. And of course growing businesses tend to do that with great regularity. When they do, the data you get is always a mystery bag of varied sources, in structures that never fit. It can also often be hostile in terms of being open to collection as you try to get it under control so investment is not lost.

      That presents a whole new set of issues that may not have been thought through in the design at the get-go. Failure there may see the BI system go the way of many before it, in favor of a new vendor who is willing to take on the issues. Hence getting a good design is vital to enable growth, not constrain it. I believe data, no matter what the source, should be able to be squeezed into a BI system, at the very least at a consistent summary level, until the back-end detail feeders can catch up.

      I am sure you would have some deeper experience in this arena which I would love to hear more about.

      Best wishes


