In business intelligence projects, assembling the data is a large part of the work, whether you build an independent data warehouse or set one up within the business intelligence reporting software itself.
The Pareto principle of focusing on the 20% of effort that yields 80% of the value does not work in business intelligence projects. You need 100% of the data right and cleaned up before it can be mapped and reported consistently.
Cutting corners with a departmental business intelligence solution may work for a while. But if it is not linked to the published finance results, it will soon be undermined and will eventually end up in the scrap bin of good ideas that did not fly, along with those who attempted to fly them.
Using an enterprise-level business intelligence project as a focus for cleaning up data is itself a great way to go. The benefits are most often immediate. In fact, those clean-up jobs almost never get done any other way, as the business continues to struggle along with hybrid decision processes that hinder progress.
Building a consistent data warehouse is in fact the real job in setting up business intelligence reporting. The problem is that cleaning up the data is often costly, and most people perceive either that there is no problem or that cleaning adds no value, so it gets left out of the value discussion.
As a Finance or IT manager, how often have you heard, “We already have a good data warehouse; all we need is a business intelligence tool to do our reporting from it”?
As someone who lives in both worlds, and in my case as a consultant too, that statement actually tells me to beware: do much more work proving the business case, and more project due diligence, before locking in contracts.
In the case of a CFO or business vice president championing the work, this should also be done before seeking the budget and letting procurement loose to find a vendor and a competent consultant. Discovering the bad-data story after the fact, when the team and software are installed and on deck, is far too late. No one cares by then; they know you are between a rock and a hard place with little or no way out.
And if the business is performing well and the goal is faster decision information to maintain momentum, then time to delivery is seen as the most critical project driver. Things like consistency are then either assumed to exist or expected to be dealt with regardless of other change-impact considerations. But in this type of case, the problems that most often show up are in the enterprise’s source systems themselves. The paradigm shift stalls, and the project, which may get off to a great start, quickly falters as these issues surface.
So it actually does take a great deal of resolve, process change management and teamwork to agree on consistent business rules at all levels and solve the issues. That is at the heart of it all, and it is what can trip you up if you don’t stay focused and keep it under control. The rest of the work is really just technical, and is quite straightforward with limited risk.
Hence, in well-thought-through projects where the value of cleaning is recognized and the correct value focus is brought to bear, it can pay for the project many times over. In such cases it should be recognized as a business benefit to be targeted, not left as an afterthought or omitted as a buyer-beware tactic to get unknown issues solved and cost savings on the cheap.
And in the end, who cares which BI tool we use? The truth is, if you don’t use one you are foolish, as the disciplines they bring are alone worth the money. I should quickly add that this only applies if they are set up by people who actually know what they are doing. I have seen too many IT shelves with software still in its shrink-wrapped box, never opened. Or, when it is opened, it is badly used by installation novices. As one of my software vendor contemporaries said to me recently:
“If you don’t have a competent data management team included in your BI project, then I hope you don’t choose my software, as I don’t need the reputation.”
Too often we aim for the utopian state, to exploit what comes only after the hidden work is done. Getting there is often where the value actually lies, as your conversations across the business sort out the issues in a more natural way. Seeing this work as a burden and a delay to the project is folly; doing so will invariably cause frustration and loss of focus, and may cause the project to falter or even fail.
More to the point, business leaders who provide budgets for this work, and who may also have been part of the evolution that, unknowingly or otherwise, created the issues, invariably underestimate what it will take to fix them. They must understand that it takes momentum and motivation to get the tough, dirty job of cleaning done, and that business intelligence is about their future, not just some fancy reporting process that sends emails on delinquent performance and helps cut the cost of doing things in spreadsheets.
As sponsors entitled to see visible progress toward the end-game solution they approved the budget for, they should call project managers to account for highlighting any value that gets the money back earlier than expected. Simply enforcing standards and making data process improvements before the project is even completed will deliver this.
Hence, understanding the secondary benefit of cleaning up the data, and continually selling the value of the process it takes to get that work done, is vital.
It is not just about setting up dashboards and dials to help focus on and understand the data, but also about having consistent data with universal acceptance and integrity. This combination in turn allows business intelligence to be used to create an intelligent business.
For many of us doing this work, we really do need to rethink our mission and how we manage.
A related post, Failing-address-data-quality-and-consistency, makes some very key points.
Don’t fall into these traps. Don’t assume anything about the state of the data. These are the areas where data quality and inconsistency problems lurk:
- Data quality within systems-of-record applications may be “masked” by corrections made within reports or spreadsheets created from the data. The people who told you the data is fine may not even be aware of these “adjustments.”
- Data does not age well. Even if data quality is fine now, there is always a chance of problems or inconsistencies in the historical data. Problems can also arise when applications like predictive analytics need to use that historical data.
- Data quality may be fine within each systems-of-record application, yet be very inconsistent across applications. Many companies have master data inconsistency problems with product, customer and other dimensions that will not be apparent until the data is loaded into the enterprise data warehouse.
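The cross-application inconsistency in that last point can be surfaced long before the warehouse load with a simple reconciliation check once extracts from the source systems are staged side by side. Here is a minimal sketch in Python; the systems (“billing”, “crm”), field names and records are purely hypothetical, not from any particular product:

```python
# Minimal sketch: reconcile customer master data pulled from two
# hypothetical systems of record. All names and fields are illustrative.

def find_mismatches(system_a, system_b, key="customer_id"):
    """Return (key, differences) pairs for records whose shared key
    exists in both extracts but whose other attributes disagree."""
    index_b = {row[key]: row for row in system_b}
    mismatches = []
    for row in system_a:
        other = index_b.get(row[key])
        if other is None:
            continue  # key missing in system B: a separate gap to report
        diffs = {f: (row[f], other.get(f))
                 for f in row
                 if f != key and row.get(f) != other.get(f)}
        if diffs:
            mismatches.append((row[key], diffs))
    return mismatches

# Illustrative extracts: the same customer spelled two different ways.
billing = [{"customer_id": 1, "name": "Acme Ltd", "country": "UK"}]
crm = [{"customer_id": 1, "name": "ACME Limited", "country": "UK"}]

for cid, diffs in find_mismatches(billing, crm):
    print(cid, diffs)
```

A report like this, run early against staged extracts, turns the abstract “master data inconsistency” risk into a concrete list of records for the business to agree rules on.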