The formula for BI failure
In our line of work, we have repeatedly encountered organizations that spent millions implementing self-service BI tools, only to end up not using them as expected.
Encouraging a culture of self-sufficient data exploration and ensuring data-driven decision making wherever possible are among the key objectives behind such an investment. A fully functional tool is also expected to significantly reduce the time people spend creating routine reports and to cut down on human error.
When an implemented tool is not used as expected, organizations typically take one of two routes.
Organizations with money to spare – invest in another tool, hoping for the best
Organizations with tighter budgets – pause the project indefinitely
When we investigated why this happens, we found three very simple mistakes in the process.
Tool misfit – How an organization analyzes data is part and parcel of its culture. How the leaders like to see and evaluate information becomes the reporting practice of the organization. Some leaders prefer digging deep into data and viewing information in tabular or numerical formats, while others prefer visualizing data graphically, seeing relationships at an abstract level to make quick calls. BI tools differ in the same way: some are built with stronger data-crunching capabilities, while others have excellent graphical features. We compare the features and strengths of two key tools in a separate article. Understanding your organization's way of interpreting data is essential before finalizing a tool.
Tool implementation process ignores the business users – Business users know best what information they need for decisions and reports, how frequently it should be updated, and at what level of summarization. If they are not consulted during implementation, the data the tool captures will be of minimal use for building visualizations on top of it. As a result, we have seen most business users stuck with the old way of producing reports, mostly manually, while a perfectly capable tool sits idle.
No proper user training – Again, this comes down to disregarding the actual users. Not investing enough time in training users, both formally and on the job, results in poor user migration. User training needs to be a fun, pressure-free activity carried out over a considerable period, during which users are encouraged to use the tool of their own accord rather than on order.
If your organization has already made any of the above mistakes and would like to know how to rectify them, we can suggest the following, short of replacing the tool.
If the mistake is inadequate training, start a training program. You could begin with one or two days of hands-on classroom training, but what matters more is getting users to practice with the tool on the job, hand-holding them for a period through issues that arise while they develop their reports, and, most importantly, allocating sufficient time formally rather than on an "after everything else is done" basis.
Developing their own reports from day one can be a daunting task. You could start by having some of your key reports professionally designed and developed, so that your people can begin "using" the tool, then migrate towards modifying those reports, and finally towards developing reports from scratch.
Re-design and re-develop how your data is connected to the tool, this time with insights from your business users.