The terms ‘dashboard’ and ‘scorecard’ have been used interchangeably since their inception, and the same confusion surrounds the use of a dashboard for reporting versus performance management.
To clarify, a scorecard is a dashboard with an integrated performance management methodology – such as the Balanced Scorecard or SPI.
A scorecard is also used as a reporting snapshot of performance, using symbols and icons to summarise the status of progress across multiple perspectives.
This differs greatly from a performance dashboard, which relies more heavily on graphs and tables than on icons. A performance dashboard focuses on trends and pattern detection rather than the current status communicated by a scorecard. That is not to suggest that icons and conditional formatting are never used in a dashboard to alert users to specific areas of concern; however, its main communication devices are more graphical in nature than a scorecard's.
Dashboards focus more on key performance indicators (KPIs) than on the simple metrics more often used in scorecards. A KPI is a metric plus a target or other comparative. Such targets or benchmarks serve as indicators of a desired state against which performance is measured. KPIs typically have final as well as interim targets, such as monthly sales targets. Where tolerances are set, these may also be coupled with alert functions that activate when a tolerance threshold is crossed.
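As a minimal sketch of the KPI idea above – a metric plus a target, with tolerance thresholds driving alerts – a status function might look like the following. The 5% and 15% thresholds and the traffic-light labels are illustrative assumptions, not a prescription:

```python
def kpi_status(actual: float, target: float,
               warn: float = 0.05, alert: float = 0.15) -> str:
    """Classify a KPI as green/amber/red by its shortfall from target.

    `warn` and `alert` are illustrative tolerances: a shortfall within
    5% of target stays green, within 15% turns amber, beyond that red.
    """
    shortfall = (target - actual) / target
    if shortfall <= warn:
        return "green"
    if shortfall <= alert:
        return "amber"
    return "red"

# e.g. an interim monthly sales target of 100,000 against actual sales
print(kpi_status(98_000, 100_000))  # → green
print(kpi_status(92_000, 100_000))  # → amber
print(kpi_status(80_000, 100_000))  # → red
```

In practice the thresholds would be agreed per KPI as part of the business rules, and the "red" state would trigger the alert function rather than simply return a label.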
Most dashboard portfolios start out as tactical management reporting dashboards. From here, the lead management team typically requests a more performance-oriented dashboard. This triggers a conversion of existing reporting dashboards into performance dashboards – incorporating change management in both culture and data presentation semantics.
Performance dashboards are extremely contagious – once users recognise the difference between reporting and performance management they demand more and more predictive capability to identify new trends and patterns in their strategic performance and business operations. Using predefined drill paths based on logical dimensions, users can explore underlying detailed data and run the reports they require to meet individual needs.
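A predefined drill path of the kind described above can be sketched as an ordered list of dimension levels, with each drill step aggregating the measure at the next level down. The dimensions and sample figures here are invented purely for illustration:

```python
# Illustrative drill path over a logical dimension hierarchy.
DRILL_PATH = ["region", "country", "city"]

# Invented sample facts; a real dashboard would query a cube or warehouse.
sales = [
    {"region": "EMEA", "country": "UK", "city": "London", "amount": 120},
    {"region": "EMEA", "country": "UK", "city": "Leeds",  "amount": 80},
    {"region": "EMEA", "country": "DE", "city": "Berlin", "amount": 95},
]

def drill(rows, **filters):
    """Aggregate `amount` at the next level of the drill path.

    The number of filters already applied sets the current depth:
    no filters -> totals by region; region fixed -> totals by country.
    """
    level = DRILL_PATH[len(filters)]
    rows = [r for r in rows if all(r[k] == v for k, v in filters.items())]
    totals = {}
    for r in rows:
        totals[r[level]] = totals.get(r[level], 0) + r["amount"]
    return totals

print(drill(sales))                 # → {'EMEA': 295}
print(drill(sales, region="EMEA"))  # → {'UK': 200, 'DE': 95}
```

Because the path is predefined, users explore detail along sanctioned routes rather than composing arbitrary queries – which is what keeps self-service exploration governable.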
Getting To Go
The hardest parts in developing a dashboard are:
Gaining consensus on KPI definition – often the selection of KPIs meets little resistance, until one attempts to gain agreement on the calculation and business rules. Further along the dashboard roadmap, KPI hierarchies must be defined to identify relevant KPI relationships, parent-child associations and contributing indicators. KPIs link the business infrastructure with the technical infrastructure – defining the cubes that are built and the business rules incorporated into application systems and databases. Remember that KPIs have a lifecycle – review dashboards regularly to ensure all KPIs are still relevant, and that hierarchies and relationships still support the current business structure.
Design – second only to KPI definition, users attempt to drag along outdated design concepts from past reporting dashboards. Taking users through a brief seminar on performance dashboard design concepts goes a long way towards overcoming this hurdle. A requirement for too many KPIs on a single view can be accommodated using small-multiple graphs or by separating the dashboard into a tabbed series – focusing relevance on the most important KPIs. Simplicity is the overriding design element in all dashboards. When designing a dashboard, use agile, iterative cycles that bring IT and the business together. In just about every dashboard project I do, there is at least one requested KPI for which no data is available. IT's assistance in identifying this early on is paramount to the overall deliverability of the dashboard project.
Data – gaining access to accurate data, and integrating it into a data store, seems simple enough. On average, a dashboard will pull data from eight different sources, including relational data warehouses, packaged applications such as financial, CRM and HR systems, Excel spreadsheets, third-party APIs and data marts. In spite of the range of sources, the amount of data required to support performance dashboards is relatively small compared with the average size of a data warehouse. Most contain less than 50 GB of data; only around 25% store up to 250 GB.
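As a minimal sketch of that integration step, the fragment below lands extracts from two invented source systems into a single SQLite store that a dashboard could query. The source names, metric names and values are all hypothetical:

```python
import sqlite3

# Hypothetical extracts from two of the many possible source systems.
crm_extract = [("open_pipeline", 1_250_000.0), ("deals_won", 42.0)]
finance_extract = [("monthly_revenue", 310_000.0), ("operating_cost", 190_000.0)]

# Land everything in one store, tagged by source, for the dashboard to query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (source TEXT, name TEXT, value REAL)")
for source, rows in (("crm", crm_extract), ("finance", finance_extract)):
    conn.executemany(
        "INSERT INTO metrics VALUES (?, ?, ?)",
        [(source, name, value) for name, value in rows],
    )
conn.commit()

row_count = conn.execute("SELECT COUNT(*) FROM metrics").fetchone()[0]
print(row_count)  # → 4
```

A production pipeline would of course use proper ETL, data quality and metadata tooling rather than ad-hoc inserts, but the shape is the same: many heterogeneous sources, one queryable store.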
The value of real-time data only applies to businesses whose activity cycles run in minutes rather than days or weeks. The refresh cycle of the data feeding a dashboard depends upon the unique needs of each business. I often meet requirements for real-time or hourly latency when daily or even weekly refreshes would more than meet the needs of the business.
Most metrics are refreshed on a daily basis – only high-volume sales organisations and large manufacturing plants require real-time or near-real-time data.
Planning the Dashboard Infrastructure
Regardless of how many business groups are defining dashboards, plan to develop all dashboards on a single platform that leverages a unified data integration infrastructure.
This data infrastructure should incorporate ETL tools appropriate for BI use, as well as data quality and metadata management tools and processes.
By incorporating check-in/check-out and version control, the BI team can maintain separate development, testing and production environments in support of multiple concurrent dashboard projects.
When defining the dashboard portfolio and user groups, plan for rapid growth. Expect the initial footprint to be 15-20% larger than initially forecast. Following initial deployment, plan at a minimum for a 20% increase in users, a 15% increase in queries, and four to five new data sources every year.
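Those rules of thumb can be turned into a rough capacity projection. This sketch applies the 20% initial buffer to users only and compounds the yearly rates, which is one reasonable interpretation of the figures above rather than a definitive model:

```python
def plan_capacity(users: int, queries: int, sources: int, years: int = 3,
                  initial_buffer: float = 0.20, user_growth: float = 0.20,
                  query_growth: float = 0.15, new_sources_per_year: int = 5):
    """Project platform load from the rules of thumb above (illustrative)."""
    u = users * (1 + initial_buffer)       # initial footprint buffer
    q = float(queries)
    for _ in range(years):
        u *= (1 + user_growth)             # +20% users per year
        q *= (1 + query_growth)            # +15% queries per year
        sources += new_sources_per_year    # 4-5 new sources per year
    return round(u), round(q), sources

# Hypothetical starting point: 200 users, 10,000 queries, 8 sources.
print(plan_capacity(200, 10_000, 8))  # → (415, 15209, 23)
```

Even a crude projection like this is useful when sizing licences, hardware and the integration backlog before the portfolio starts to grow.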