Application Use Case
Each application should be evaluated in terms of its primary use case: reporting, analysis or audit. Each should then be restructured to provide optimum performance for that use case.
Best Practice
Data marts should have a relatively short lifespan, one which corresponds primarily to the dynamism of the business environment or to the analysis required to achieve a specific set of business objectives. In this way users will get optimum performance.
Multidimensional cubes should not be designed to ‘boil the ocean’, nor, at the other extreme, only to support very specific reports. Instead they should retain enough information to remain properly useful with a one-year lifespan in mind.
Report, Analyze or Audit?
When selecting a reporting tool or optimizing a data mart, I have found that users are looking for three broad categories of output from their queries. There are lots of ways to break these categories down, but I think of them as three different types of queries in terms of what the end users actually do.
Report
When I speak of a report, I mean that the user wants to see data in a familiar format. Any time a user asks for a particular number in a particular place, and he has an idea of what that number is going to be, he is asking for a report. A report in the traditional sense means headers, rows and columns, but that is not necessarily the key. The key is that a report can almost always be consumed at a glance, and the key parts of that report are known in advance.
If you set a piece of paper with numbers on it in front of a manager, it is a standard report. That manager will look at one number and then another and rather quickly come to a decision about whether things are OK or out of shape. When a manager is accustomed to a report, this generally happens so fast that we may assume they are not paying attention. What I have found is that they are actually making several decisions based upon what they see on the page, and in certain cases it may lead them to go to another page. That is almost considered a 'drill down', and it is often mistaken for analysis. But it is not; rather, it is a related report or a detailed report. In either case, the consumer of the report knows where they are going, and the answer they are seeking is generally of the sort: good, bad or OK.
That is why I consider dashboards to be reports. More specifically, stoplight charts, thermometers, line graphs, trend bars and the like are all reports. They may be interactive or they may be fixed, but in the end what you come up with is a value judgment on a particular part of the business. The difficulty is that it may or may not be a useful part of the business, and it might not tell you nearly enough. But you need reports to be dependable, reliable and formally defined. When you get down to fonts and colors and positions on the page, you're definitely in report-land.
Audit
Audits are detailed reports that match data up against something else. The end result of an audit is increased confidence in the quality of the data. This is something that managers are not generally interested in, but may come to appreciate. Audits are all about reconciliation.
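To make the reconciliation idea concrete, here is a minimal sketch in Python using pandas (the table and column names are hypothetical, chosen only for illustration). It rolls up the detail loaded into a warehouse fact table and matches it against control totals from the source system; any nonzero variance is the audit exception.

```python
import pandas as pd

# Hypothetical extract of detail rows loaded into the warehouse fact table.
fact = pd.DataFrame({
    "account": ["4000", "4000", "5100", "5100"],
    "amount":  [1200.00, 800.00, 450.00, 325.00],
})

# Hypothetical control totals reported by the source (operational) system.
control = pd.DataFrame({
    "account": ["4000", "5100"],
    "amount":  [2000.00, 780.00],
})

# Reconcile: sum the warehouse detail and match it against the controls.
loaded = fact.groupby("account", as_index=False)["amount"].sum()
recon = loaded.merge(control, on="account", suffixes=("_warehouse", "_source"))
recon["variance"] = recon["amount_warehouse"] - recon["amount_source"]

# Any nonzero variance is an audit exception to chase down.
print(recon[recon["variance"] != 0])
```

The point is not the code but the shape of the query: detail on one side, a trusted total on the other, and a variance column in between.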
The difficulty with audits is that you are generally looking into minutiae that don't drive the business. It's more like counting fish than weighing them. Audits are particularly useful in developing an operational understanding of the business. So you will find that DW requirements that allow for ad hoc queries into deep detail are generally fresh to the business, and/or reflect a disconnect between the representations of the data in more abstracted reports and the people who are close to that part of the business.
You will often find a disconnect in requirements generation between MOLAP and ROLAP, or multidimensional and relational technology, over the question of audit. Users who are accustomed to looking at simple green-bar data, making successive filters on queries and storing those queries, are thinking in the Audit paradigm. The idea of aggregating 10 million rows upon load into small aggregate dimensions may be alien to them. I've also found a reliable indication of the Audit mindset when I get a requirement to show the metadata behind the calculated metrics.
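To make that aggregate-upon-load step concrete, here is a minimal Python sketch (hypothetical dimension and measure names; pandas assumed available) of collapsing detail rows to the small dimensional intersections a cube would actually serve:

```python
import pandas as pd

# Hypothetical detail feed; in practice this would be millions of rows.
detail = pd.DataFrame({
    "region":       ["East", "East", "West", "West"],
    "product_line": ["Widgets", "Widgets", "Gadgets", "Widgets"],
    "month":        ["2005-01", "2005-01", "2005-01", "2005-02"],
    "sales":        [100.0, 150.0, 200.0, 75.0],
    "units":        [10, 15, 20, 5],
})

# Aggregate upon load: collapse the detail to the dimensional
# intersections the cube will serve. The row-level identity (invoice,
# SKU, clerk) disappears here, which is exactly what an audit-minded
# user objects to.
cube = (detail
        .groupby(["region", "product_line", "month"], as_index=False)
        .agg(sales=("sales", "sum"), units=("units", "sum")))
print(cube)
```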
Audit detail is always going to be a requirement, but it is one that diminishes over time as the business implications of the operational data become clear and the feedback process is institutionalized. In the meantime, get ready to audit individual employee hours, individual journal entries, individual SKU counts and the like.
Analysis
Analysis differs from reporting and auditing in that you don't know exactly what your analysis is going to yield before you begin. A multidimensional analysis is configured to draw the user through a series of dimensional factors which contribute to the size of a particular metric. That is to say, it is a continuous disaggregation process which gives the user an idea of the causal factors. Ideally, analysis is the reverse of planning. But in any case, an analysis is not a result but a process.
Most generally, analysis leads one to exceptions and causalities. So exception reports and alerts can be said to be analytical, but only to the extent that they indicate multiple contributors. In any case, an analysis need not involve the user drilling into deep detail if the implications of those details are known. Metrics can be established that reveal deep exceptions at a summarized level.
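As a sketch of such a metric (the names and the 10% threshold are assumptions for illustration), a summarized exception flag might be computed like this, so that deep detail only has to be inspected when the summary trips the threshold:

```python
import pandas as pd

# Hypothetical weekly summary by store; the detail behind it is not loaded.
summary = pd.DataFrame({
    "store":  ["S01", "S02", "S03"],
    "actual": [98000.0, 61000.0, 45000.0],
    "plan":   [100000.0, 80000.0, 44000.0],
})

# Exception metric: percent variance from plan. The assumption is that a
# store far enough under plan conceals a detail-level problem worth
# drilling into; everything else can be read at a glance.
summary["pct_var"] = (summary["actual"] - summary["plan"]) / summary["plan"]
print(summary[summary["pct_var"] < -0.10])
```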
Analysis requires a known framework. You have to know, generically, what kinds of exceptions and correlations you are going to expose through your analytical process, or else your results may not be actionable. You can end up navigating through huge heaps of data which give you indications, and then all you can do is say 'so what?'. That kind of analysis is exploratory, but it is not often a good use of IT resources unless you are indeed mining data.
Data mining is analysis of data that allows you to make connections. These connections should first be incorporated into your planning process if you intend to do something about them.
Batch vs. Interactive
Hyperion has a set of tools which are optimized for producing batch reports. Using Hyperion Production Reports (PR), highly specific and optimized reports from relational and/or multidimensional sources can be published and distributed via email in many forms, including PDF. When there are books of reports to be produced on a daily or weekly basis, this can be a far more efficient use of compute resources than generating the same reports via custom methods like VBA.
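For contrast, here is roughly what the custom alternative looks like: a minimal Python sketch of batch-distributing one already-rendered report by email. The file name, addresses and SMTP host are all hypothetical, and none of this is Hyperion's API; a production-report tool does this same work configuration-driven, across whole books of reports.

```python
import smtplib
from email.message import EmailMessage

# Hypothetical values: a report already rendered to PDF by an upstream
# step, and an internal SMTP relay.
REPORT_PATH = "daily_sales.pdf"
SMTP_HOST = "mail.example.com"

msg = EmailMessage()
msg["Subject"] = "Daily Sales Report"
msg["From"] = "reports@example.com"
msg["To"] = "managers@example.com"
msg.set_content("Attached is the daily sales report.")

# Attach the rendered PDF.
with open(REPORT_PATH, "rb") as f:
    msg.add_attachment(f.read(), maintype="application",
                       subtype="pdf", filename=REPORT_PATH)

# Hand the message to the relay for distribution.
with smtplib.SMTP(SMTP_HOST) as server:
    server.send_message(msg)
```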
Within the context of migrating to an officially supported version of the Hyperion suite, IT should take advantage of product features rather than custom programming wherever the supported Hyperion version has the same capabilities.