Sessionize is a term I haven't heard or used in 8 years. Evidently it's now done nice and neat.
Back in '01, and actually before then, I was working with the Hyperion guys over at eCRM, and one of the things we had to do for our new product was build in a sessionizing function. That product was Website Analysis, many years ahead of its time.
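To make "sessionizing" concrete for anyone who hasn't done it: the core of such a function is just grouping a user's page hits into sessions, cutting a new session after a gap of inactivity. Here's a minimal sketch, assuming a conventional 30-minute timeout and input as (user, timestamp) pairs; none of this is the original eCRM code, just the idea.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of a sessionizer: group each user's page hits into
# sessions, starting a new session after 30 minutes of inactivity.
SESSION_TIMEOUT = timedelta(minutes=30)

def sessionize(hits):
    """hits: iterable of (user_id, timestamp) tuples, in any order.
    Returns {user_id: [[timestamps of session 1], [session 2], ...]}."""
    sessions = {}
    for user, ts in sorted(hits, key=lambda h: (h[0], h[1])):
        user_sessions = sessions.setdefault(user, [])
        if user_sessions and ts - user_sessions[-1][-1] <= SESSION_TIMEOUT:
            user_sessions[-1].append(ts)   # within the gap: same session
        else:
            user_sessions.append([ts])     # inactivity gap: new session
    return sessions
```

That's the whole trick; the hard part back then was doing it at scale against raw web logs, which is exactly the kind of thing Hadoop gets used for now.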
I have a moment to study and so I'm going to see if my brain can actually handle Hadoop. So this is the first video I found. I get it.
Now that I'm with a serious and world-class Oracle consulting shop, I get to vent my bile about all of the Oracle stuff I've had to deal with over the years. There are lots of reasons to dislike Oracle. If you're IBM, you dislike them because old DB2 heads with VLDB experience on mainframes want to outperform on the DW side, and Oracle clutters up the joint with all their application stuff and client-server stuff. If you're an Essbase guy, you dislike them because they're Oracle and you don't get a shot at OLAP until their star schemas fail. "If Oracle can't do it, it must not be possible."
Well, I've had to deal with the latter for a while, but along the way I had plenty of success, especially when I got a chance to do a trial and show off Essbase with the customer's own data in under a week. This was my specialty, and I never lost a trial. Not coincidentally, there were two products I was always able to slam: nVision, the viewer for PeopleSoft data, and Oracle Discoverer, which was essentially what we at Hyperion considered the product competitive to Cognos PowerPlay and what is now Hyperion Web Analysis.
For the first time in my life, I actually got to see Oracle Discoverer, and I am pleased to say that I get it. In fact, I've been able to decode a good deal of the OBIEE product offering with a few hours' exposure to Discoverer and Daily Business Intelligence, aka DBI. It is rather what I expected it to be, which is a web-based version of pretty much what Hyperion does with "IR" aka Interactive Reporting, aka Hyperion Intelligence, aka Brio Query. The advantage Discoverer has is that it can use a fairly vast library of materialized views in the bowels of every Oracle App. In other words, it seems to have a massive advantage over that thing I used to know as the BO Universe.
What nobody has ever done, and probably needs to start doing like NOW, is get some Essbase heads like me, for whom MDBMS design is second nature, together with some Oracle Apps people who understand those things they call Application Responsibilities (or is it Business Areas) - basically that huge tangle of Oracle Apps tables from which a Discoverer guy would cobble together those reports that Discoverer does well. It would be brilliant. I can't tell you how long it's going to take for me to get the time to do this, but when it happens, watch out.
What Oracle has apparently done is take the tactic of putting together a massive pile of templateable views and tables so that super users can assemble a fearsome array of simple reports into a big fat pile of web-based reporting objects. In other words, they have reverse-engineered by brute force the sum total of their best guess of what people want to see, made it relatively easy to point Discoverer at the metadata, and thus made all their apps reportable. What nobody has done is come at this from a multidimensional design perspective - true OLAP - and gone at the data in bigger chunks applicable to a true hub-and-spoke datamart strategy. The result is that you have more reports than anybody could possibly want, but still a very high cost of ownership to get the very specific custom query spaces that real customers eventually want once they get over the fact that they have BI.
Obviously the solution is Essbase, and when I (or somebody like me) can get enough people in the Oracle world to see this, the disco balls will sparkle and it will be a whole new ballgame. I'm bold enough to say that all of the analytical goodness of 'Siebel Analytics' or whatever that technology is called (maybe Noetix views?) does not hold a candle to the pure power of the Essbase engine. The only advantage it has is that, per the tactics above, it is already in place and ready to generate queries from Heck. But my nickel says there's no way anything pulling from materialized views, or even cached versions of those views, is going to be as fast as Essbase cubes.
My gut says that as Fusion matures, Oracle is going to allow customers to have it their way. That means they will build architectural paths through Essbase as well as paths around it. Sales guys will sell what they know how to sell. But the customer who understands how to build a warehouse from the mart backwards is going to have a huge advantage if they take the Essbase path. Why? Because Essbase was talking to desktop-based widgets three years ago. All the API stuff is done, debugged, and proven. Little Hyperion only exploited this in two or three major applications. Big Oracle can do it everywhere.
For those of you who are new to Essbase but familiar with DW, let me try to be concise: Essbase is a fully functional MDBMS with multi-user concurrent read/write access and security down to the cell level. It scales. It partitions. It parallelizes. It calculates forwards and backwards. It allows input at any level of any hierarchy, and I know a company (which I happen to work for) that is chock full of people with 7-plus years of experience designing and building this monster. It is quite simply THE survivor of the OLAP wars, and it is now in version 11, a bona fide Oracle property.
There's one catch, of course. Like Teradata VLDB experts, people who know it aren't cheap. Then again, we're not stupid either. The biggest problem for a guy like me with 12 years of Essbase is that most customers don't dream big enough to keep me awake. So I'm kind of on a mission to get to the point where I get to do some interesting-sized projects.
My bottom line is this, and if you think about it, it should make perfect sense. If you can go to Google Finance and get your publicly traded company's stock price and topline financials faster and easier than you can from your internal systems, you are doing something radically wrong. The news is that the BI of the future is here today - it's just kind of in various pieces in the massive Oracle product stack and in the minds of the massive Oracle sales and consulting worlds.
By Stephen Brobst and Joe Rarey
DATA WAREHOUSING IS A JOURNEY. The most successful data warehouse implementations deliver business value on an iterative and continuous basis. Each iteration builds upon its predecessor to increase the business value proposition for information delivery. In recent years, the evolution of data warehousing has reached a new pinnacle with the deployment of decision support capability throughout an organization and even beyond its conventional boundaries to partners and customers.
One of the most exciting pieces of software to come down the pike in many years is one picked up in a recent acquisition by Hyperion Solutions. It's one of the reasons I have to be fairly jazzed about the kinds of systems I'll be able to build in the coming months. Formerly called Razza, it's Master Data Manager.
If you had asked me a month ago what was the best way to make money in the Enterprise Computing business, I would have told you Master Data Management. I wouldn't have used that precise term, I would have probably said something like this:
One of the biggest problems for me in building systems with the tools I have is always the political problem of getting all the people talking the same language. A significant reason DW initiatives fail is that the metadata is all over the place and everybody spends too much time chasing the data down rather than analyzing it. All I need are my tools (speaking of Essbase outlines), and then I get functional people and technical people speaking the same language, because everybody can see how the numbers and entities roll up. The reason Informatica is making all kinds of money in this space is that they promise to solve this problem.
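The "everybody can see how it rolls up" point deserves a concrete picture. This toy sketch is not Essbase; it's just an illustration of what an outline buys you: one shared hierarchy in which every entity's rollup is explicit, so the functional and technical folks argue about the tree, not about mystery numbers. The account names and figures are invented.

```python
# A toy illustration (not Essbase itself) of a shared outline: the rollup
# of every entity is explicit in one hierarchy everybody can inspect.
# Member names and figures below are made up for the example.
outline = {
    "Profit": ["Revenue", "Expenses"],
    "Revenue": ["East Sales", "West Sales"],
    "Expenses": ["Payroll", "Rent"],
}
leaf_values = {"East Sales": 500, "West Sales": 300, "Payroll": -350, "Rent": -100}

def rollup(member):
    """Aggregate a member by summing its children, down to leaf inputs."""
    if member in leaf_values:
        return leaf_values[member]
    return sum(rollup(child) for child in outline[member])
```

When the drill-down disagrees with somebody's spreadsheet, the argument is settled by pointing at the outline, which is exactly the political win I'm describing.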
Well here's what IDC says.
Master data management is a challenging, long-standing problem. But recent attention to business performance management and compliance represent a new opportunity to deal with the issue in a way that can improve both information accuracy and organizational agility.
With Hyperion's MDM, I believe the problem has been solved. As soon as I get a copy I'll get deep into the details, but basically this is a collaborative tool that will allow enterprises to manage all of their dimensions, whether they change slowly or quickly, back through history.
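Mechanically, "managing dimensions back through history" usually comes down to effective-dated versions of each member: when a member moves in the hierarchy, you close the old record and open a new one, so any rollup can be reconstructed as of any date. This sketch is my own illustration of that pattern, not Hyperion MDM's internals, and the store/region data is invented.

```python
from datetime import date

# Hypothetical sketch of versioned dimension members: each reorganization
# closes the old record and opens a new one, so the hierarchy can be
# reconstructed "as of" any date in history. Example data is invented.
dimension_history = [
    # (member, parent, valid_from, valid_to)
    ("Store 12", "West Region",    date(2000, 1, 1), date(2004, 6, 30)),
    ("Store 12", "Pacific Region", date(2004, 7, 1), date.max),
]

def parent_as_of(member, as_of):
    """Return the parent a member rolled up to on a given date."""
    for m, parent, start, end in dimension_history:
        if m == member and start <= as_of <= end:
            return parent
    return None
```

Warehouse people will recognize this as the Type 2 slowly changing dimension idea; the point of an MDM server is to keep one authoritative copy of that history instead of letting every reporting tool invent its own.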
Imagine the worst spaghetti possible: a partial migration between ERP systems without the benefit of ETL. An MDM server would get everyone on the same page. How many times have I had people squawk about the complexity of PeopleSoft Trees and complain that their reporting systems use one drill-down, their internal reporting systems use another, and that the Business Objects Universe was painstakingly coded with yet another? And how many times have I had to be the one to reverse-engineer all that rot and put it into my systems? Too many to count.
I'm going to have a field day with this tool. Believe that.
I met Scott today and he raised the question of real-time analytics. Hmm.
I should take the opportunity to talk in general terms about what I know and think as regards this matter. The first is my theory of Industry Speed. I basically say that, depending upon which business you are in, what you need, and more importantly your imagination for analytics, is limited by what you do. This sounds like a common-sense observation, but it should be noted that this is the first idea that gets ignored in a sales situation.
Successful people like successful people, and nobody wants to be limited by technology. So it is likely that a technical person is going to want no limits. One sure-fire way to kick OLAP to the curb is to raise the question of real-time processing, or, as Scott put it, why can't we just get Essbase to query directly into JDE?
This is only a reasonable request in the context of demand for real-time analysis. As a nice-to-have, it's too expensive. But I want it. Why shouldn't I want it? Why should I stick a second tier of databases between myself and my operational real-time data? The answer is materiality.
How much is your management willing to pay to know now? And how long are they willing to wait for you to build this thing that will allow them to know now? How critical is it to the direction of your company that you provide this service? Like the old chestnut goes, sometimes working smarter to avoid working harder is just too much hard work. The last thing you want to do is spend money that doesn't augment human intelligence. You need to find a sweet spot.
Here's my ramp on Industry Speed for Analytic & BI Apps. The lower the tier, the more demanding they are.
1. Telemetry & Remote Sensing
2. Banking & Finance
3. Aerospace & Engineering
4. Restaurant & Hotel Mgmt
The cost of analytical applications is going down, but that is the technology factor. What's keeping the price high is the amount of specialization inherent in the way the tools are maturing. I find now that, in order to get the kind of job done that one person did in 1995, at least three people are doing it now. In any case, the technology is not the hardest nut to crack, nor is staffing. It is the ability to comprehend and capture the initiative from management about clear definitions of the business, the domain to be studied, and the speed and accuracy necessary for process improvement.