BI is Business Intelligence. At this particular moment in history, BI is seen as the more technical aspect (tools & techniques) in Management Computing.
So let's start at the beginning.
Back in the mid 80s, before the advent of the Pentium class computer and the acceptance of pre-emptive multitasking operating systems in the business world, a cat named William Bowen (no relation) published a cover story in Fortune called 'The Puny Payoff from Office Computers'. He started off:
HAVE THE MILLIONS of computers purchased by U.S. businesses brought any overall improvement in productivity? Surprisingly, the best information available says no. This collides with many people's beliefs regarding computers. Without computers, present-day credit card operations, check processing, and airline reservation systems would be unthinkable. But the figures indicate that, on a national scale, business's investment of hundreds of billions of dollars in computers and computer-aided communications has failed to bring about a discernible improvement in productivity. "The puzzling thing," observes economist Martin Neil Baily, a senior fellow at the Brookings Institution think tank in Washington, D.C., "is that the computer revolution has not yet paid off in productivity growth as did the earlier generations of innovation."
So far productivity has grown more slowly in the computer age than it did before computers came into wide use.
The accepted wisdom at the time was that the typewriter business would remain profitable, that business managers did not want to type and would not learn to use a mouse, and that any computer with more power than a pocket calculator could only be useful to engineers and scientists.
This contradicted the expectations of people like me, who were invested in the ideas of 'informating' the workplace. These ideas originated largely in the theories expounded in The Rise of Managerial Computing, by John Rockart. Along with the other cats at CISR and the theories of John Seely Brown, there was a technical vanguard of folks who believed passionately that computing could bring efficiency to business decision making. Technically, a distinction was made between Strong AI and Weak AI. Weak AI was also known as the 'Augmentation School', as proposed by Doug Engelbart.
In my own career, fortunately cutting my teeth at Xerox, I was in direct contact with the tools and technologies that were enabling these theories. I played enough with tools like LOOPS and XDE (and even the HUMBLE Expert Shell) to recognize the great potential. I actually did some Viewpoint programming in Mesa back in those days.
As we at the Xerox Systems Group looked to make the 'office of the future', we experimented with any number of support strategies and tactics. The one that managed to stick was the idea of the 'IC', the Information Center. But one thing was immediately clear: if the IC was going to function properly, we had to address Bowen's complaint. What emerged from the work done in the mid-eighties was an interdisciplinary approach dedicated to improving business via the judicious use of computing technology. Principles I developed with these things in mind are still valuable to this day.
It bears repeating that in those days, most computing was done on mainframes via subservient terminals. Most corporations understood this paradigm, but they did not understand the value of distributed peer networks. Napster is not the first group of upstarts to encounter resistance. At XSG, our calling card was decentralized, client-server, object-oriented, networked computing. Each of those qualifiers was fought tooth and nail, and our noses were rubbed in every failure. But we were still all about End-User Computing. How could we make it win? By breaking it down to cases, application by application. I won my first crystal trophy building a system adhering to my now golden Pledge.
We were determined to make productivity gains in the 'front office', among analysts, managers and those we called 'knowledge workers'. This followed Engelbart's ideas about augmenting human intelligence with tools that aid thinking. Back office computing was dreary and boring to us - cutting payroll checks, keeping track of telephone bills - these were things that could be done on stupid mainframes with primitive languages like COBOL and elementary data structures like VSAM. We were trying to get people to work smarter through advanced user interfaces. We had to think of data in more ways than just fields & sorts. We were engaging people at their desks in order to make people more productive.
The Cubegeek in 1988 at Xerox Systems Group in El Segundo. Note the 6085 workstation with the 19 inch monitor on the right.
A lot of time and effort was spent understanding how people approached their work, how they organized their physical work, how they would interact with the computer. We were engineering things in real-time. Over time all of these disciplines came to be known as Executive Information Systems.
At Xerox, and certainly at most other corporations, the new front office productivity efforts centered around Finance organizations. Numbers were something we could do. Memos were something we could do. Reports were something we could do, and so we expected our first payoff would be in Finance and Accounting. In various ways, these corporate staffs already knew that computing could save them time and money. Back office computing was already calculating payroll, etc. Plus, these guys held the purse strings. They could buy systems when they felt like it. Sure, there were many advances in CAD/CAM during this same period. But at Xerox in particular, we were undergoing a complete transformation of the management culture from top to bottom. Xerox's CEO David T. Kearns had hired McKinsey & Company to instigate a program called Leadership Through Quality. The most important thing about the Xerox reorg with regard to BI is that it implemented a practice of cascaded management objectives. The point was that employees throughout the organization should understand at every moment how their work contributed to the priorities of the company.
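The cascade is easy to picture as a simple parent-linked hierarchy, where any objective can be traced upward to a top company priority. A minimal sketch of that idea - the class and the example objectives are my own illustration, not anything from the actual Xerox program:

```python
class Objective:
    """One node in a cascade of management objectives."""

    def __init__(self, description, parent=None):
        self.description = description
        self.parent = parent  # the higher-level objective this one supports

    def trace_to_top(self):
        """Walk the parent links from this objective up to the company priority."""
        chain = [self.description]
        node = self.parent
        while node is not None:
            chain.append(node.description)
            node = node.parent
        return chain

# A hypothetical three-level cascade: company -> department -> analyst
company = Objective("Improve customer retention")
dept = Objective("Cut service response time", parent=company)
analyst = Objective("Automate the weekly backlog report", parent=dept)

print(analyst.trace_to_top())
# ['Automate the weekly backlog report', 'Cut service response time',
#  'Improve customer retention']
```

The point of the structure is exactly what the Xerox practice was after: any employee-level objective, asked "why?", answers with the chain of priorities above it.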
Without getting too deeply into it, Quality Improvement struggled through the 90s against something called Re-engineering. Among the management consultants there were two schools of thought: the Deming School and the Hammer School. Coming from Xerox, I was part of the Deming School. With regard to systems implementation philosophy, a quality improvement perspective means learning from what you know and getting better. It's about evolution rather than revolution.
So by 1990 I had a very clear picture of what I was doing in Managerial Computing. I was building EIS, and EIS was about putting LAN-enabled computers in front of managers, analysts, and other knowledge workers. It meant having a strong set of productivity benchmarks to ensure that the investment would be worth it. It meant taking what people knew already and improving upon that by augmenting their intelligence. Not making the computer think for them. Not putting simple rows and columns of numbers on paper (hell, you could do that with greenbar). Not saying everything you know is wrong, here's how we're going to re-engineer your function. EIS was part of the inevitable evolution of computing: from all the knowledge being centralized in the mainframe to taking the knowledge from the people of the organization and leveraging their ability to do the company's business.
So for me the theory and tasks were clear. Now it was all about communicating that, and waiting for the technology to grow up.