By Jay Morreale, Bank of America
As the amount of data gathered and produced within an organization grows daily, users are challenged to make decisions based on this information in ever-shorter time periods. While traditional Business Intelligence (BI) tools provide valuable access to this data and offer standardized reports that translate this information into key snapshots of the state of the business, they are typically narrow in their application. Business users who have questions not covered by one of these reports must either rely on IT experts to help customize another one, or cobble together their own report through spreadsheets, user-supported databases or other tools at their disposal.

Over the past 10 years, the growing demand to enable a larger swath of business users to accelerate their decision-making has made one thing abundantly clear: BI as we know it is not keeping up with rapidly evolving user demands for faster, more flexible and more user-driven decision-making tools. There is a widening gulf between what traditional BI tools deliver and the emerging market for true analysis capabilities.
To complicate matters, vendors across the spectrum, from BI to search to business process management, are using BI and analytics terms interchangeably to describe what are actually two distinct categories. To understand how they differ, and how analytics, an outgrowth of BI, has developed to address a more on-demand, decision-making enterprise, we need to look at the genesis of BI, the forebear of analytics technology.
The early years

During the '90s, business managers looking for data on key business metrics had to request reports from their IT staff, who pulled the data from a handful of transactional systems. Special-purpose operational data stores were created: an IT expert would figure out where the data resided, create a data store for analysis, and then put together a hand-coded report to deliver to the executive. Early reporting products, such as Crystal Reports, emerged to give our IT friends a way to write the underlying report and deliver it to the user.
But these reporting tools were still IT intensive and expensive. Depending on the complexity of the reports and the size of the data stores, the process times could be days if not weeks. And this all assumed that the data was in a clean format ready for reporting. If cycles were required to clean the data, time frames could be much longer. As a result, these reports were limited to high-level management. They, in turn, had to know exactly what they were looking for and think through their query in advance before asking IT to compile the data. If they got what they needed, it was worth the effort. But, more often than not, several iterations were required before the right information was extracted.
Despite the obstacles, early BI tools represented a major breakthrough in business. For the first time, risk teams at financial firms, for example, could pull together data from disparate parts of a large organization, including trading and lending businesses, and view that data together to get a better picture of exposure to interest rates.
The "middlemen" years

Hooked on the idea of accessing information from transactional systems, the BI space exploded. To serve the growing demand and cut back on the frustration users experienced asking questions and waiting for answers, organizations created dedicated "information workers" or "business analysts" who served as middlemen, attempting to bring IT closer to business requirements.
It was also during this time that the data warehouse emerged as a centralized, dedicated database for reporting, typically running on relational database technology. Though information retrieval was still slow, data warehouses offered more flexibility and could handle relatively large amounts of data. Efforts to improve performance produced a variety of cube technologies, which served as a middle ground between the data and the business user through pre-configured reports. Cubes significantly accelerated delivery times and provided drill-down capabilities, but if a business user's question varied from the cube's specifications, it was back to the drawing board for IT.
The automation of reports through BI tools gave companies a competitive advantage, and the BI vendors that rose to prominence during this period refined the processes that made BI a must-have for global enterprises.
However, as demands grew from a broader base of business users looking to tap BI, the heavy reliance on IT created a bottleneck. These users had a variety of questions that couldn't be answered by the standard report, and existing tools made no allowance for those who wanted to analyze external data in conjunction with their reports. The demand for information was increasing at a far faster rate than IT staff could keep up with or technology budgets could support.
Thanks to the now ubiquitous spreadsheet and end-user database technologies, power users began taking things into their own hands, extracting and decoupling corporate data and blending it with their own local data sources to produce views of information to meet their specific needs. While presenting obvious risks to the business, it signaled that organizations wanted more flexibility and more responsiveness.
BI gives way to analytics

Granting users the ability to explore and analyze data and make practical decisions based on what they find is a major departure from traditional BI, which served only to provide a quick snapshot or dashboard of key business metrics. The next generation of solutions, however, will need to do just this.
In the last 10 years, consumers have become much more demanding and sophisticated in their expectations of IT. The Internet has been a driving factor in these rising expectations. Great graphical tools for charting and analyzing stock price history can be found on dozens of Web sites, and users want to apply that same technology to their own business data. The endless days of waiting for reports to be delivered are long gone. Users want to bypass IT as well as the middlemen to get at the source. They want to be able to ask any type of question and, more specifically, explore the data. This has created a new generation of business users who expect on-demand access to and analysis of information so they can make decisions that will lead to more revenue, new markets and competitive advantage.
To answer this need, we've seen a proliferation of special-purpose, off-the-shelf BI applications in the last decade. But many of these solutions were still predicated on a traditional BI architecture that was too complex to meet user requirements and relied heavily on IT organizations to create vast data consolidators.
The bottom line is that BI had to evolve.
And it's happening. We see the industry moving from a reporting-centric delivery model to a more analysis-oriented approach.
Newer "analytic" BI not only gets at the data, but presents it in user experiences that allow for unanticipated user-driven questions and exploration. Beyond that, the new BI increasingly incorporates real-time information streams and powerful statistical modeling to reduce the volume of data so that results are more relevant to the question. Such a high degree of customization means that rapid analytic application development will displace custom software development and commercial applications.
So what do these new analytic solutions look like?
Requirements for a new age

Analytic software must deliver high-speed, visual and interactive analysis from a variety of data sources. It must be intuitive and it must be fast: it must work at the "speed of thought." It must give users the ability to visually interact with datasets. It must allow users to access internal organizational data from corporate data stores as easily as external data sources available via Internet connections or elsewhere. Most importantly, it must be able to answer "what-if" questions. And it must be able to do all of this without requiring iterations through a technology organization.
Analytic software also must be extremely adaptable and handle huge amounts of continuously changing data. This doesn't mean that we need to present our business users with more data. In fact, it's just the opposite. Analytics must statistically pre-process enormous amounts of information rapidly so that business professionals are working confidently within the most relevant subset of information that applies to their decision-making situation.
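As a rough illustration of this kind of statistical pre-processing, the sketch below reduces a large set of records to the statistically unusual subset a decision-maker actually needs to review. All names, figures and thresholds here are hypothetical; real analytic platforms use far more sophisticated models, but the principle of shrinking the data to the relevant slice is the same.

```python
import random
import statistics

def relevant_subset(records, key, z_threshold=2.0):
    """Keep only records whose `key` value lies more than
    `z_threshold` standard deviations from the mean, i.e. the
    unusual cases most likely to matter to a decision-maker."""
    values = [r[key] for r in records]
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [r for r in records if abs(r[key] - mean) / stdev > z_threshold]

# Synthetic example: 10,000 client exposures, mostly routine.
random.seed(42)
records = [{"client": i, "exposure": random.gauss(1_000_000, 150_000)}
           for i in range(10_000)]

outliers = relevant_subset(records, "exposure")
print(f"{len(outliers)} of {len(records)} records flagged for review")
```

The business user never wades through the full 10,000 rows; the pre-processing step surfaces only the few hundred exposures far enough from the norm to warrant attention.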
Between the spreadmarts and countless apps running through the organization, the last thing enterprises want is another specialty tool to manage. Analytic tools need to be scalable and manageable to satisfy a wide range of analysis applications, while being friendly to IT.
The end result will be an explosion of analytic applications in areas you could not imagine addressing with traditional BI. Some will be easily created by end users, others will be constructed by BI experts and distributed to thousands, and others will be built by IT using rapid application development technologies. In virtually every case these applications won't be known by BI terms like Dashboards and Reports-they'll be known by the problem they address in the language of business.
As we reach deeper into the organization, users will not say, "I need to run this report." They'll say, "I need to run my 'collateral analysis app'" or "My 'credit sensitivity app' indicated that we have a problem with that client." The big difference is that these "apps" will have been developed by the power users who understand the organizational functions they support. They will combine data from repositories such as a position risk database, a settlement system and a trade agreement database, and be able to show, for example, how exposure is affected as credit ratings change, displayed visually as a credit-rating slider is moved up and down.
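The logic behind such a "credit sensitivity app" can be sketched in a few lines. The positions, ratings and haircut figures below are purely hypothetical illustrations, not any firm's actual methodology: a worse credit rating discounts collateral more heavily, so net exposure rises as the rating slider moves down.

```python
# Hypothetical collateral haircut by credit rating: a worse rating
# means collateral is discounted more heavily, so exposure rises.
HAIRCUT = {"AAA": 0.02, "AA": 0.04, "A": 0.08, "BBB": 0.15, "BB": 0.30}

def net_exposure(positions, rating):
    """Gross exposure minus the collateral value remaining after the
    rating-dependent haircut, floored at zero for each position."""
    h = HAIRCUT[rating]
    return sum(max(p["gross"] - p["collateral"] * (1 - h), 0)
               for p in positions)

# Two illustrative positions with a single client.
positions = [
    {"gross": 5_000_000, "collateral": 4_000_000},
    {"gross": 2_000_000, "collateral": 2_500_000},
]

# Moving the "rating slider" from AAA down to BB:
for rating in ["AAA", "AA", "A", "BBB", "BB"]:
    print(rating, round(net_exposure(positions, rating)))
```

In a real analytic application this calculation would sit behind the visual slider, recomputing and redrawing exposure instantly as the user drags the rating up and down.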
In the end we will all move investments from traditional BI to analytics because better decision-making is a competitive advantage that businesses can't do without.
When we deliver good analytics to our organizations, we'll see the view of BI change. Instead of merely "acquiring a better understanding," it will be about "making better decisions" and this perspective shift will benefit all of us.
The technological needs, capabilities, and expectations of large organizations have shifted dramatically in the past 15 years. Large-scale database platforms like Teradata, Netezza, DB2, Oracle and others provide rapid access to massive datasets using parallel processing. OLAP and traditional BI technologies, such as Cognos, Hyperion, and Business Objects, have delivered users the ability to navigate easily to preconfigured reports. And now next-gen applications, for example, TIBCO's Spotfire, are delivering the ability to visually explore and analyze data in ways that predefined models would never allow. These will let users across the enterprise rapidly build and use applications that enable organizations to make decisions based on better information in less time.
Editor's Note: Bank of America actively uses some of the technologies mentioned in this article.
Jay Morreale is first vice president, enterprise credit and market risk technology, with Charlotte, N.C.-based Bank of America.