Monday, September 21, 2009

Want to make BI pervasive? It's the culture, stupid

Original: http://www.networksasia.net/content/want-make-bi-pervasive-its-culture-stupid?page=0%2C1

Business intelligence software may have been around for several decades, but it remains an esoteric niche in most companies, according to an analyst.
Unfriendly corporate cultures, not the BI tools or apps themselves, are preventing BI from becoming pervasive.
"The technology has been around for a long time. It's the people that often get in the way," said Dan Vesset, an analyst with IDC.
IDC recently conducted a study of 1,100 organizations in 11 countries measuring how pervasive BI is in companies, what factors helped make it more pervasive, and what "triggers" data warehousing architects and IT managers can use to further the spread of BI in their companies.
In a speech at a conference in Chicago, Vesset said IDC measured BI's pervasiveness via six factors:
  • Degree of internal use. According to IDC, that was between 48% and 50%.
  • Degree of external use, or how much a company shares BI data with vendors or customers. Sharing BI data keeps customers loyal, Vesset said. And canny BI users in industries such as retail can sell that data to generate non-trivial revenue, he said.
  • Percentage of power users in a company. The mean was 20% in surveyed companies.
  • Number of domains, or subject areas, inside the data warehouse. Over five years, the average at surveyed companies grew to 28 from 11.
  • Data update frequency. While real-time updates can indicate heavy dependence on BI, "right-time" data updates matter more. "Daily, weekly or monthly could be sufficient," he said.
  • Analytical orientation, or how much the BI crunching helped large groups or the entire organization make decisions, rather than isolated individuals. "The fact is that most individuals and companies are not data driven. They still rely more on experience rather than analytics," Vesset said.
According to Vesset, the following factors, in descending order of impact, do the most to drive BI pervasiveness:
  • Degree of training, not in the BI tools -- "the vendors do a pretty good job" -- but in the meaning of the data, what the key performance indicators (KPIs) mean, etc.
  • Design quality, or the extent to which IT-deployed performance dashboards satisfy user needs. Satisfied users will talk up the BI software, creating "BI envy" in other employees and helping spread the software's use. Unsatisfied users will go around IT and use Excel or SaaS applications.
  • Prominence of the data governance group.
  • Involvement of non-executive employees.
  • Prominence of a performance management methodology.

Vesset also listed a number of potential "triggers" for BI projects that IT should take advantage of:
  • Arrival of new executives, who, if unsatisfied with the type of reports or analyses delivered, may help sponsor a new project.
  • Need to comply with new legislation.
  • Introduction of performance management methodology.
  • Corporate reorganizations, including mergers and acquisitions.
  • Changes in the organization's growth, such as when a fast-growing company slows down and then begins focusing on improving its profit margins.

Saturday, September 5, 2009

Smarter Buying: Business Intelligence and Performance Management Software

Original: http://www.smartertechnology.com/c/a/Technology-For-Change/Smarter-Buying-Business-Intelligence-and-Performance-Management-Software/?kc=STNL07212009STR3

Here's what matters when selecting a product and supplier. Hint: Price isn't everything.

Boil down the scores of industry polls of CXOs over the last few years, and you get a remarkably consistent message: We need to become a smarter, faster, more efficient (and cheaper) organization. No surprise this now-familiar mandate has been adopted by numerous industry vendors.

Nowhere is this truer than the hot markets for business intelligence and performance management software and services. IDC forecasts global sales of BI software tools will grow from $7.5 billion in 2008 to $10.2 billion by 2013, or about 6.3 percent annually.

It makes sense: Who doesn’t want to make better organizational decisions? Run a tighter, more effective outfit? (Especially, as vendors are eager to point out, during an economic downturn.)

Finding the most appropriate decision-making software will take some thoughtful decision-making of your own. As with much in IT (and life), the answer to “what’s best?” depends heavily on your particular situation.

  • What strategic vendors have you committed to?
  • What’s your budget?
  • Staff expertise and availability?
  • Are you looking for an easy-to-use solution for a group of nontechnical business users? A master analytic engine for the entire enterprise?
  • Are you willing to try a lesser-known but possibly more affordable smaller supplier?

Very basic stuff, but here as elsewhere, these are make-or-break considerations.

That said, the list below shows what your IT peers look for when buying BI/PM software and services. It’s from an e-mail and online survey of 1,380 qualified buyers of products in the space conducted for Ziff Davis Enterprise by Preference Research in January.

Respondents were asked whom they now buy from, and whom they’d consider and recommend. Predictably, the list of BI suppliers that respondents would strongly/very strongly consider for their next purchase was dominated by the Big Five in the space: Microsoft, Oracle, Business Objects, Cognos (IBM) and SAP. Predictably too, each vendor had particular strengths: IBM offered the best combination of features and reputation, Microsoft the best user familiarity, and Oracle and SAP the best scalability.

On price, respondents all wanted cheaper products (stop the presses!). Fortunately, they also said price mattered less (ranked seventh overall) than good ROI (third).

Here’s what buyers told us matters in choosing BI/PM software and services. How does that square with your checklist?

Attributes "Very Important" to Selecting Vendor for Short List:

  • Usability: e.g., intuitive interface: 71.8%
  • Performance: e.g., speed, stability, low error rates: 63.1%
  • Value for the dollar: i.e., good return on investment: 50.6%
  • Features: e.g., uniqueness, depth, superiority: 49.6%
  • Technical support/service: e.g., telephone, Web, on-site assistance: 49.3%
  • Scalability: e.g., ability to handle high-volume usage: 44.7%
  • Price: i.e., relatively inexpensive: 41.8%
  • Low cost of operation: e.g., ease of deployment/maintenance: 40.0%
  • Familiarity: e.g., convenient because you or others already use it: 18.9%
  • Reputation: i.e., positive word-of-mouth, news or reviews: 18.7%

Operationalizing Business Intelligence

Original: http://www.smartertechnology.com/c/a/Smarter-Strategies/Operationalizing-Business-Intelligence/?kc=EWWHNEMNL07092009STR7

Linking business intelligence with business results gives workers out in the field better tools to drive day-to-day operations and customers better ways to make informed purchases.


At the advent of business intelligence, the idea was to put the right data and analytics in the hands of people who could make actionable changes that improve the way business is done. Somewhere along the line, that simple idea grew muddled.

BI systems grew up scattered haphazardly across enterprises, complicated and difficult to use even for business analysts. As enterprises assess how to move forward with their BI efforts, one of the driving forces of these initiatives is to make BI simpler and easier for a wide range of workers to access. In short, organizations want to bring BI back to its philosophical roots.

“One of the promises of BI when I started was empowering decision-makers and knowledge workers. It was to create pervasive BI and leverage BI for everyone,” says Dyke Hensen, chief marketing officer for PivotLink, who calls himself an old BI "oak tree" after 20 years in the space. “The problem is that over the years a lot of these offerings became very complex, very bloated and expensive.”

Hensen cites figures from The Data Warehousing Institute’s annual survey showing that the median cost just to maintain BI applications clocks in at around $235,000 per year. In his company’s case, Hensen says the goal is to reduce the cost of maintenance by offering BI capabilities via a SaaS (software-as-a-service) model, cutting not just hardware and software costs but also the number of employees needed to maintain systems.

However, according to Nimitt Desai, business intelligence and data warehousing lead for Deloitte Consulting, many organizations can’t feasibly begin to leverage SaaS until they consolidate their BI efforts. One major problem enterprises face today is the sheer number of BI applications spread across an organization. Desai says it is common to see enterprises running well over 100 analytic environments that they must report against.

“When you have hundreds of systems, then SaaS is a myth,” Desai says. “But if you have a smaller number of sources, I feel there is a big push in that direction.”

This drive to consolidate sources is much more of a possibility today than even two years ago with the push by major ERP vendors to help bring BI out of the cold and under a larger operational umbrella. Acquisitions such as the SAP pickup of Business Objects last year are a sign of where the BI space is headed.

According to Wayne Eckerson, director of research and services for The Data Warehousing Institute, this shift to bring together operational systems and BI just makes sense.

“It is kind of odd that you have to switch contexts, if you're an operational worker, from an operational app in order to open a dashboard or a report to understand what the impact was of what you just did or see the context of an action,” Eckerson says. “There is definitely an opportunity for vendors to take that gap out of the BI office and embed BI right into operational applications.”

Brian Kilcourse, managing partner for RSR Research, agrees that this "operationalization" of BI is one of the most impactful intelligence trends sweeping through enterprises at the moment. He says he’s seen lots of anecdotal evidence of how a shift to better embed BI within operations gives workers out in the field better tools to drive day-to-day operations or customers better ways to make informed purchases.

“We’re seeing a lot of companies injecting actionable information into operational processes in just-in-time fashion,” Kilcourse says.

For example, in one case study Kilcourse analyzed, Virgin Megastores gave its store managers a powerful way to improve sales. BI systems there were integrated with up-to-the-minute in-store sales data, so managers could see how hit titles were selling compared with other hits that had similar sales. The system matched the first few days of a title's release against other releases with similar sales starts and let managers project outward from those analogues. It also offered actionable analysis that let workers pair overstocked albums with hot sellers in endcaps, moving otherwise stationary products.
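The trajectory matching Kilcourse describes can be sketched roughly as follows. This is an illustrative reconstruction, not the actual Virgin system: the album names, sales figures, matching window and tolerance are all hypothetical.

```python
# Hypothetical sketch of operational BI for a retail sales floor:
# match a new title's first days of sales against past releases with
# similar starts, then project forward from those analogues.

def similar_starts(new_title, history, window=3, tolerance=0.15):
    """Return past titles whose first `window` days of sales were within
    `tolerance` (relative) of the new title's start."""
    start = sum(new_title[:window])
    matches = []
    for name, sales in history.items():
        past_start = sum(sales[:window])
        if past_start and abs(past_start - start) / past_start <= tolerance:
            matches.append(name)
    return matches

def project_sales(new_title, history, horizon=14, window=3):
    """Project total sales over `horizon` days by averaging the full
    trajectories of past titles with similar starts."""
    matches = similar_starts(new_title, history, window)
    if not matches:
        return None
    totals = [sum(history[m][:horizon]) for m in matches]
    return sum(totals) / len(totals)

# Daily unit sales per title (invented data).
history = {
    "Album A": [120, 110, 100, 90, 85, 80, 75, 70, 65, 60, 55, 50, 45, 40],
    "Album B": [300, 280, 250, 200, 180, 150, 140, 130, 120, 110, 100, 90, 80, 70],
    "Album C": [115, 108, 102, 95, 90, 88, 80, 76, 70, 66, 60, 55, 50, 45],
}
new_release = [118, 112, 99]  # first three days in-store

print(similar_starts(new_release, history))  # titles with comparable starts
print(project_sales(new_release, history))   # projected 14-day units
```

A manager looking at this output could, as in Kilcourse's example, decide which overstocked title to pair with the new hit in an endcap.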

Even though Virgin closed its retail stores for other reasons entirely, Kilcourse says this application of operational BI is too good to ignore.

“They were basically doing a kind of a product mashup on the sales floor in more or less real time based on the signals they're getting from sales as they're occurring,” he says. “So they're basically doing shelf resets based on the fact that one title is flying off the shelves and they want the other one to fly with it.”

Kilcourse says that these sorts of initiatives help organizations better adopt a sense-and-respond type of mentality. He also believes that better embedding BI into operations provides very good back-end benefits.

“One of the big values of it is that the operational systems or the processes can then deliver back some information to the business intelligence system that says, ‘This is what happened after you responded.’”

Wednesday, August 26, 2009

Six Steps to Agile BI

Original: http://www.smartertechnology.com/c/a/Technology-For-Change/Six-Steps-to-Agile-BI/

As organizations try to find better ways to meet the business needs of users who seek more actionable and relevant information to help them do their jobs, the adoption of agile development is popping up on more BI dev teams’ radars.

According to Ken Collier, senior consultant in business intelligence and agile product and project management for Cutter Consortium, an IT advisory firm comprised of numerous independent consultants across North America, the main principle behind agile development is an iterative, evolutionary drive toward progress. As an expert in agile BI, Collier encourages many of his consulting clients to progress in baby-steps, producing bite-sized deliverables in rapid-fire, two-week cycles. The idea is to be constantly rolling out new features on the fly, offering a more nimble response to BI users’ needs and wasting less time on features that will never be used.

“It’s not a sequence of things that you must do in order to call yourself agile,” Collier says. “It’s really much more of a set of values, principles, behaviors and attitudes supported by some very well understood practices like test automation that make it possible.”

While the agile philosophy has taken much of the development world by storm, in-house BI dev teams are still taking their time adopting the mantle of agile for producing intelligence applications. Collier believes agile BI is roughly where the rest of the software community was four to five years ago with respect to agile adoption.

“My sense is that database folks, and especially the data warehousing folks, don’t believe that the agile methods fit very well for them,” Collier says, “like there is something special about what we do in business intelligence that’s unique and different and harder.”

He disagrees. He believes that BI can benefit greatly from agile adoption and he’s here to explain to Smarter Technologies six essential tips for making it happen.

Involve Users Early and Often

Customer collaboration is one of the most critical elements to making agile a successful strategy.

“If I had to only choose one thing that I would really knuckle down on, it would be getting involvement early from the user community and having them be engaged and involved every week throughout the process,” Collier explains.

As he puts it, under the traditional model of BI system development, the dev team solicits requirements up front from the user community, developers put their heads down for many months of work and then they go back to the user community with features only to hear, "Well, that’s not really what I meant when I said I needed this."

By maintaining a constant stream of communication with the user base, developers can ensure that users get the features they need, the way they need them. Collier realizes that it may be a challenge keeping up this level of collaboration.

“We see that a tension—not an unhealthy one, at that—can develop between asking for involvement and engagement from users while also respecting that they’ve got their own work to do and they’re not full time on this project,” Collier says. “That’s why I say it’s one of the single hardest things to get right.”

In spite of this tension, Collier says that BI teams should not simply depend upon business analysts to act as advising surrogates for the customer team. While analyst input is critical, if you want the entire user community to use the BI systems then you’d better get input from representatives throughout that community.

“If you’re getting weekly acceptance of new features from your business analyst group and then you roll out a system live only to find out that your BA’s aren’t really the accurate voice of the users, that can be a problem,” he says.

Transitioning to agile development will require a definite shift in mentality for those team members who have been working in BI for a long time.

“I often go into companies where the BI team is saying, ‘Well, job No. 1 is to collect all of this data from these different source systems, merge it together, cleanse it and get it all prepared. Then we can start talking about what features we want,’” Collier says. “In my opinion, that is totally backwards.”

Agile development is feature-driven development, he explains. That’s what makes it so effective at meeting users’ needs.

While the quality of data and data models is important, he believes that teams need to work on those after the feature sets have been hammered out.

Prioritize Based on Value

Not only is agile development a product of feature-focused work, it is also about prioritizing the list of deliverables based on the value of those features. Collier explains that part of the communication with users is to get them to not only list their requirements, but to also prioritize them.

Then the team can start with the most important features and work their way down the list. Doing so ensures the system is delivered quickly with all of the features users really need. This is important considering that industry studies have shown that users typically use only about 60 percent of any given software feature set.

“By doing that and by delivering features early, we can converge more quickly on a system that’s ready to go into production,” Collier says. “And the user might be able to say, ‘This is good enough, I don’t need that other 40 percent in there.’”
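The value-ordered delivery Collier advocates can be sketched in a few lines. The backlog items and their value scores below are invented for illustration; the point is simply that working the list in descending value order lets a team stop at "good enough."

```python
# Illustrative sketch (not from the article): sort a BI backlog by
# business value and find the smallest set of features covering a
# target share of total value.

backlog = [  # (feature, relative business value) -- hypothetical numbers
    ("sales dashboard", 40),
    ("daily KPI email", 25),
    ("ad-hoc query UI", 15),
    ("export to PDF", 10),
    ("custom themes", 6),
    ("saved-view sharing", 4),
]

def features_for_value(backlog, target=0.8):
    """Return the highest-value features that together reach `target`
    (a fraction) of the backlog's total value."""
    total = sum(value for _, value in backlog)
    chosen, accumulated = [], 0
    for feature, value in sorted(backlog, key=lambda fv: fv[1], reverse=True):
        chosen.append(feature)
        accumulated += value
        if accumulated / total >= target:
            break
    return chosen

print(features_for_value(backlog))  # features covering 80% of the value
```

Here three of six features already cover 80 percent of the total value, which is the "good enough to ship" conversation Collier describes.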

Test Automation

With such short production times between conception of new features and delivery of these functions, testing can become a sticking point if the development team doesn’t do it right.

Unfortunately, many database and BI developers have been manually testing their code for years, Collier says. That simply won’t fly under the agile model.

“If you follow my guideline to work in two-week iterations and you’re trying to manually test these new features, you get pretty quickly buried under the weight of your own testing processes,” he says.

This is why test automation is absolutely critical. Collier suggests teams look into the new open-source testing tools for Oracle and IBM databases that have appeared of late, and, if they’re on that platform, check out the latest test automation tools in SQL Server 2008.
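What an automated test of warehouse code looks like in practice can be sketched as below. This is a generic example, not one of the tools Collier mentions: it uses Python's built-in sqlite3 as a stand-in for a production database, and the table names and revenue rule are invented.

```python
# Minimal sketch of test automation for a warehouse transformation,
# using an in-memory SQLite database in place of a production system.
import sqlite3

def load_fact_sales(conn):
    """Toy transformation under test: aggregate raw orders into a fact table."""
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS fact_sales (
            customer_id INTEGER PRIMARY KEY,
            total_revenue REAL NOT NULL
        );
        DELETE FROM fact_sales;
        INSERT INTO fact_sales (customer_id, total_revenue)
        SELECT customer_id, SUM(quantity * unit_price)
        FROM raw_orders
        GROUP BY customer_id;
    """)

def test_load_fact_sales():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (customer_id INT, quantity INT, unit_price REAL)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                     [(1, 2, 10.0), (1, 1, 5.0), (2, 3, 4.0)])
    load_fact_sales(conn)
    rows = dict(conn.execute("SELECT customer_id, total_revenue FROM fact_sales"))
    assert rows == {1: 25.0, 2: 12.0}, rows
    # Re-running the load must not duplicate rows (idempotence check).
    load_fact_sales(conn)
    assert conn.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0] == 2

test_load_fact_sales()
print("fact_sales transformation tests passed")
```

Because tests like this run in seconds against a throwaway database, they can be executed on every check-in, which is what makes the two-week iteration Collier describes sustainable.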

Encourage Culture of Collaboration Among Developers

In order to meet the fast and furious pace of agile development, Collier believes that teamwork among the developers is essential.

“So, for example, if it’s a data warehouse you’re working on, having your data modelers, your BI developers and your DBAs working face-to-face together in the same room is a very important element,” Collier says.

Getting the team to work well together as a single unit is critical in order to rally around the demands of delivering very discrete and focused features so quickly.

Start Quickly Out of the Gate

Unlike typical BI projects, which can drag on for months or even years before usable functions are placed at users’ fingertips, agile projects require very fast delivery. That means the pressure is steady throughout the life of the project, even at the beginning.

“One of the great things about working in an iterative, incremental and evolutionary fashion is that you’re busy right from the start,” Collier explains. “So a development team has to get started right now because in two weeks they’re expected to deliver some new feature.”

While this can be intense, it can actually result in less stress in the long run. After all, agile projects are much less likely to face the dreaded 11th-hour death march just before a system goes live.

Monday, August 24, 2009

Five Trends Changing the Face of BI

Original: http://www.smartertechnology.com/c/a/Technology-For-Change/Five-Trends-Changing-the-Face-of-BI/

Predictive analytics, agile development, user-centric business intelligence and improvements in visualization are giving new life to this mature technology.

How does your organization extract true value from its business information? Answering this question has been a persistent challenge facing technology and line-of-business executives for decades.

While business intelligence (BI) has evolved since the days of “green bar” reports, the industry still has a long way to go to offer companies business information that can be translated into actionable steps that drive business results, says Joe Bugajski, senior analyst in business intelligence for the Midvale, Utah-based Burton Group.

“There is a sea change coming in business intelligence,” he says. “The existing tool sets have been out there since the early ’90s—some of them before that. And we still have tools that are too complicated for most folks. We’re pushing too much of the technical mumbo jumbo behind BI into the faces of users, and we’re still not giving access to valuable information in a simple fashion to the majority of the business population.”

As a result, users are increasingly asking their BI units, “What have you done for me lately?”

The best BI teams answer that question with a bevy of new capabilities based on five trends that experts say are changing the face of business intelligence.

Trend 1: Predictive Analytics

If Ram Nagappan had to name one critical area where he thinks business intelligence has the most potential to completely transform his enterprise, predictive analytics would get the nod.

“If you look at it, everyone supplies records, everyone has dashboards—or they're planning on doing it,” says Nagappan, managing director for Pershing LLC, a Jersey City, N.J.-based financial services affiliate of The Bank of New York Mellon. “But in these economic times, the information that I know beforehand is what will help me save money and steer the ship in the right direction.”

As he puts it, the BI industry is just “scratching the surface” of predictive analytics. This is partially because analytics in general has lagged so far behind the rest of the more traditional reporting functions of BI.

“Analytics has been the last to the party in the BI space,” Burton Group’s Bugajski says. “All the easy stuff has been done. We can make very pretty charts and graphs, but it's not the [same as an] interaction with the core information of the business so I can understand what’s going on. That’s still missing.”

Right now, most organizations are pushing the boundaries of current tool set capabilities. “What we’re seeing are mostly in the research and university areas,” Pershing’s Nagappan says of current predictive analytic tool development. “I know that people can take their current analytical data models and other things that they’ve created and do a prediction on them, but the tools are not quite there yet.”

Clearly, Nagappan’s expectation for better tools tracks well with recent rumblings in the business intelligence marketplace. According to IDC, the analytics market is expected to grow about 4 percent this year.

In late July, IBM banked more than $1.2 billion on a bet that predictive analytics is the key to BI’s future. The investment was made in the acquisition of SPSS, an analytics firm well-known for its predictive analytics technology.

“With this acquisition, we are extending our capabilities around a new level of analytics that provides clients not only with greater insight, but also with true foresight,” Ambuj Goyal, general manager of information management for IBM, said in a statement about the acquisition. “Predictive analytics can help clients move beyond the ‘sense and respond’ mode—which can leave blind spots for strategic information in today's fast-paced environment—to ‘predict and act’ for improved business outcomes.”

While organizations wait for the market to shake out, Pershing’s Nagappan believes that those who prepare their subject-matter experts will be best prepared to take advantage of new technology innovations. “I think the challenge in predictive analytics is going to be building the subject-matter expertise within the analysts,” he says. “In order to predict systematically an analyst needs to know the subject matter well.”

Trend 2: Agile Development

The economic downturn is forcing BI departments to rethink the way they develop their solution sets, according to Wayne Eckerson, director of research and services for The Data Warehousing Institute, a Renton, Wash.-based analyst firm. With users crying for more capabilities and management demanding faster deployments, he believes more enterprises will start to port the agile development philosophies that have permeated the developer community to the more traditional BI development teams.

“With the down economy, there is a lot of movement to come up with lower-cost models and faster deployment to keep up with the business,” says Eckerson. “Organizations are exploring agile BI because the business doesn't want to wait around for even three months.”

That golden three-month period used to be the perfect milestone that BI teams would shoot for to satisfy the business with new innovations. “Now it’s more like a couple of weeks to a month,” Eckerson says.

The key to the agile approach to BI is that it “rolls out business intelligence in an incremental evolutionary way with a lot of involvement and participation from end users or customers,” says Ken Collier, senior consultant in business intelligence and agile product and project management for Cutter Consortium, an Arlington, Mass.-based IT advisory firm.

In the enterprise projects he leads, Collier targets a two-week iteration of new functionality releases. He says the factors most critical to meeting this demanding schedule are to keep milestones small and targeted; to foster a highly collaborative environment between analysts, developers and users; and to implement test automation for databases.

“That’s an entirely new concept for database folks who have been manually testing for years,” he says. “The problem is that if you work in two-week iterations and you’re trying to manually test these new features, you quickly get buried under the weight of your testing processes.” If you do it right, Collier adds, agile BI can deliver value in a number of ways. The most important is responsiveness to user needs.

“In a relatively medium-size data warehouse or BI system, it typically could take eight or 12 months of requirements analysis and development and testing before users get to see working [betas] on their desktops,” Collier says. “What is really key is being able to show users features within the first few weeks of a project when new data is trickling in every night—even if you don’t roll these things live into production—and being able to show users working features and get feedback so you can quickly adapt.”

Agile development can also cut down on function overkill. Collier cites industry statistics that show the typical user of any given system uses only about 60 percent of the features included in the application.

“Simply by virtue of the fact that we focus on the highest value things first, we can complete projects faster and at less expense,” Collier says. “We can converge more quickly on a system that’s ready to go into production, and the user can say, ‘This is good enough; I don't need that other 40 percent in there.’”

Even if an organization isn’t gung-ho about developing on a two-week schedule, the lesson to take away from the agile movement is its bite-size mentality of incrementalism. Pershing’s Nagappan says this incremental approach is essential for organizations seeking to ramp up their intelligence maturity.

“Many organizations try to do an enterprisewide solution on Day 1, and that is a huge elephant to move,” he says. “That is not going to be a success. Any time you wait a year for a product to show up, it’s not going to be easy.”

Trend 3: User-Centric BI

Since the last time the economy took a nosedive in 2001, Nagappan has shifted his department’s focus to better customize the information he delivers to different user segments based on their roles within the organization.

“We have noticed that our customer segments—we call them personas—that use our platform are all different,” Nagappan says, explaining that personas can range from financial reps to marketing and sales folks to business advisers. “The key thing we recognized was that one size was not going to fit them all.”

As he helps the businesses come out the other end of the recession on a strong note, Nagappan’s top priorities include a shift to offering user-centric analytics based on role.

“We need to take the same data and create analytical models that satisfy the various personas that are going to look at the information so they have better decision-making ability from it,” he says. “We take the same transactional data and create various functional areas so these different consumers can come and take what they need.”

According to Burton Group’s Bugajski, it is this kind of user-centric focus that more BI departments must develop to enable the business to drive true value from the information it is analyzing. He says the typical organization too often faces the prospect of gathering BI information from what he calls the “human GUIs” of the enterprise: users who know how to extract data and end up interfacing with BI systems for colleagues who either don’t know how or don’t want to learn.

“There’s value there,” Bugajski says. “There are reasonable and responsible behaviors there. But that’s not the original vision for BI.

“Where is the tooling that my CEO could use? Where is the tooling that my business analysts who are not experts in data, but who are experts in marketing, could use? Where is that stuff? They want something that is as simple to use as a Google search. And that’s fair to ask.”

Trend 4: Visualization Improvements

Bugajski believes the only way organizations are going to extract the full value from their BI endeavors is if they redesign their visualization philosophies and designs. “Business intelligence as we know it is just about dead,” he says. “We need a new paradigm, and I think visualization is the key.”

The visual ways in which users collaborate and analyze information through Web 2.0 tools are setting the bar high for BI deliverables, which Bugajski says are sometimes stuck in form factors from the 1990s and even the 1980s.

Nagappan couldn’t agree more. He says Pershing is focusing on Web 2.0 technology in order to innovate better ways to pump up visualization and improve the way users interact with data.

“If you look at the traditional ways, you just put out a spreadsheet of information, and people might pivot that spreadsheet, but that's about as far as they used to go,” he says. “Now, with Web 2.0, we can create geographic mapping using Google; we can do Flash-based animation. We have AJAX technology and sharing.”

Nagappan says the cross-section of Web 2.0 and BI enables organizations such as Pershing to take the same information and make it consumable in a number of ways. This dovetails nicely into the user-centric model that his organization is striving for, he says.

In fact, Nagappan and his organization are so passionate about how Web 2.0 can change the face of BI visualization that they recently applied these technology philosophies in creating a new flagship software platform for the company’s financial planning customers. Launched in July, Pershing NetX360 puts the power of customizable data dashboards in the hands of users, letting them see real-time financial numbers and data crunched on demand from a smorgasbord of data sources.

Trend 5: Operationalization of BI

At the advent of BI, the idea was to put the right data and analytics in the hands of people who could make actionable changes that would improve the way business is done. Somewhere along the line, that simple idea grew muddled.

BI systems grew up to be scattered across enterprises, complicated, and difficult to use, even for business analysts.

As enterprises assess how to move forward with their BI efforts, one of the driving forces of these initiatives should be to make BI simpler and easier to access by a wide range of workers. In short, organizations want to bring BI back to its philosophical roots.

“When I started, one of the promises of BI was empowering decision-makers and knowledge workers,” says PivotLink’s chief marketing officer, Dyke Hensen, who calls himself an old BI “oak tree” after 20 years in the space. “It was to create pervasive BI and leverage BI for everyone. The problem is that over the years, a lot of these offerings became very complex, very bloated and expensive.”

Eckerson of TDWI agrees, saying that it is odd that so many enterprises’ operational workers have to switch gears between a business intelligence application and an operational application in order to open a dashboard or a report to see the impact of an action taken based on business intelligence.

Over the last two years, the market has seen a drive to consolidate these tasks with a push by major ERP vendors to help bring BI under a larger operational umbrella. Acquisitions such as the SAP pickup of Business Objects last year are a sign of where the BI space is headed.

Eckerson believes the recent shift to bring together business intelligence systems and operational systems such as ERP makes sense. “There is definitely an opportunity for vendors to embed BI right into operational applications,” he says.

Pershing’s Nagappan also thinks that operationalizing BI is a no-brainer. He says his organization worked to do so years ago, leveraging tool sets from Information Builders and in-house work.

“It’s a very key area for us, and we have done this for many years,” he says. “I know many other people in business intelligence who focus on finance [intelligence] and a few other [intelligence areas], but at Pershing, we focused on operational and compliance [intelligence] so many years ago because compliance is a key aspect of our business.”

Brian Kilcourse, managing partner for Retail Systems Research, a Miami-based retail IT analyst firm, agrees that this “operationalization” of BI is currently one of the most significant intelligence trends sweeping through enterprises. He’s seen a lot of anecdotal evidence illustrating how a shift to embedding BI within operations gives workers out in the field better tools to drive day-to-day operations and gives customers better ways to make informed purchase decisions.

“We’re seeing a lot of companies injecting actionable information into operational processes in just-in-time fashion,” Kilcourse says. For example, in one case study he analyzed, Kilcourse witnessed Virgin Megastores offer its store managers an effective way to improve sales. BI systems were integrated with up-to-the-minute in-store sales so that managers could see how hit titles were selling in comparison to other hits with similar sales.

The intelligence match-up compared the first few days of release of one title with other releases that had similar sales starts, giving managers the ability to project sales going forward. It also offered actionable analysis that enabled workers to pair up overstocked albums with hot sellers in endcaps to move otherwise stationary products.
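The matching logic Kilcourse describes can be sketched in a few lines. This is a hypothetical illustration only, not Virgin's actual system: it compares a new release's first few days of sales against historical releases with similar launches, then projects total sales from the closest matches. All titles and figures are invented.

```python
def project_sales(new_title_daily, history, top_k=3):
    """Project total sales for a new release from comparable past launches.

    new_title_daily: unit sales for the title's first days on sale.
    history: dict mapping past title name -> its full daily sales list.
    Returns the mean total sales of the top_k most similar past releases.
    """
    n = len(new_title_daily)

    def start_distance(past_daily):
        # Squared-error distance over the first n days of each past title.
        return sum((a - b) ** 2 for a, b in zip(new_title_daily, past_daily[:n]))

    # Rank historical titles by how closely their launch matched the new one.
    comparables = sorted(history, key=lambda t: start_distance(history[t]))[:top_k]
    return sum(sum(history[t]) for t in comparables) / top_k


# Invented example: a new title three days into release.
history = {
    "Hit A": [120, 110, 100, 90, 80, 70],
    "Hit B": [125, 108, 95, 85, 80, 75],
    "Slow C": [30, 28, 25, 20, 18, 15],
}
print(project_sales([118, 112, 98], history, top_k=2))  # mean total of Hit A and Hit B
```

A production system would obviously use richer similarity measures and live point-of-sale feeds, but the core idea is just this nearest-neighbor comparison on early sales curves.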

Even though Virgin closed its retail stores for other reasons, Kilcourse says this application of operational BI is too good to be ignored. “They were basically doing a kind of a product mashup on the sales floor, in more or less real time, based on the signals they were getting from sales as they were occurring,” he says. “They were basically doing shelf resets based on the fact that one title was flying off the shelves, and they wanted the other one to fly with it.”

Kilcourse says these kinds of initiatives help organizations better adopt a sense-and-respond mentality. He also believes that embedding BI into operations provides very good back-end benefits.

“One of the big values is that the operational systems or processes can deliver to the business intelligence system some information that says, ‘This is what happened after you responded.’”

Here's what matters when selecting a product and supplier. Hint: Price isn't everything.

Original: http://www.smartertechnology.com/c/a/Technology-For-Change/Smarter-Buying-Business-Intelligence-and-Performance-Management-Software/?kc=STNL08202009STR6

Boil down the scores of industry polls of CXOs over the last few years, and you get a remarkably consistent message: We need to become a smarter, faster, more efficient (and cheaper) organization. No surprise this now-familiar mandate has been adopted by numerous industry vendors.

Nowhere is this truer than in the hot markets for business intelligence and performance management software and services. IDC forecasts global sales of BI software tools will grow from $7.5 billion in 2008 to $10.2 billion by 2013, or about 6.3 percent annually.

It makes sense: Who doesn’t want to make better organizational decisions? Run a tighter, more effective outfit? (Especially, as vendors are eager to point out, during an economic downturn.)

Finding the most appropriate decision-making software will take some thoughtful decision-making of your own. As with much in IT (and life), the answer to “what’s best?” depends heavily on your particular situation.

  • What strategic vendors have you committed to?
  • What’s your budget?
  • Staff expertise and availability?
  • Are you looking for an easy-to-use solution for a group of nontechnical business users? A master analytic engine for the entire enterprise?
  • Are you willing to try a lesser-known, smaller, but perhaps more affordable supplier?

Very basic stuff, but here as elsewhere, these are make-or-break considerations.

That said, the list below shows what your IT peers look for when buying BI/PM software and services. It’s from an e-mail and online survey of 1,380 qualified buyers of products in the space conducted for Ziff Davis Enterprise by Preference Research in January.

Respondents were asked whom they now buy from, and whom they’d consider and recommend. Predictably, the list of BI suppliers that respondents would strongly/very strongly consider for their next purchase was dominated by the Big Five in the space: Microsoft, Oracle, Business Objects, Cognos (IBM) and SAP. Predictably too, each vendor had particular strengths: IBM had the best combination of features and reputation, Microsoft the best user familiarity, and Oracle and SAP the best scalability.

On price, respondents all wanted cheaper products (stop the presses!). Fortunately, they also said price mattered less (ranked seventh overall) than good ROI (third).

Here’s what buyers told us matters in choosing BI/PM. How does that square with your checklist?

Attributes "Very Important" to Selecting Vendor for Short List:

  • Usability: e.g., intuitive interface: 71.8%
  • Performance: e.g., speed, stability, low error rates: 63.1%
  • Value for the dollar: i.e., good return on investment: 50.6%
  • Features: e.g., uniqueness, depth, superiority: 49.6%
  • Technical support/service: e.g., telephone, Web, on-site assistance: 49.3%
  • Scalability: e.g., ability to handle high-volume usage: 44.7%
  • Price: i.e., relatively inexpensive: 41.8%
  • Low cost of operation: e.g., ease of deployment/maintenance: 40.0%
  • Familiarity: e.g., convenient because you or others already use it: 18.9%
  • Reputation: i.e., positive word-of-mouth, news or reviews: 18.7%