What to look for in modern business intelligence solutions

Jérôme Chabrillat
Published in ZS Associates · Oct 19, 2023 · 10 min read

By Jérôme Chabrillat, Romana Rao Malempati, Anuj Sharma and Saksham Khatri


Business Intelligence platforms have matured over the last 20 years with the objective of helping users explore, visualize and make sense of the signals contained in data. As data volumes and diversity explode, data literacy expectations grow and AI brings new ways to solve problems, BI platforms are adapting and evolving their capability sets.

Well-established platforms are transitioning, adding “Augmented Analytics” features to their solutions while being careful not to break what made them successful. New entrants are trying to disrupt the market by placing Augmented Analytics features at the core of their platform design. Technology providers’ marketing teams are trying to shape the market to their advantage and position their features as “must-haves” to potential buyers. Add GPT and large language models (LLMs) to the mix and it is easy to see why purchasers may be confused.

What capabilities are important? What is real vs. hype? What are the things I should look for that I am not thinking about? What is a shiny new toy vs. a true value-add feature? What truly differentiates these platforms?

Answers to most of these questions will differ based on your organization’s unique needs and priorities. Business Intelligence evaluation frameworks must evolve to thoughtfully add Augmented Analytics requirements to the existing mix of Enterprise and Reporting requirements.

To guide clients through their selection process, we work with all stakeholders to 1) capture functional and enterprise needs and 2) use this input to customize the evaluation framework accordingly.

The following guide provides a comprehensive list and explanation of functional and enterprise requirements. Its goal is to ensure all facets are considered within the context of your needs.

1. Foundational capabilities

Visualization, reporting and self-service experience

The core functionality of business intelligence tools: a set of capabilities to create charts and organize them into dashboards and reports.

Data visualizations: All tools provide the ability to easily construct all common charts (line, bar, pie, etc.), covering a majority of user needs. Some tools like Tableau are known to be the gold standard in data visualization for two reasons: common charts are designed by default with best practices in mind, and both the library and customization features allow users to build very complex visualizations. As you evaluate solutions, look at the visualizations available in the standard library and whether they are enough for your needs. If there are gaps for edge cases, look for the availability of extensions.

Reporting: The main difference in reporting / dashboard editing approach is pixel-perfect vs. responsive. A pixel-perfect approach designs for a specific screen size, with very precise positioning of charts and objects (some tools offer multiple pixel-perfect layout options based on the display size). The responsive approach is inspired by website design, where charts and objects reposition themselves (and sometimes change appearance) based on the display size.

No/low-code customization experience: If your use case involves self-service, the asset creation experience is a key element to assess, as we know that a steep learning curve leads to low adoption. How easy is it to understand the interface, create accurate analyses and customize the display? Does the no-code interface go deep enough for users to create meaningful deliverables, or will they just use it to extract data and finish in Excel? Does the interface show results as reports are being created, or do users need to execute steps before they can see the data? A no/low-code interface will typically give users less control over the final appearance, which can be limiting for developers if there is no other option to create very specific content.

Collaboration

Ability for multiple users to comment and share insights in and outside of the BI application. Ability for multiple creators to co-edit reporting assets.

Sharing and commenting: Data analysis is even more powerful when it involves cross-functional teams. Analysts need to share and communicate their deliverables with other analysts to test their approach, and with stakeholders to test their findings. Most tools provide sufficient security to manage sharing within the platform. If your use case requires more, look at sharing outside the platform and how it is managed — for instance, the experience and quality of e-mail shares, PowerPoint/PDF exports, etc. Several tools offer integration with Slack and Teams, but with very different experiences (posting a link vs. natural-language prompts within a channel). Many demos showcase Slack integration, but Teams is the enterprise choice in many companies, maybe yours. Make sure to check what is required to enable these channels, both from your security teams and from the technology vendors.

Alerting: Alerts are a great way to bring users to the application when an important data event happens. While all tools offer the ability to monitor a KPI and send alerts (e-mail, notification in app, etc.) once thresholds are met, more advanced features can prove extremely valuable to ensure the alerts are relevant and don’t become a nuisance. Look for advanced statistical capabilities such as process control models and trend breaks, and for the ability for each user to personalize the alert experience.
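
To make “smarter than a fixed threshold” concrete, here is a minimal sketch of a statistical control check that only raises an alert when a KPI leaves its expected range. The KPI, history and three-sigma rule are assumptions for illustration, not any vendor’s alerting logic.

```python
# Minimal sketch: flag a KPI only when it breaks statistical control limits,
# rather than every time it crosses a fixed threshold. Data and KPI are invented.
from statistics import mean, stdev

def out_of_control(history, latest, sigmas=3.0):
    """Return True if `latest` falls outside mean +/- sigmas * stdev of `history`."""
    mu, sd = mean(history), stdev(history)
    return abs(latest - mu) > sigmas * sd

weekly_sales = [102, 98, 105, 99, 101, 97, 103, 100]  # hypothetical KPI history
if out_of_control(weekly_sales, latest=132):
    print("Alert: weekly sales outside expected control limits")
```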

Co-creating: As you review the self-service experience, it may be useful to include scenarios where multiple analysts co-create a single report or analytical asset, including data management, modeling and user interface configuration. Look for version control management and change traceability. What workflows can be configured to manage publication?

Mobile and embedded

The experience users get when viewing the application on a tablet or smartphone (and potentially wearables), whether in a browser or a native app, and the ability to deliver all or part of a dashboard / analytics application within a separate application.

Mobile app features: If mobile is an important use case, it is critical to understand and experience the interface on these devices, as it can differ greatly from desktop / web browser interfaces. Several tools have important features that are not enabled in the mobile app.

Offline: Not all applications offer offline capabilities. Those that do may not have all features available offline. In a world where connectivity is taken for granted, it is important to understand your use case and whether the tool’s capabilities are sufficient.

Phone experience: Our phone is the most personal device we all have, and users will engage with their phones more and more frequently to consult data and insights. Solutions with a phone-centric design offer features such as phone-specific layout and formatting options, navigation inspired by consumer applications (feeds) and intuitive navigation patterns.

Embed: When embedding into another application, let’s say a web page or another business tool like Veeva, what happens to the layout? Is the information still as legible? Do we lose any functionality? How easy is it to configure the integration, including security and authentication?

2. Augmented analytics capabilities

Auto-insight generation

Machine Learning (ML) and advanced statistical models identify insights for users, such as outliers, trend breaks, performance drivers, etc.

Depth of analytics: While most tools offer auto-insights, the depth of analysis and the user experience can vary greatly. Some tools “boil the ocean” and deliver findings in a report for the user to review and pick from. Others provide more specific analyses (performance drivers, comparisons, etc.) but are more complex to set up. It is important to understand your target audience, their goals and what they will get the most value from. A real-life use case will be extremely helpful here to distinguish the capabilities provided and whether they will really help solve your problems.

Model integration: The ability to integrate custom models created in R, Python, etc.
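
Vendors implement this in different ways (built-in script extensions, calls to an external service, etc.). As a neutral sketch under assumed names, the example below exposes a custom Python model as a small HTTP scoring endpoint that a BI tool could be configured to call; the Flask app, route and stand-in model are illustrative, not any specific vendor’s integration API.

```python
# Hypothetical pattern: expose a custom Python model as a scoring service that a
# BI platform could call. The app, route and model below are illustrative only.
from flask import Flask, request, jsonify

app = Flask(__name__)

def score(features):
    # Stand-in for a real model (e.g., one trained with scikit-learn).
    return 0.8 * features.get("recent_sales", 0) + 0.2 * features.get("visits", 0)

@app.route("/score", methods=["POST"])
def score_endpoint():
    payload = request.get_json(force=True)
    return jsonify({"prediction": score(payload)})

if __name__ == "__main__":
    app.run(port=5000)
```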

Natural Language Querying (NLQ) and search

The ability for the tool to understand terms in English (or other languages), interpret what the user is looking for, and convert it into a data query.

NLQ vs. search: NLQ (i.e., ask a question as you would ask a person, such as “what were our product sales last year?”) vs. search (i.e., use keywords and their synonyms, such as “product sales last year”) — neither is inherently better than the other (users tend to drop the “what were…” after a few days); this is an experience decision.

Synonyms: The ability to augment metadata with synonyms (e.g., physician, HCP, prescriber, writer, customer) and to augment reference data with synonyms (e.g., product names with their nicknames or abbreviations).
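
To make the requirement concrete, here is a minimal, hypothetical sketch of the kind of synonym mapping an NLQ layer maintains: user terms resolve to canonical field names, and product nicknames resolve to canonical reference values. The mappings shown are invented.

```python
# Hypothetical synonym maps: user terms resolve to canonical metadata and reference data.
FIELD_SYNONYMS = {
    "physician": "hcp", "prescriber": "hcp", "writer": "hcp", "customer": "hcp",
}
PRODUCT_SYNONYMS = {
    "brandx": "Brand X 10mg", "bx": "Brand X 10mg",  # nicknames / abbreviations
}

def canonicalize(term):
    t = term.strip().lower()
    return FIELD_SYNONYMS.get(t, PRODUCT_SYNONYMS.get(t, t))

print(canonicalize("Prescriber"))  # -> "hcp"
print(canonicalize("bx"))          # -> "Brand X 10mg"
```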

Response time and input help: does the tool have auto-fill / auto-correct features? Does it respond as the user is typing or only once the full question is expressed?

Context understanding and retention: Does the tool understand and/or remember previous questions to avoid repetition and ensure continuity? Does the solution know, or can it be configured to select, the right market definitions and units when asked about market share?

Natural Language Generation (NLG)

Generative AI that converts data into understandable sentences. Output can be single sentences or fully formatted paragraphs.

Level of control: Generic language models are easy to deploy and increasingly come out of the box with BI solutions, but they are usually very descriptive and of limited scope and value. More elaborate NLG comes from large language models or from custom, purpose-built linguistic models (deterministic AI). The latter provide more control over the language generated, limiting compliance and inaccuracy risks, but require expert configuration.
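
As a rough illustration of what “deterministic, purpose-built” means in practice, the sketch below assembles narrative from vetted templates and computed values, so the wording is fully controlled and auditable. The metric name, values and thresholds are hypothetical.

```python
# Hypothetical deterministic NLG: vetted templates filled with computed values,
# so the generated language is fully controlled. Inputs are invented.
def describe_trend(metric, current, prior):
    change = (current - prior) / prior * 100
    direction = "grew" if change > 0 else "declined"
    if abs(change) < 1:
        return f"{metric} was essentially flat versus the prior period."
    return f"{metric} {direction} {abs(change):.1f}% versus the prior period."

print(describe_trend("Brand X sales", current=1.25e6, prior=1.10e6))
# -> "Brand X sales grew 13.6% versus the prior period."
```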

Data storytelling: The ability to present and combine data, visualizations, insights and sentences in ways that resonate with the target audience (i.e., communicate with clarity). Look for features that let authors embed NLG output or annotate charts and visualizations for greater context. Write your own definition of data storytelling and present it as a requirement to technology vendors. Expect more than just arranging a few charts on a page.

3. Data and administration

Data integrations and preparation

The platform’s ability to connect natively to various data sources, whether on-premise or on-cloud, and the associated features to manipulate the data.

Connectivity: Most tools have native connectivity to industry-leading data platforms, both on-premise and on-cloud. If not native, APIs + ODBC make this capability fairly standard. If your company uses custom or niche software, a validation may be required. It is good practice to list your source systems and establish the desired integration protocol and frequency (batch vs. live, etc.) for each.
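
One lightweight way to run that validation is to script a connectivity smoke test against each listed source. The sketch below uses SQLAlchemy connection URLs as stand-ins; the system names, URLs and credentials are hypothetical, and real drivers would be required.

```python
# Hypothetical connectivity smoke test across listed source systems.
# Connection URLs are placeholders; real credentials and drivers would be needed.
from sqlalchemy import create_engine, text

SOURCE_SYSTEMS = {
    "warehouse": "snowflake://user:pass@account/db",    # example URL format only
    "crm_extract": "postgresql://user:pass@host/crm",
}

for name, url in SOURCE_SYSTEMS.items():
    try:
        with create_engine(url).connect() as conn:
            conn.execute(text("SELECT 1"))
        print(f"{name}: reachable")
    except Exception as exc:
        print(f"{name}: failed ({exc})")
```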

Data prep: Once connected to a data source, the data often needs to be manipulated for optimal use in the analytics interface. Understanding the data manipulation requirements and what can be done within the tool will drive important design decisions (e.g., create a specific publication layer at the source vs. do more within the tool itself). Considerations should include the ability to integrate data coming from different systems, the types of data manipulations possible, and the interface used to implement these manipulations (drag-and-drop vs. scripting). Note that some tools require an add-on license to enable some data manipulation capabilities.

Data modeling requirements: Understand the models that work optimally for each tool and how compatible they are with your existing data assets and functional needs. Some tools work better with a star schema (despite their support of all schema types). What mechanisms exist to resolve common modeling challenges, like chasm traps?
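
To illustrate one such challenge: a chasm trap occurs when two fact tables join through a shared dimension and a naive join multiplies rows, inflating totals. The usual resolution is to aggregate each fact to the shared grain before combining, as in this pandas sketch with invented data.

```python
# Chasm trap illustration: joining two fact tables through a shared dimension
# multiplies rows; aggregate each fact to the shared grain first. Data is invented.
import pandas as pd

sales = pd.DataFrame({"product": ["A", "A", "B"], "sales": [100, 50, 80]})
calls = pd.DataFrame({"product": ["A", "A", "A", "B"], "calls": [1, 1, 1, 1]})

# Naive join inflates sales for product A (each sales row repeats per call row).
naive = sales.merge(calls, on="product")
print(naive["sales"].sum())  # 530, not the true 230

# Correct: aggregate each fact to product level, then combine.
fixed = (sales.groupby("product", as_index=False)["sales"].sum()
              .merge(calls.groupby("product", as_index=False)["calls"].sum(), on="product"))
print(fixed["sales"].sum())  # 230
```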

Enterprise readiness

A set of capabilities, usually with little influence on the user experience, that ensure the platform is deployed and managed safely and efficiently.

Architecture: A key architecture question is whether the analytics tool stores the data in-memory or queries the data source “live”. If querying live, does it query the actual client data warehouse, or does the data need to be “moved” to a specific place?

Security: What single-sign-on (SSO) protocols are supported? How are user groups managed? How do you control access to documents / assets? How is row-level security managed? How are exports tracked and managed?
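
As a concrete (and hypothetical) illustration of row-level security, the sketch below filters a dataset to a user’s assigned territories before anything is rendered; how entitlements are stored and enforced differs by platform, and the mapping and data here are invented.

```python
# Hypothetical row-level security: filter data to the requesting user's territories
# before it reaches any visualization. Entitlements and data are invented.
import pandas as pd

ENTITLEMENTS = {"jane.doe": {"Northeast", "Midwest"}, "raj.p": {"South"}}

sales = pd.DataFrame({
    "territory": ["Northeast", "South", "Midwest", "West"],
    "sales": [120, 90, 75, 60],
})

def rows_for(user, df):
    allowed = ENTITLEMENTS.get(user, set())
    return df[df["territory"].isin(allowed)]

print(rows_for("jane.doe", sales))  # only Northeast and Midwest rows
```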

Deployment models: Is the solution offered as software-as-a-service (SaaS), on-premise or on a private cloud? What is the upgrade protocol and frequency? What is the support model for each?

Licensing models: Estimate costs based on your expected data and computing loads and on the number of users, both at launch and as you plan for expansion. The most common licensing model is by user (sometimes with different tiers for different user types), but capacity-based models are becoming more frequent. Understand the discounting capacity of the vendor.
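
A quick way to compare models is to cost both against your expected user counts and growth; the prices and counts below are invented purely to show the arithmetic.

```python
# Back-of-the-envelope comparison of per-user vs. capacity licensing.
# All prices and user counts are invented; plug in real vendor quotes.
def per_user_cost(creators, viewers, creator_price=800, viewer_price=150):
    return creators * creator_price + viewers * viewer_price

def capacity_cost(nodes, node_price=60000):
    return nodes * node_price

for year, (creators, viewers, nodes) in enumerate([(50, 500, 1), (80, 1500, 2)], start=1):
    print(f"Year {year}: per-user ${per_user_cost(creators, viewers):,} "
          f"vs. capacity ${capacity_cost(nodes):,}")
```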

Governance

Functionalities allowing super users (typically administrators) to track platform usage and manage access to content.

User access management: Typically, user groups are defined and assigned rights to specific content or groups of content. Some tools provide deeper controls than others, while some integrate with existing security infrastructure such as Active Directory, making administration easier and consistent across systems.

Asset catalogs: Data catalogs, metrics catalogs, analytics catalogs and other catalog functionalities that help users easily find the information they seek. Look for recommendation engines to help users find information faster. From an administration perspective, look for the ability to easily manage content availability from a centralized place.

Usage management: A set of reports and views to monitor asset usage — for instance, how often users ask a specific data question. Look for out-of-the-box reports, which are often sufficient. Also evaluate the depth of data captured and how accessible it is for analytics. Some tools track object-click-level statistics but don’t report on them out of the box. These statistics allow fact-based decisions on which reports and assets to prioritize.
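
When out-of-the-box reports fall short, the raw usage logs are often queryable directly. As a simple sketch, the example below ranks assets by distinct active users; the log schema (user, asset, timestamp) and data are assumptions for illustration.

```python
# Hypothetical usage-log analysis: rank reports by distinct active users.
# The log schema (user, asset, viewed_at) is an assumption for illustration.
import pandas as pd

usage = pd.DataFrame({
    "user": ["a", "b", "a", "c", "b", "a"],
    "asset": ["Sales KPI", "Sales KPI", "Field Report", "Sales KPI", "Field Report", "Sales KPI"],
    "viewed_at": pd.to_datetime(["2023-09-01", "2023-09-02", "2023-09-03",
                                 "2023-09-10", "2023-09-12", "2023-09-20"]),
})

active = (usage.groupby("asset")["user"].nunique()
               .sort_values(ascending=False)
               .rename("distinct_users"))
print(active)
```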

How to use this guide

You likely noticed that, while this guide is very thorough, it does not state exactly what is required, but provides direction for you, the purchaser, to create your own detailed framework.

For each dimension in the framework, write a one- to three-sentence description of what your organization needs today and over the next two years.

Challenge and validate these requirements with a critical eye, to ensure the needs are real (and not nice-to-haves). One way to do this is to add a tangible business value description next to each requirement.

Prioritize the requirements. This can be done in multiple ways: ranking, weighting, H/M/L, etc.
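
If you take the weighting route, the scoring mechanics can be as simple as the sketch below; the requirement weights, vendor names and ratings are placeholders to be replaced with your own.

```python
# Hypothetical weighted scoring of vendors against prioritized requirements.
WEIGHTS = {"NLQ quality": 0.3, "Mobile experience": 0.2, "Governance": 0.25, "Cost": 0.25}
SCORES = {  # 1-5 ratings per vendor, per requirement (placeholders)
    "Vendor A": {"NLQ quality": 4, "Mobile experience": 3, "Governance": 5, "Cost": 2},
    "Vendor B": {"NLQ quality": 3, "Mobile experience": 5, "Governance": 3, "Cost": 4},
}

for vendor, ratings in SCORES.items():
    total = sum(WEIGHTS[req] * ratings[req] for req in WEIGHTS)
    print(f"{vendor}: weighted score {total:.2f}")
```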

When inviting vendors, consider adding examples to the high-level requirements. Include your run-of-the-mill analysis requirements as examples, as well as a few stretch assignments that have proven difficult in the past.

Read more insights from ZS.
