As interest in building a modern data stack has increased, so has the number of business intelligence tools available to support it. Sometimes just selecting your analytics tool can feel as daunting as the actual implementation itself.

What is a business intelligence tool?

A business intelligence (BI) tool—also known as an analytics tool—is data visualization software used for reporting or analysis. The dashboards and reports commonly used for day-to-day analysis of key metrics and indicators are typical outputs of a business intelligence tool.

Before selecting a BI tool that is right for your environment, you need to understand how it works (or doesn't work), what is required to implement it, and how it will help your company meet business objectives. In this blog, we will discuss different use cases, the pros and cons of some of the more popular BI tools, and best practices to get started with your modern analytics implementation.

BI Tools in the Context of Organizational Use Cases

To help narrow your focus, think about who within the business will be using the BI tool and the ways in which they will use it. Here are some common use cases to consider:

Enterprise Reporting: Enterprise reporting is commonly defined as analytics that spans several business units within your organization (e.g., sales, marketing, and finance) for the purposes of internal reporting. Typically, this entails a wider deployment across multiple business users and data sources, and the expectation that users will rely on your data analytics tool for their day-to-day analysis. If your organizational analytics and requirements are centered around traditional enterprise reporting, then starting with the Gartner Magic Quadrant Leaders is a safe bet. Qlik, Tableau, and Power BI have market longevity in their respective platforms, robust cloud capabilities, active user communities, and R&D investment that allows for continual feature releases.
These can be considered the tried-and-true options with the industry experience and depth to back up most customer use cases.

Hybrid Vendor Approach: Organizations will often choose to build their solutions entirely on one vendor platform (e.g., an all-Microsoft stack). However, hybrid or modern data stacks (e.g., Fivetran/dbt/Snowflake) have grown increasingly popular. Hybrid approaches to building a modern data stack lend themselves to business intelligence software that is built to take advantage of your data warehouse layer and is not necessarily constrained to a specific vendor platform. We commonly see tools like Looker and Sisense fit this niche incredibly well. Both tools are positioned to fit well within a modern data platform and are great for self-service analytics, guided analytics, and embedded analytics.

Embedded Analytics: Embedded analytics places analytics at the point of need—inside a workflow or application—and makes it possible for your users to take immediate action without needing to leave the application. If your focus is on embedded or infused analytics, a Visionary like Sisense or a Niche Player like AWS QuickSight is a great fit. These two tools are major players in the embedded analytics space but can also work well in a traditional enterprise reporting setting. Both vendors place a heavy emphasis on their embedded capabilities, and it's core to their respective software platforms.

Analytics Novice: For organizations just starting their business analytics journey and not wanting to make a significant investment upfront, low-cost alternatives exist that make the barrier to entry relatively low. Two great options that fit this use case are Power BI Pro licensing and AWS QuickSight. Power BI Pro licensing provides a low-cost, per-user entry point for clients looking to pilot Power BI.
AWS QuickSight has several different licensing tiers that allow clients the flexibility to choose the option that works for them.

Top BI Tools Pros and Cons: There Is No One-Size-Fits-All

Organizational use cases help to narrow your focus in tool selection, but ultimately, knowing where each analytics tool fits in your respective data framework is of equal importance. Consider this your "cheat sheet," so to speak, on tool selection.

Power BI

Pros: Power BI has established a major foothold in the BI and analytics space in six short years. Its native integration with the Office 365 stack, flexibility in terms of back-end deployment options, and relatively cheap entry point with Power BI Pro licensing make it omnipresent in almost every tool evaluation I've been a part of over the last several years. Power BI can be deployed in a variety of ways in terms of back-end data integration (Import, DirectQuery, LiveConnection, or Dataflows), and that flexibility alone opens multiple avenues in architecting your solution.

Cons: Because it's a Microsoft product, Power BI is oriented toward customers with an investment in the Microsoft stack. While hybrid approaches (Snowflake hosted on Azure feeding Power BI) are feasible, the potential downside to Power BI is attempting to fit it into a cloud architecture (AWS or GCP) in which native integration is limited.

Qlik

Pros: In terms of the time-to-value proposition of building an application from the ground up to production, Qlik is a definite market leader.
Its associative data model remains, to me, one of the most powerful features in any analytics tool, period, and Qlik is making significant investments in its SaaS platform as well as in advanced analytics functions and search-based capabilities.

Cons: Qlik is one of the few remaining "independents" without a back-end cloud platform (unlike AWS with QuickSight, Azure with Power BI, or GCP with Looker), and it remains to be seen how that will affect its direction long term in an industry that is rapidly driving toward convergence. On the other hand, Qlik is at least attempting to bridge its lack of a native cloud platform with Qlik Replicate and Qlik Compose.

Looker

Pros: Looker is currently defined as a Challenger in the Magic Quadrant, but I fully expect it to be a Leader in 2022. Looker defines itself not as a traditional BI tool but rather as a modern data platform—and if your organization adheres to a modern data stack approach within the cloud, then Looker is an excellent fit. It is great for self-service, guided, and embedded analytics. One of its key differentiators is its semantic layer (known as LookML), which is essentially a way of writing and generating SQL. What this opens up is true data governance and version control with reusable models: make a change once, and it propagates to everything that depends on it. Coming from a developer background myself, I see this as a huge win in terms of the level of collaboration it opens up across your dev team.

Cons: Looker is built with a dependency on a database or data warehouse, so if your approach is to use a few Excel spreadsheets as your source and begin developing, then it is likely a poor fit. It is also important to note that performance will be tied to that of your data warehouse; query optimization and indexing within your data warehouse layer take on added importance with a Looker implementation.

AWS QuickSight

Pros: QuickSight is currently defined as a Niche Player, but expect this to change in the not-so-distant future.
QuickSight plays well in terms of its extremely affordable price point for customers starting their analytics journey, and in a short amount of time it has established a major foothold in the embedded analytics space.

Cons: The product has room to improve from an overall visualization and capabilities perspective, but I would expect it to grow and better compete in the enterprise reporting space in 2022. For now, its capabilities are best suited to embedded analytics and solution-based applications.

Sisense

Pros: Sisense is also one of the few remaining independents in the analytics industry with a relatively strong user base. Sisense can be deployed via its ElastiCube engine—its high-performance semantic layer that allows for blending multiple data sources—or via live queries that run directly against your data sources. Sisense is also built to handle most embedded analytics use cases and is truly one tool that can operate in both an enterprise and an embedded analytics capacity with ease.

Cons: As an independent, Sisense faces the same challenge of not having a back-end cloud platform as part of its solution stack. And while Sisense has a robust international presence, it does not have as wide a user base or the name-brand recognition in North America of some of the other major analytics vendors.

Tableau

Pros: Tableau can be considered the industry leader in terms of visualizations and front-end aesthetics. Tableau has a robust user base and ecosystem and a long-tenured standing within the analytics industry. Tableau Prep has helped to bridge the gap in the tool's ETL and data prep capabilities that was previously filled using Alteryx as an intermediary.

Cons: Tableau still has weaknesses in its overall embedded analytics capabilities and needs a strong data warehouse component or intermediary layer to scale effectively.
A development path outside of a traditional data warehouse is achievable, but only when tools such as Tableau Prep, KNIME, or Alteryx are utilized. This approach also means data logic is constrained within the tool(s) itself as opposed to a central, governed data warehouse.

Best Practices for Selecting the Right BI Tool for Your Environment

Once you've evaluated your business use cases and had the opportunity to review the pros and cons of the different business intelligence tools, here are some best practices to follow before making your final selection.

1.) Evaluate your data stack: An analytics tool should not be considered a standalone tool but rather a part of your overall data stack/data architecture. Locking yourself into an analytics tool that isn't a clean fit within your data architecture sets up development hurdles before the process even begins. So, if you're building your modern data stack on a cloud platform like Azure, AWS, GCP, or Snowflake, then you should consider how each analytics tool (Power BI, Qlik, Looker, Tableau) fits into these respective architectures. Data complexity and frequency of data refreshes are two excellent starting points for evaluating this question, so start by asking:

Will transformation logic occur within the analytics tool itself, or will an intermediary layer be needed for the heavy lifting? This will help you determine what's needed on the back end versus the front end and how your analytics tool will fit.

Will data need to be refreshed weekly, daily, or in real time/streaming? Again, this will help you determine your approach on the back end and which analytics tool fits the frequency use case.

When deciding on a cloud platform, consider:

Snowflake: There is no such thing as a unicorn, but Snowflake has assumed clear market leadership in the cloud data warehouse space, and that trend will continue into the foreseeable future. One of the big advantages of Snowflake is that it works well with most analytics tools.
We have successfully deployed Power BI, Qlik, Tableau, and Looker on top of Snowflake for clients.

Azure, AWS, and GCP, on the other hand, are not as flexible in terms of tool selection. While cloud vendors can play nicely and allow for non-traditional use cases, vendor cloud platforms will naturally lean in the direction of their own analytics tool. While we have seen clients use Tableau and Qlik with Azure, a typical Azure deployment will entail the use of Power BI on the front end, and a typical GCP implementation will generally leverage Looker on the front end. Amazon Redshift, however, can still be a relatively fluid situation—Tableau, Qlik, Looker, and Power BI are all options, but not without their technical gaps in implementation.

While not an absolute rule of thumb, be prepared for the possibility that a vendor cloud platform selection will also have significant bearing on your analytics tool of choice in keeping to a single-vendor stack. And if you're not on the cloud, or a data warehouse isn't part of your current or future state roadmap, don't feel left out.

No—or lightweight—data warehouse: Despite market pressure and messaging, there is still a very real and very sizable segment of the market where a cloud-first strategy does not fit. Constraints such as relatively static data volumes, budgetary pressure, or trained staff already on hand to maintain existing solutions are all reasonable grounds to defer a cloud migration. In these situations, analytics tools are often called upon to function as both the data transformation and the data visualization layer. Qlik and Power BI are the two most effective analytics solutions you can leverage here. Both allow for robust data transformation capabilities out of the box; both can mimic the ingestion, replication, transformation, and data warehousing layers popularly employed in a modern data stack; and both offer best-in-class visualizations on the front end.

2.) Run a proof of concept/tool evaluation: An analytics tool is an investment, and it's important to know how that investment will work for you. We encourage clients to go through an evaluation phase that entails either building a proof-of-concept application or conducting a tool evaluation to understand the level of effort in implementation and the associated cost. Doing the homework up front allows you to understand how your future state analytics tool will function and scale, and what the cost may be.

3.) Review the tool's platform growth: When picking an analytics tool, the future state roadmap and anticipated growth of that platform should be a key consideration. A tool with a limited product roadmap should be considered a red flag. I've had a client elect to choose a tool on a clear downswing because they were given a sweetheart deal on licensing. To no one's surprise, a year later they had to go through the process of ripping out that tool, as it did not meet their future state growth goals.

Selecting a business intelligence tool does not need to be a daunting task. Just be sure to go into the selection process knowing your use case and how the tool will be used to meet your analytics needs.

4.) Consider the intangibles during your evaluation: Once a technical evaluation of your development approach has been completed, it's likely the tool selection process will have been pared down to two or three front-runners. This is where specific organizational requirements and intangibles can factor into the selection process. Consider this your gate check for the final evaluation.

Data Visualization: Analytics vendors are increasingly driving toward convergence in terms of front-end capabilities. Search-based functionality, alerting, ad-hoc reporting, and geospatial mapping are all common out-of-the-box capabilities that were once fragmented across select vendors.
While each tool functions differently in terms of navigation, convergence means you're going to get a similar user experience in terms of aesthetics and out-of-the-box visualization choices across all the major analytics vendors. To differentiate, look for the compelling edge cases that are specific to your organization. If the ability to connect underlying datasets to Excel, create automated alerting thresholds on KPIs, or add collaborative notes within visuals is important to you, find the vendor that stands out for those use cases and weight them in your selection accordingly.

Licensing: Everyone likes getting a good deal, but be aware of licensing constraints and how they affect future state scalability before you commit to a software purchase. Software vendors are increasingly driving subscription costs downward for mass-market adoption, and it's easier than ever to adopt starter licensing as a means to pilot an analytics tool. My advice is to be aware of the constraints that come with starter licensing prices: premium features, or a significantly scaled-up implementation over time, will incur a corresponding cost. Know the price points you will hit as you scale up, even if that is a year away in your development roadmap.

User and Developer Learning Curve: It goes without saying, but when you adopt an analytics tool, an expert will need to build your end state solution and users will need to be trained on how to use it. With a talent war in full force within the data and analytics industry, it's important that you choose a solution that can be widely developed on and adopted. A niche tool means a niche knowledge base will be needed for development and training, and it will likely command a premium in terms of talent cost to support development.