The data and analytics market moves so fast, it’s hard to gauge what’s really trending and what’s just a passing fad. Luckily, data and analytics is our business and we’ve identified where you should place your focus in 2022.

What are the key concepts and solutions within data and analytics this year? What technologies are businesses adopting and why? And where should you place your attention to help transform your business and move at the pace needed to meet the demand that’s out there?

In this blog—with the help of experts in this space—we answer all those questions to help you navigate the data and analytics trends this year and better decide where you should place your focus and why.

Here are the Top 8 Data and Analytics Trends to Think About This Year

Trend #1: Embedded Analytics

Embedded analytics was a hot topic last year, is still trending this year, and will likely continue into next—especially as more businesses begin to identify how it can benefit their users. Embedded analytics places analytics at the point of need—inside of a workflow or application—and makes it possible for your users to take immediate action without needing to leave the application.

Companies are looking at embedded analytics to provide not only a better data experience for their business users, but to also enable faster decision-making, reduce user error, optimize a mobile workforce, address licensing constraints, and much more.

The streamlined nature of embedded analytics makes it possible to get more value out of your organization’s data for both internal and external users.

There is currently a low barrier to entry for embedded analytics, as most modern analytics solutions (Power BI, Looker, Qlik, Tableau, etc.) have embedded capabilities baked into the tool or offer embedded-only licensing tiers. Previously, embedding one of these solutions required stretching the capabilities of those tools beyond what was recommended, but that is no longer the case.

We built an embedded solution using Tableau that integrates a variety of data types from many sources into an external-facing site that users can access via SSO (single sign-on) without any additional authentication or sign-on required.
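As a rough illustration of the SSO handoff, the sketch below builds a short-lived, signed embed URL that a host application could render for a user who is already authenticated. The host, parameters, and signing scheme here are hypothetical, not any vendor's actual API; Tableau and similar tools have their own connected-app and trusted-ticket mechanisms.

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

def build_embed_url(base_url, dashboard_id, user, secret):
    """Build a signed, short-lived embed URL (illustrative scheme only)."""
    params = {
        "dashboard": dashboard_id,
        "user": user,
        "expires": str(int(time.time()) + 300),  # link valid for 5 minutes
    }
    payload = urlencode(sorted(params.items()))
    # HMAC signature lets the analytics server verify the host app issued this link
    signature = hmac.new(secret.encode(), payload.encode(), hashlib.sha256).hexdigest()
    return f"{base_url}/embed?{payload}&sig={signature}"

url = build_embed_url("https://analytics.example.com", "sales-overview", "jdoe", "s3cret")
```

The embedding application would drop this URL into an iframe, so the user lands directly on the dashboard with no second login prompt.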

Trend #2: Reverse ETL

Reverse ETL is not a new concept—just a new phrase. Traditional ETL (extract, transform, and load) takes raw data from multiple disparate operational systems, such as Salesforce and Marketo, transforms it into meaningful and usable data, and stores it in a data warehouse, such as Snowflake, creating a single source of truth. Once the data is in the data warehouse, it’s typically then only used for analytics purposes going forward.

Reverse ETL is the process of copying or syncing that rich, multi-sourced, transformed data from the data warehouse into operational systems so it can be used by business users in their day-to-day workflows and processes. This concept is similar to embedded analytics, but the level of integration is very different. Where embedded analytics integrates clean, transformed data into an operational system in the form of a visualization, reverse ETL integrates at the data level so it can be used natively by operational systems and their users. For example, a “customer master record” in the data warehouse may be sourced from a CRM, an ERP, and a marketing automation platform. Synchronizing this enriched and clean customer data back to the operational source systems lets users get the entire picture of a customer natively in each platform. Historically, reverse ETL was only accomplished with custom code, but now tools such as Hightouch and Census are making reverse ETL easier than ever.

You’re already doing the hard work of consolidating and governing your data and making it valuable for analytical purposes. Reverse ETL lets you take advantage of the work you’ve already done and helps improve data quality within your operational systems too.
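To make the flow concrete, here is a minimal reverse-ETL sketch. The `CustomerMaster` shape and the CRM callback are stand-ins for whatever your warehouse and operational system actually expose; tools like Hightouch and Census handle this plumbing (and incremental syncs, retries, and field mapping) for you.

```python
from dataclasses import dataclass

@dataclass
class CustomerMaster:
    """An enriched record assembled in the warehouse from CRM, ERP, etc."""
    customer_id: str
    lifetime_value: float
    churn_risk: str  # enrichment computed in the warehouse, not the CRM

def sync_to_crm(rows, crm_update):
    """Push each warehouse record back into the operational system via upsert."""
    synced = 0
    for row in rows:
        crm_update(row.customer_id, {
            "lifetime_value": row.lifetime_value,
            "churn_risk": row.churn_risk,
        })
        synced += 1
    return synced

# Usage with an in-memory stand-in for a CRM API client:
crm = {}
rows = [CustomerMaster("c1", 1200.0, "low"), CustomerMaster("c2", 90.0, "high")]
count = sync_to_crm(rows, lambda cid, fields: crm.update({cid: fields}))
```

After the sync, a sales rep sees warehouse-computed fields like churn risk directly on the customer record in their own tool.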

Trend #3: Data Observability 

Data observability is a term that covers a broad category of activities that allow you to maintain a constant pulse—monitor, track, and triage any breakdowns in your data pipelines and workflows—in near real time. This is really beneficial as organizations need—and want—to quickly view and understand the health of the data in their organization, not just from a data quality perspective, but also from the perspective of ensuring proper flow and use.

With the availability of more mature data platforms and the increasing growth of the modern data stack, organizations need to automate data governance wherever possible across the entire data lifecycle. Business users need to trust that the data they use to make decisions is current and reliable, and the expanded data observability capabilities within transformation toolsets such as dbt are helping to make that easier in modern data stacks.

Data observability tools provide real-time pipeline visibility, lineage tracking, change detection, secure data feeds, and pipeline alerting—all things that improve data quality and promote a data-driven culture.

You don’t want to wait to find out that there are problems in your data pipelines. The more proactive you are, the less downtime you will experience and the more efficient your data users can be.
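A minimal freshness check illustrates that proactive idea: flag any table whose last successful load is older than its expected update interval. The table names and thresholds here are invented, and real observability tools layer lineage tracing and anomaly detection on top of simple checks like this.

```python
from datetime import datetime, timedelta

def stale_tables(last_loaded, max_age, now):
    """Return tables whose most recent load is older than the allowed age."""
    return sorted(
        table for table, loaded_at in last_loaded.items()
        if now - loaded_at > max_age.get(table, timedelta(hours=24))  # default SLA: 24h
    )

now = datetime(2022, 6, 1, 12, 0)
last_loaded = {
    "orders": now - timedelta(hours=2),     # fresh
    "customers": now - timedelta(days=3),   # stale: pipeline likely broken
}
max_age = {"orders": timedelta(hours=6), "customers": timedelta(hours=24)}
alerts = stale_tables(last_loaded, max_age, now)  # -> ["customers"]
```

Running a check like this on a schedule and routing the result to a chat or paging tool means you hear about a broken pipeline before your business users do.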

Trend #4: Headless BI

Headless BI is a hot topic, especially among data engineers. The rationale behind headless BI is that all data logic, transformations, dimensions, aggregations, etc. should be the domain of a central, universal semantic layer and that should be decoupled entirely from your presentation layer. By having this universal semantic layer established, any BI or visualization tool can connect and inherit all the data points from this single source of truth.

With a headless BI approach, data visualization and analytics tools can easily be interchanged as the semantic layer is decoupled from the tool itself. (photo credit: Medium)

In practice, the idea of a well-built and accessible data warehouse is effectively headless BI. It’s only a trending topic of conversation now because in most analytics implementations the pendulum has swung back in the direction of data engineering work happening in the data warehouse as opposed to building the logic into the BI tool itself. People see the utility in central data governance, and the prevalence of tools and platforms such as dbt and Snowflake have driven this recent interest.

Headless BI can benefit your organization because it provides flexibility to move or reorient to another—or multiple—BI tools.

If all your data logic is constrained to a universal semantic layer, then you can simply layer in any BI or reporting tool based on your organizational preferences. If later you realize the chosen BI tool isn’t a right fit, then you can leverage another without unwinding layers of data logic within the tool itself.
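A toy semantic layer shows the decoupling: metric logic is defined once in a central place, and any presentation tool asks the layer to compile the query instead of re-implementing the calculation itself. The metric names and schema below are purely illustrative.

```python
# Central, tool-agnostic metric definitions: the "single source of truth"
METRICS = {
    "total_revenue": {"expr": "SUM(amount)", "table": "fact_orders"},
    "order_count": {"expr": "COUNT(*)", "table": "fact_orders"},
}

def compile_metric(name, group_by=None):
    """Turn a governed metric definition into SQL any BI tool can run."""
    m = METRICS[name]
    select = f"{m['expr']} AS {name}"
    if group_by:
        return f"SELECT {group_by}, {select} FROM {m['table']} GROUP BY {group_by}"
    return f"SELECT {select} FROM {m['table']}"

sql = compile_metric("total_revenue", group_by="region")
```

Because every dashboard asks `compile_metric` for its numbers, swapping the BI tool changes only the presentation layer, never the definition of revenue.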

Trend #5: Data as a Service and Data Monetization

Data as a service and data monetization have, in theory, been around for decades. Consumer reports and surveys have long been published or sold on the open market by various entities. But data monetization is an option for more organizations today because modern data and analytics platforms make it easier than ever to “white label” an analytics solution and develop a product offering, truly maximizing the value of company data.

Companies—particularly those with a consumer services business model—likely have valuable data assets that can command market value or a high degree of interaction if offered up as a service, product, or subscription. As we increasingly become a data-driven economy, having access to valuable insights that can give you an edge will be critical. Finding a mechanism to take advantage of this data will be a means to drive revenue for some organizations, and the ability to do so has never been easier.


We created a customer portal for one of our customers—a legal services provider—so that they can provide subscription-based access to insights for their clients. The company had a host of benchmarking data at their disposal that they could compare against client data to measure litigation spend and help reduce cost. This was a valuable data asset that they were able to turn into a product and monetize.

If you have the data, let it work and make money for you. Ideally, you can use a product-based, data monetization approach to offset the cost of software licensing.

Trend #6: Data Lakehouse

The data lakehouse combines the best parts of a data warehouse and a data lake into one solution.

It takes the flexibility, scale, and cost efficiency of a data lake and adds the data management and structure of a data warehouse, supporting compute tasks as well as storage needs.

This enables users to do everything from business intelligence and SQL analytics to data science and machine learning on a single, open, and flexible platform.

This is beneficial because not all data needs to be highly governed and moved into a data warehouse, and this approach allows you to avoid that. You can leverage your data lake like a data warehouse by having capabilities to structure and query data for analytics consumption in the data lakehouse. This helps to manage costs, speed up development, and enable very tactical data pipelines. One tool that can help you do that is Databricks with their Delta Lake platform.
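To see why a table format makes a lake behave like a warehouse, here is a heavily simplified sketch of the core idea behind a transaction log like Delta Lake's: each commit records which data files make up the current table version, so query engines get a consistent, warehouse-like view over plain files. The real log format is far richer; this only illustrates the concept.

```python
import json

def commit(log, added, removed=()):
    """Append a new table version describing which data files were added/removed."""
    log.append(json.dumps({"add": list(added), "remove": list(removed)}))

def current_files(log):
    """Replay the log to find the set of files that make up the current table."""
    files = set()
    for entry in log:
        op = json.loads(entry)
        files |= set(op["add"])
        files -= set(op["remove"])
    return sorted(files)

log = []
commit(log, ["part-001.parquet", "part-002.parquet"])          # initial load
commit(log, ["part-003.parquet"], removed=["part-001.parquet"])  # update/compaction
files = current_files(log)
```

Because readers always replay the log rather than listing raw files, an in-flight write never exposes a half-updated table, which is the consistency guarantee a warehouse normally provides.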

A new paradigm: The data lakehouse is blurring the lines between traditionally independent solutions: data warehouses, data lakes, and compute resources, and creating a one-platform solution for modern data. (photo credit: Databricks)

Trend #7: Managed Services

Managed services are in high demand and will continue to be so this year. As more organizations embark on transforming their business by moving to the cloud, they quickly realize that the time, effort, and expertise to manage these new environments don’t just disappear. Cloud vendors manage many parts of a cloud environment, but you’re still responsible for building and maintaining your own applications and platforms.

As a result, organizations are turning to managed services from a third party to handle maintenance and support of applications so they can focus on core business objectives and worry less about managing applications and platforms. Managed services typically include upgrades, security patches, etc., and are ideal for organizations that lack the in-house expertise or resources to do this themselves.

The cost of managed services is often easily justified when organizations can focus their time and resources on meeting core business objectives and revenue generating activities.

It’s also beneficial to look at managed services to help offset the costs and stress related to hiring and retaining talent—especially in light of the current worker shortage.

Trend #8: Serverless Technologies

Serverless technologies are the best way to make sure your cloud is as “cloudy” as possible. Serverless is exactly what it sounds like: no servers to manage or maintain. But it actually goes even further than that; there aren’t even any applications or platforms to maintain. Serverless technologies can be thought of as a “function as a service,” or something that executes purpose-built code. Of course this code runs on a server somewhere, but since the backend servers are completely abstracted away, there is virtually no overhead associated with server maintenance. Serverless technologies also have a significant cost advantage: you only pay for the time a serverless function actually runs, not for an always-on server, which would be the case if the code were hosted on dedicated infrastructure.

As an example, think of an ELT process. You could provision a server, install any necessary software to perform ELT, maintain the server and ELT software forever, and then be charged for every second that the server is running—even when it’s not being used to perform an ELT process. Contrast that with a serverless paradigm where purpose-built ELT code runs on-demand, completely independent of a server and underlying operating system, and you’re only charged for the time while the process is running.
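The serverless side of that contrast can be sketched as a single on-demand handler. The `(event, context)` signature follows the common function-as-a-service pattern, but the event shape and the transform itself are placeholders, not any specific cloud provider's contract.

```python
def handler(event, context=None):
    """Invoked only when triggered; you are billed only for this execution."""
    rows = event["rows"]  # extract: the trigger delivers the payload, no polling server
    loaded = [
        {"id": r["id"], "amount_usd": round(r["amount_cents"] / 100, 2)}  # transform
        for r in rows
    ]
    # load: in a real function this would write to the warehouse or lake
    return {"loaded": len(loaded), "rows": loaded}

result = handler({"rows": [{"id": 1, "amount_cents": 1999}]})
```

Nothing runs and nothing accrues cost between invocations, which is exactly the difference from the always-on server in the example above.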

Don’t Forget About the Basics

Trends are important to ensure you stay innovative, but before you even consider some of the trends outlined in the blog, we always recommend you have the right foundation in place to set you up for success. Examine your data strategy to ensure it’s up to date, assess the capabilities of your data stack, and plan out what it will take to get to your desired future state. These foundational activities are key to transforming your business with data and analytics.

Sharon Rehana is the content manager at Analytics8 with experience in creating content across multiple industries. She found a home in data and analytics because that’s where storytelling always begins.