There is a lot to think about when moving from Qlik to Looker. Knowing what is motivating the migration, and why, will help you shape the new environment to fit your business needs.

There are many reasons to consider migrating to a data platform such as Looker, including wanting to do more with analytics than simply creating dashboards or building basic business intelligence (BI) applications. Many organizations are looking to migrate from Qlik to Looker, and although the move might be a simple concept on paper, in practice it often comes with complexities, decisions, and a lot of planning.

In this blog, we’ll discuss key considerations before your migration, explain how your current Qlik environment maps to Looker, and share best practices for a successful migration.

Key Things to Consider When Migrating from Qlik to Looker

Moving from one BI tool to another requires a wide lens into your existing environment before you can determine what, if anything, you want to do differently in your new environment.

One of the first decisions you need to make is what kind of migration makes sense for your organization. While no two migrations are the same, the most common approaches can be bucketed into three types: lift and shift, reconciliation, and hybrid.

  • Lift and shift: Conceptually, this is the simplest migration type. Your current apps and dashboards meet your needs, and you are simply looking to replicate them in Looker. While the “how” might change between Qlik and Looker (something that still needs to be thought through), the output should be the same: dashboard for dashboard. Don’t be fooled, though; even when the dashboards look the same in both tools, end users will still have a plethora of new tools and business process possibilities at their fingertips with Looker.
  • Reconciliation: This is the most comprehensive migration model. It is for when your BI environment no longer serves your needs, you have outgrown it, or you want to do things differently. Your current environment can serve as an outline, but you should take a hard look at every dashboard, report, and analysis across your environment and consider how to rebuild it bigger, better, and stronger. This type of migration is often the longest and most difficult because it rebuilds from the ground up, but sometimes that’s exactly what’s needed.
  • Hybrid: This is the most common migration type. Some of your apps and dashboards may be exactly what you need, while others may no longer be a good fit, or there may be a lot of duplicate or similar analytics. The hybrid approach still requires you to review what is in your environment, but it gives you flexibility and quick time-to-value: end users can take advantage of what is migrated as-is while you think through and rebuild what is not.

Understanding Your Environment and How It Maps to Looker

Once you have decided which type of migration best fits your needs, it is time to understand how Qlik maps to Looker. This means gaining an understanding of each part of your BI environment and planning which parts you need to migrate into Looker, making improvements along the way.

Now, fair warning: you probably take your current environment for granted. You might not know what is in the nooks and crannies of every application, script, or set analysis expression, and that’s OK. Now is the time to look at your environment as a whole and get a handle on data sources, transformations, visualizations, and usage. An honest, thorough assessment of your environment will pay dividends when it’s time to migrate and will help you better understand your end users.

Here are some things to keep in mind as you review and assess your existing Qlik environment:

  • Consider not just what you have, but its importance. Determine which components are critical. Spend extra time documenting critical areas, and make sure they receive priority during the migration.
  • Look for problematic ad-hoc fixes, obsolete code, timing workarounds, and so on. Qlik often requires you to build dependencies in your environment, so make sure you are thinking through all of them.
  • Find redundant data and code, and identify errors. Take note of any areas that are obsolete or no longer needed. This often prompts discussion and decisions that have been put off because these areas were never given any priority. Use this time to make them a priority.
  • Consider your end users. What types of users work in your environment? What type of access do they need? Involve users directly, and work on getting their input and buy-in to ensure that the new environment is everything they need and more, not just what they have always been given.
  • Prioritize your use cases. No matter which migration path you are on, it is important to understand which use cases should be migrated, and when.
  • Make sure all stakeholders are using the same definitions for terms and KPIs. Ask them what would enable them to do their job more effectively and efficiently. We often see definitions and functionality slowly drift from application to application for one reason or another. Use this time to rein everything back in and ensure stakeholders and end users are all on the same page.
  • Work with the administrators and developers to figure out where time is being spent unnecessarily. How can that be mitigated?

While the points above are not specific to Qlik or Looker, they are extremely important to think through. The next few points hit a little closer to your Qlik environment and how it will map to your new Looker environment.

Data Sources and Data Extraction

Where is your data coming from? How are Qlik applications using, storing, and manipulating the data from source to presentation? Looker does not connect to flat files (by design); however, Qlik scripts often rely heavily on flat files for mapping tables, quick transformations, and a whole host of other things. Knowing exactly what is being transformed, and how, will allow you to either push this logic back into your source systems or identify what needs to be re-created in LookML.

Qlik scripts are often built upon iteratively and used throughout your environment. Creating a map of how data flows from source to presentation is key to a successful migration. Look through all your Qlik scripts to identify how data is being pulled in, where it is being stored, and which applications use which QVDs. Often, applications share QVDs and further iterate on the data in each application or script. As with the previous points, understanding which applications use which QVDs (and how) will be key to identifying how to migrate to Looker.

From here, a plan can be created for how to move transformations and business logic. Should each piece be pushed back into your source data, or built in LookML?
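For example, a small mapping table that a Qlik script loaded from a flat file (say, region codes to region names) could be re-created as a simple dimension in LookML, or better yet, pushed upstream into the warehouse. Here is a minimal sketch, using hypothetical table and field names:

```lookml
# Hypothetical view replacing a Qlik mapping-table load from a flat file.
# The region_code-to-region_name mapping now lives in LookML (or, better,
# upstream in the warehouse) instead of an Excel/CSV file.
view: orders {
  sql_table_name: analytics.orders ;;

  dimension: region_code {
    type: string
    sql: ${TABLE}.region_code ;;
  }

  dimension: region_name {
    type: string
    sql: CASE ${region_code}
           WHEN 'NA' THEN 'North America'
           WHEN 'EU' THEN 'Europe'
           ELSE 'Other'
         END ;;
  }
}
```

Pushing the mapping into the source data keeps your LookML thin; a CASE dimension like this is a quick, transparent stopgap while the warehouse work catches up.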

Business Logic

Qlik is a very powerful tool, and lets developers create a lot of business logic without the need for a centralized data repository. This can also lead to a lot of siloed data, business logic, and variations of the data your organization needs to run effectively.

Look in all the places where Qlik allows you to embed business logic: load scripts, the Data Manager, master items (dimensions, measures, and visualizations), alternate states, and everyone’s favorite, set analysis. Document each instance of business logic. Is it the same across applications? Where and why does it vary? Do these definitions still apply, and are they agreed upon by the business?

When migrating to Looker, LookML can be used to implement equivalent business logic; with Looker, however, you only need to define this logic once and can then use it throughout your entire environment. If a change ever needs to be made, it can be done in one place and will automatically propagate through every inch of your Looker environment. That said, best practice for any BI tool is for business logic to live upstream of the BI platform in a centralized place. Can any of the transformations be pushed upstream?

On the topic of set analysis, a combination of Looker’s integrated filters, custom measures and dimensions, and LookML will let you re-create the analysis that set analysis enables, but in a more developer- and end-user-friendly manner. No more copying and pasting lengthy set analysis expressions.
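For instance, a Qlik expression along the lines of Sum({&lt;status={'closed'}&gt;} amount) can become a filtered measure that is defined once and reused in every Explore, Look, and dashboard. A minimal sketch, assuming hypothetical status and amount fields already defined in the view:

```lookml
# Defined once in the view, reusable everywhere in the Looker environment.
# Roughly equivalent to the Qlik set analysis expression:
#   Sum({<status={'closed'}>} amount)
measure: closed_amount {
  type: sum
  sql: ${amount} ;;
  filters: [status: "closed"]
}
```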

Applications

Don’t just look at your applications, but also take stock of the dashboards, visualizations, drills, navigations (including the drill-down hierarchies), user-created content, and custom objects and extensions. Which are critical? Are there redundancies? Can any of them be combined and improved?

Work with stakeholders and end users to verify that the applications are in fact useful and are not missing anything (or carrying any fluff). While Qlik apps typically consist of several sheets, Looker dashboards operate a little differently. The sheets in a Qlik app will most likely correspond to multiple Looker dashboards or be converted to Looks (essentially saved reports). To mimic Qlik’s navigation (if desired), you can use custom links and drills in Looker to connect the dashboards that replace those sheets, as sketched below.
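As one illustration, a LookML link parameter on a dimension can send users from a summary dashboard to a detail dashboard, much like sheet-to-sheet navigation in Qlik. This is a sketch only; the dashboard path and field names below are hypothetical:

```lookml
# Hypothetical example: clicking a customer name offers a link to a
# "Customer Detail" dashboard, mimicking Qlik's sheet-to-sheet navigation.
dimension: customer_name {
  type: string
  sql: ${TABLE}.customer_name ;;
  link: {
    label: "Customer Detail Dashboard"
    url: "/dashboards/customer_detail?Customer={{ value | url_encode }}"
  }
}
```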

Looker also has plug-ins and extensions like Qlik’s, which you can find in the Looker Marketplace. You may be able to implement the functionality of custom objects natively in Looker without needing a third-party plug-in.

The main takeaway here is to focus on what analysis is being done and how it needs to be presented. Looker has a plethora of options for saving, exporting, and sending one-off reports without having to place them on a dashboard for consumption. Use this as an exercise to reduce the clutter, making what is displayed all the more meaningful.

Tasks

Reload tasks, actions, triggers, user syncs, external programs, and everything else handled in the QMC have a purpose and need to be thought through when migrating. Because Looker queries your databases directly, there is no need to reload the data every time you need fresh data. This helps not only with data freshness, but also eliminates dependencies on reloads and scripts. No more cascading reloads manually and ensuring QVDs are created in the proper order to avoid reload errors, or worse, not knowing a dependency didn’t reload and that some or all of your data is stale. Looker uses datagroups to manage caching, derived tables, and anything else that needs to refresh in a particular order, and it does so automatically. Define once and use throughout your environment.
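A minimal sketch of a datagroup, assuming a hypothetical etl_log table that records when the nightly load finishes:

```lookml
# Hypothetical datagroup: the cache is invalidated when the nightly ETL
# finishes (detected by a change in the max load timestamp), with a
# 24-hour maximum cache age as a safety net.
datagroup: nightly_etl {
  sql_trigger: SELECT MAX(loaded_at) FROM analytics.etl_log ;;
  max_cache_age: "24 hours"
}

# Applied once at the model level; every Explore and persistent derived
# table that uses this datagroup refreshes together, with no cascading
# reload chains to babysit.
persist_with: nightly_etl
```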

Aside from what is being done programmatically in Qlik, are there actions that are not handled programmatically but should be, or that you would like to be? Beyond data refreshes, Looker’s API can be used to kick off business processes (even outside of BI), user provisioning, and anything in between.

Security

Aside from set analysis, section access and security rules are usually the runners-up for most-discussed topics in Qlik. While each of these can be extremely granular and difficult to implement or maintain in Qlik, think big picture about their purpose and function. Do they still apply? What type of security do you need to migrate or implement in Looker? What is the goal or intent of your section access and security rules? Rows, columns, apps, reports, folders, and every other aspect of your environment can be secured in Looker, but in a much more developer- and end-user-friendly way.

Using a combination of your database’s existing security model and Looker’s user attributes, groups, persisted filters, and LookML, each aspect of your security requirements can be implemented in Looker. Column- and row-level security can be managed in one place, and the same goes for folder security and access. In Looker, admins can even impersonate any user to ensure that they see exactly what they are allowed to see. No need to run Qlik’s audit feature and make sense of the lengthy results.
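For example, the row-level portion of a Qlik section access table often maps to a LookML access_filter tied to a user attribute. A minimal sketch, assuming a hypothetical region field and a matching user attribute:

```lookml
# Hypothetical row-level security: each user only sees rows for the
# region stored in their "region" user attribute (the rough equivalent
# of Qlik section access restricting on a region field).
explore: orders {
  access_filter: {
    field: orders.region
    user_attribute: region
  }
}
```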

Best Practices for Migrating to Looker

There is a lot of ground to cover when preparing for a migration of any kind, and if it hasn’t been made clear up to this point, good preparation is the key to a successful migration. To prepare, you need to clarify your vision, survey and document your BI environment, make a plan, and finally carry out that plan.

  • Clarify Your Vision: Understand where you are now and where you want to be. What does your ideal state look like? Work with end users to see what business needs might have changed since you created your Qlik environment. Your new environment should be a better fit than your old environment. If you knew then what you know now, how would your BI environment be set up differently?
  • Establish a clear definition of success: Don’t let your migration drag on forever. Be clear upfront about your endpoint so you know when your migration is a success. Qlik doesn’t need to be shut off completely before declaring success, either; if end users can do their jobs more effectively and efficiently, that is a huge win. That said, paying for two BI platforms isn’t ideal, so knowing when you can completely cut over is key to completing the migration.
  • Survey, Document, Analyze, and Design: Create a comprehensive map of your Qlik environment to gain a complete picture of what it is and how it is used. Then, design your migrated environment.
  • Create a phased migration plan: Once you have your vision, your map, and your design, set priorities. What is the best path forward? Which areas are critical? What still needs to be thought through? Who are the stakeholders and business process owners? Don’t try to tackle everything at once; split your migration into systematic phases to allow for validation, enablement, and adjustments. If the migration will change business logic or processes, think about how to validate the new environment. How will the business account for before-and-after differences when the business logic changes? Take time to build in efficiencies to make the new environment more bulletproof, and plan to develop templates and documentation to ensure consistency.

    Plan an agile rollout process with multiple sprints. Have a plan; stick to it where possible and adapt it where needed.

  • Involve your end users: End user involvement is the key to success. Keep them in the loop on everything being planned. Too many migrations have failed because end users weren’t involved in the planning process. End users are just that: the end users. If they aren’t getting what they want and need, but instead what the developers think they need, the migration is doomed from the start. End users know the business inside and out, or, if they are external, they know what they need in order to make effective decisions.
  • Plan for training and mentoring: Create a comprehensive training and user adoption plan to be implemented before rollout. Help users understand how to use all of Looker’s self-service features. Plan for knowledge transfer and mentoring for your developers so that they understand Looker best practices and how to use the platform to its full potential. What good is a brand new platform if nobody knows or understands how to use it? Use this as an opportunity to change business processes as well. If end users can answer questions for themselves, they don’t have to inundate developers with BI requests. And if enough ad-hoc analysis is built by end users, plan to incorporate it into the production environment as well.

There is a lot to think through after the decision to migrate has been made. The how, when, and where of a migration can seem like a very daunting task, and truthfully, it is. The points above merely scratch the surface in some cases, but ultimately, it is important to keep the why in mind when planning a migration. Empowering end users and business users with access to the data and answers they need will pay immense dividends in the end. Take the time, do it right, and future you will thank you.

Check out our blog, What to Consider When Migrating to Looker, for more tips about preparing for your migration.

Josh Goldner: Josh is Analytics8’s Google Practice Director and a certified LookML Developer. Josh implements modern analytics solutions to help his clients get more value from their data. He is an avid outdoorsman and balances his professional work with hunting and teaching his coworkers how to fish.
Eric Morrell: Eric is a consultant based out of our Chicago office. He specializes in analytics projects using a wide variety of tools to help companies get value out of their data. Outside of work he enjoys golfing and using machine learning to help win his fantasy sports leagues.