Early Access: The content on this website is provided for informational purposes only in connection with pre-General Availability Qlik Products.
All content is subject to change and is provided without warranty.

Prerequisites

Before defining your Data Lake project, make sure the following prerequisites have been met.

Required clients

Depending on the Compute platform you select when you set up your project, you will need to install one of the following drivers.

Warning note
  • Because Cloudera Data Platform, Google Dataproc, and Azure HDInsight use the same driver file name, only one project with any of these compute platforms can be created per Compose installation (to prevent driver conflicts).
  • Because Cloudera Data Platform and Amazon EMR use the same driver file name, only one project with any of these compute platforms can be created per Compose installation (to prevent driver conflicts).

Cloudera Data Platform, Google Dataproc, and Azure HDInsight

  1. Download the Hive JDBC Driver from the Cloudera website:

    https://www.cloudera.com/downloads/

    Then, extract the HiveJDBC41.jar file from the zip file that contains the Hive JDBC Connector.

    Information note

    You need to register on the Simba and Cloudera websites before you can download the Hortonworks or Hive JDBC Driver.

  2. Copy the HiveJDBC41.jar file to <compose_product_dir>\java\jdbc.
  3. Restart the Qlik Compose service.
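
The steps above can be sketched as the following PowerShell commands. This is a minimal sketch: the downloaded zip file name, the extraction folder, the Compose installation path, and the service name are assumptions — adjust them to match your environment.

```powershell
# Sketch of steps 1-3. The zip file name, paths, and service name below are
# examples -- verify them against your own download and installation.
$zip  = "C:\Temp\HiveJDBC41_cloudera.zip"          # downloaded Hive JDBC Connector
$dest = "C:\Program Files\Qlik\Compose\java\jdbc"  # <compose_product_dir>\java\jdbc

# Extract the archive and copy only the HiveJDBC41.jar file
Expand-Archive -Path $zip -DestinationPath "C:\Temp\HiveJDBC41"
Copy-Item "C:\Temp\HiveJDBC41\HiveJDBC41.jar" -Destination $dest

# Restart the service so the new driver is picked up
Restart-Service -Name "Qlik Compose"  # check services.msc for the exact service name
```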

Databricks

  1. Download the SimbaSparkJDBC42-<version>.zip or DatabricksJDBC42-<version>.zip file from the Databricks website.

    Warning note

    Unity Catalog support requires DatabricksJDBC42-2.6.32.1054.zip or later.
  2. Copy the SparkJDBC42.jar file or the DatabricksJDBC42.jar file (according to the file you downloaded) to <compose_product_dir>\java\jdbc.
  3. Restart the Qlik Compose service.

Amazon EMR

  1. Download the Amazon Hive JDBC Driver (HiveJDBC41.jar) from the Amazon website.
  2. Copy the HiveJDBC41.jar file to <compose_product_dir>\java\jdbc.
  3. Restart the Qlik Compose service.

Required databases and privileges

Compose Data Lake projects require four separate databases. You can create the required databases manually (which also lets you override the default storage location for the files), or let Compose create them for you. If you want Compose to create the databases, you need to grant the user defined in the Storage Zone settings the CREATE DATABASE privilege for the following databases:

  • Storage Zone database - The database specified in the Storage Zone settings. This database can have any name.
  • Landing Zone database - The database specified in the Landing Zone settings. This database can have any name.
  • Exposed views database - This database must have the same name as the Storage Zone database, appended with the suffix defined on the project settings’ Naming tab (_v by default).
  • Internal views database - Both the ODS Live Views and the HDS Live Views reference this database for updates. This database must have the same name as the Storage Zone database, appended with the suffix defined on the project settings’ Naming tab (_v_internal by default).
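
To illustrate the naming rules above, the following sketch creates the four databases manually. The database names are examples only, and the default suffixes from the Naming tab are assumed; exact SQL syntax depends on your compute platform.

```sql
-- Minimal sketch, assuming a Storage Zone database named "dwh", a Landing
-- Zone database named "landing", and the default Naming tab suffixes.
CREATE DATABASE dwh;             -- Storage Zone database (any name)
CREATE DATABASE landing;         -- Landing Zone database (any name)
CREATE DATABASE dwh_v;           -- Exposed views: Storage Zone name + "_v"
CREATE DATABASE dwh_v_internal;  -- Internal views: Storage Zone name + "_v_internal"
```

Creating the databases manually also lets you override the default storage location, for example with a LOCATION clause where your platform supports one.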

For more information about Compose views, see Working with views.

Required table and view privileges

The user specified in the Storage Connection Settings must be granted the following privileges on the required databases:

  • SELECT
  • CREATE
  • MODIFY
  • READ_METADATA
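
As a sketch of granting these privileges, the following uses Databricks-style SQL (legacy table access control, where SELECT, CREATE, MODIFY, and READ_METADATA exist as grantable privileges). The user name and database name are examples, and the exact GRANT syntax varies by compute platform.

```sql
-- Example only: "compose_user" and "dwh" are placeholders. Repeat the
-- statement for each of the required databases.
GRANT SELECT, CREATE, MODIFY, READ_METADATA ON DATABASE dwh TO `compose_user`;
```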
