Prerequisites
The following topic describes the prerequisites for using Snowflake as a Qlik Replicate target endpoint. It is divided into two parts:
- General prerequisites - Describes the prerequisites that apply regardless of where the data files will be staged (Amazon S3, Google Cloud Storage, or Azure Blob Storage)
- Staging prerequisites - Describes the prerequisites for each of the supported staging types
General prerequisites
Client prerequisites
Qlik Replicate for Windows:
Download and install Snowflake ODBC driver 2.25.3 (64-bit) or later for Windows on the Qlik Replicate Server machine.
Qlik Replicate for Linux:
Download and install Snowflake ODBC driver 2.25.3 (64-bit) or later for Linux on the Qlik Replicate Server machine.
The expected name for the ODBC driver is SnowflakeDSIIDriver (the default).
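To confirm that the driver is installed under its expected name and that a basic connection succeeds, you can run a quick check along the lines of the sketch below. It assumes the pyodbc package is installed; the account host, database, warehouse, and credentials are placeholders to replace with your own.

```python
# Minimal sketch: verify the Snowflake ODBC driver is registered and usable.
# Assumes pyodbc is installed; all connection values below are placeholders.
import pyodbc

# The driver should be registered under its expected (default) name.
assert "SnowflakeDSIIDriver" in pyodbc.drivers(), "Snowflake ODBC driver not found"

conn = pyodbc.connect(
    "Driver={SnowflakeDSIIDriver};"
    "Server=myaccount.snowflakecomputing.com;"  # placeholder account host
    "Database=MYDB;Warehouse=MYWH;"             # placeholder database/warehouse
    "UID=myuser;PWD=mypassword;"                # placeholder credentials
)
print(conn.cursor().execute("SELECT CURRENT_VERSION()").fetchone())
conn.close()
```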
Performance and Cloud Services usage optimization
To optimize both Snowflake Cloud Services usage and overall performance, it is strongly recommended to enable the Apply batched changes to multiple tables concurrently option in the task settings' Change Processing Tuning tab.
Firewall prerequisites
Port 443 must be open for outbound communication.
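A quick way to confirm that outbound traffic is allowed is to open a TCP connection to your Snowflake account host on port 443, as in this sketch (the host name is a placeholder):

```python
# Minimal sketch: confirm outbound TCP connectivity on port 443.
# The host below is a placeholder; use your own Snowflake account host.
import socket

with socket.create_connection(("myaccount.snowflakecomputing.com", 443), timeout=10):
    print("Outbound port 443 is open")
```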
Staging prerequisites
Google Cloud Storage prerequisites
You need to specify a storage integration name in the Snowflake target endpoint settings. Integrations avoid the need for passing explicit cloud provider credentials such as secret keys or access tokens; instead, integration objects reference a Cloud Storage service account.
For more information on creating a storage integration name, see Configuring an Integration for Google Cloud Storage in the Snowflake documentation.
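As a rough illustration of what the linked procedure produces, the sketch below issues the storage integration DDL through an ODBC connection. The DSN, integration name, and bucket location are placeholders; refer to the Snowflake documentation for the authoritative steps (including granting the integration's service account access to your bucket).

```python
# Hedged sketch: create a Cloud Storage integration by running Snowflake DDL
# over ODBC. The DSN, integration name, and bucket path are placeholders.
import pyodbc

conn = pyodbc.connect("DSN=snowflake_dsn")  # assumes a configured ODBC DSN
conn.cursor().execute("""
    CREATE STORAGE INTEGRATION my_gcs_integration
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'GCS'
      ENABLED = TRUE
      STORAGE_ALLOWED_LOCATIONS = ('gcs://my-staging-bucket/');
""")
conn.close()
```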
Amazon S3 prerequisites
Amazon S3 account prerequisites
Sign up for an Amazon Web Services (AWS) account. Then use the AWS Management Console to purchase Snowflake On Demand - Standard or Snowflake On Demand - Premier, launch a Snowflake cluster, and download the required client tools.
Make a note of the basic details of your AWS account and your Snowflake cluster, such as your user name and password. You will need this information to configure Qlik Replicate to work with the Snowflake data warehouse.
Amazon S3 bucket prerequisites
You can configure the Snowflake endpoint to stage the data files on Snowflake (internally) or on Amazon S3. If you want to use Amazon S3 staging, you need an Amazon S3 bucket, preferably located in the same region as your Snowflake cluster (for best performance).
You must be able to access your Amazon S3 bucket directly from the Replicate machine.
For information on signing up for Amazon S3, visit:
https://aws.amazon.com/console/
- Bucket access credentials: Make a note of the bucket name, region, access key, and secret access key. You will need to provide them in the Qlik Replicate Snowflake target settings.
- Bucket access permissions: Qlik Replicate requires read/write/delete permissions on the Amazon S3 bucket (a verification sketch follows this list).
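One way to verify those permissions up front is a small write/read/delete round trip with the AWS SDK, as sketched below. It assumes the boto3 package is installed; the region, keys, and bucket name are placeholders.

```python
# Minimal sketch: verify read/write/delete access to the staging bucket.
# Assumes boto3 is installed; all values below are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    region_name="us-east-1",        # placeholder region
    aws_access_key_id="AKIA...",    # placeholder access key
    aws_secret_access_key="...",    # placeholder secret access key
)
bucket, key = "my-staging-bucket", "replicate-permission-check.txt"

s3.put_object(Bucket=bucket, Key=key, Body=b"ok")  # write
s3.get_object(Bucket=bucket, Key=key)              # read
s3.delete_object(Bucket=bucket, Key=key)           # delete
print("read/write/delete OK")
```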
Time and time zone settings on the Replicate machine
Make sure the time and time zone settings on the Replicate machine are accurate (a quick way to check this is shown after the list below). This is required to ensure:
- Proper synchronization of Full Load and CDC tasks.
- Correlation of the transaction log time with the actual time.
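For a quick sanity check, you can compare the local clock against an NTP server. This sketch assumes the third-party ntplib package is installed and uses pool.ntp.org purely as an example server:

```python
# Hedged sketch: report the offset between the local clock and an NTP server.
# Assumes the third-party ntplib package; pool.ntp.org is an example server.
import ntplib

response = ntplib.NTPClient().request("pool.ntp.org", version=3)
print(f"Local clock offset: {response.offset:.3f} seconds")
```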
Azure Blob Storage staging prerequisites
You can configure the Snowflake endpoint to stage the data files either on Snowflake (supported on Windows and Linux) or on Azure Blob Storage (supported on Windows only). To use Azure Blob Storage, sign up for a Microsoft Azure Blob Storage account and make a note of the account name, access key, container name, SAS token (Shared Access Signature), and target folder.
Note that the SAS token must remain valid for the entire duration of the Replicate task. For an explanation of how to configure the SAS token, visit:
https://docs.snowflake.com/en/user-guide/data-load-azure-config.html
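For illustration, a container-level SAS token carrying the permissions listed under Azure Blob Storage permissions below can be generated with the Azure SDK, as in this sketch. It assumes the azure-storage-blob package is installed; the account name, account key, and container name are placeholders, and the 30-day expiry is only an example value chosen to outlast the task.

```python
# Hedged sketch: generate a container-level SAS token whose validity covers
# the Replicate task duration. Assumes the azure-storage-blob package;
# the account name, account key, and container name are placeholders.
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

sas_token = generate_container_sas(
    account_name="mystorageaccount",        # placeholder storage account
    container_name="my-staging-container",  # placeholder container
    account_key="...",                      # placeholder account key
    permission=ContainerSasPermissions(read=True, write=True,
                                       delete=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=30),  # example expiry
)
print(sas_token)
```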
For information on signing up for a Microsoft Azure Blob Storage account, see the vendor's online help.
Azure Blob Storage permissions
Qlik Replicate performs the following operations on the Azure Blob Storage container/folder:
- On the Azure Blob Storage container: LIST and CREATE
- On the Azure Blob Storage folder: READ, WRITE, and DELETE
Supported blob storage types
The following blob storage types are supported:
- Standard storage with block blobs
- Premium storage with block blobs only