
Creating an OpenAI connection

To communicate with the OpenAI platform, create a connection to the OpenAI analytics source. You can create the connection from the Create page of the Analytics activity center, from Data load editor, or from Script.

Data received from these connections can be used in the load script and in chart expressions to enhance your Qlik Sense analytics apps.

Configurations and configurable settings

Set up your OpenAI analytics connection with one of the following configurations. The other configurable settings can vary depending on which configuration is used.

OpenAI Completions API (GPT-3) - Rows

This configuration sends each row of data as a question to the completions API (in small batches to improve performance). Each response is stored as text in a table with the same number of rows as the input.

This configuration can be used in both the load script and chart expressions.

For OpenAI's documentation about the API used by this configuration, see Completions.
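
For illustration, the load script sketch below assumes a connection created with this configuration and named OpenAI Rows (a placeholder name), and an input table with a unique RowID used as the Association Field. The table and field names are illustrative; see Select and load data from an OpenAI connection for the exact syntax.

// Source data: one question per row, with a unique RowID for association
Questions:
LOAD * INLINE [
RowID, Question
1, What is the capital of France?
2, What is the capital of Japan?
];

// Send each row to the connection and return the responses as a table.
// 'OpenAI Rows' is a placeholder - use the name of your connection.
[OpenAI Responses]:
LOAD * EXTENSION endpoints.ScriptEval('{"RequestType":"endpoint", "endpoint":{"connectionname":"OpenAI Rows"}}', Questions{RowID, Question});

Because RowID is returned as a field in the response, the responses table associates with the Questions table in the data model.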

Configurable settings for 'OpenAI Completions API (GPT-3) - Rows' configuration
Field Description
Select Configuration Under Configuration, select OpenAI Completions (GPT-3) - Rows.
Authorization Enter the OpenAI API Key. For more information, see Generating an OpenAI API key.
OpenAI Request
  • OpenAI Model: The OpenAI model to use for the requests.

  • Temperature: The sampling temperature to use. Higher values make the output more random, lower values make it more focused and deterministic.

  • Max Tokens: The maximum number of tokens to generate, which controls the size of the response. You will almost always need to change the default value for the connection to work properly for your needs.

  • Top P: Adjusts nucleus sampling. This can be altered as an alternative to Temperature sampling.

  • Frequency Penalty: The degree to which the model penalizes new tokens based on how often they already appear in the text so far, reducing verbatim repetition.

  • Presence Penalty: The degree to which the model penalizes new tokens that already appear in the text so far, encouraging the model to introduce new topics.

  • User: An end-user ID, which can help OpenAI monitor for policy violations.

Association

Specify an Association Field, a field from the input data table containing a unique identifier.

This field must be included in the source data when making an endpoint request, so that the results table that is returned can be associated with the source table using a key. The designated field is returned as a field in the response, enabling the response to be associated with the source data in the data model. It can be any field with a unique ID, either from the source data or created as part of the table load process.

Name The name of the connection. The default name is used if you do not enter a name.

OpenAI Completions API (GPT-3) - JSON Tables

This configuration sends a request for each row, where the response is expected to be a JSON list of data. The connector converts the JSON into a table of data in the Qlik data model. This configuration should be treated as experimental due to the unpredictable nature of OpenAI responses.

This configuration can be used in the load script to synthesize data. It is not intended for use in chart expressions. Depending on how you ask your question, it might not always return valid JSON.

Information note: GPT models predict what JSON looks like, and the result is not always valid, for example, a number field that contains text without quotes. In some cases, it is enough to add “as a JSON list” to the request in the load script. In other scenarios, you must be much more specific to obtain the desired result.
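
As a rough sketch of how this configuration might be used to synthesize data in the load script, the example below assumes a connection named OpenAI JSON Tables (a placeholder name); the table and field names are illustrative. The question explicitly asks for a JSON list so that the connector can convert the response into a table.

// One request row that explicitly asks for a JSON list
Request:
LOAD * INLINE [
RowID, Question
1, List the five largest countries by area as a JSON list with the fields name and area_km2
];

// 'OpenAI JSON Tables' is a placeholder - use the name of your connection.
Countries:
LOAD * EXTENSION endpoints.ScriptEval('{"RequestType":"endpoint", "endpoint":{"connectionname":"OpenAI JSON Tables"}}', Request{RowID, Question});
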
Configurable settings for 'OpenAI Completions API (GPT-3) - JSON Tables' configuration
Field Description
Select Configuration Under Configuration, select OpenAI Completions (GPT-3) - JSON Tables.
Authorization Enter the OpenAI API Key. For more information, see Generating an OpenAI API key.
OpenAI Request
  • OpenAI Model: The OpenAI model to use for the requests.

  • Temperature: The sampling temperature to use. Higher values make the output more random, lower values make it more focused and deterministic.

  • Max Tokens: The maximum number of tokens to generate, which controls the size of the response. You will almost always need to change the default value for the connection to work properly for your needs.

  • Top P: Adjusts nucleus sampling. This can be altered as an alternative to Temperature sampling.

  • Frequency Penalty: The degree to which the model penalizes new tokens based on how often they already appear in the text so far, reducing verbatim repetition.

  • Presence Penalty: The degree to which the model penalizes new tokens that already appear in the text so far, encouraging the model to introduce new topics.

  • User: An end-user ID, which can help OpenAI monitor for policy violations.

Association

Specify an Association Field, a field from the input data table containing a unique identifier.

This field must be included in the source data when making an endpoint request, so that the results table that is returned can be associated with the source table using a key. The designated field is returned as a field in the response, enabling the response to be associated with the source data in the data model. It can be any field with a unique ID, either from the source data or created as part of the table load process.

Name The name of the connection. The default name is used if you do not enter a name.

OpenAI Chat Completions API (GPT-3.5, GPT-4) - Rows

This configuration works in a similar fashion to the OpenAI Completions API (GPT-3) - Rows configuration, but it sends each row of data as a request to the OpenAI Chat Completions API. In this case, the requests are made with the “user” message role, and each row of data from Qlik is sent as a separate request.

Information note: Sending multiple rows as a chat context is not supported. You need to include all the questions in a single request row.

This configuration can be used in both the load script and chart expressions.

For OpenAI's documentation about the API used by this configuration, see Chat.
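
Because each row is sent as a separate request with the “user” message role, any context the model needs must be packed into that single row. Below is a minimal load script sketch, assuming a placeholder connection named OpenAI Chat Rows and illustrative table and field names.

// Build one self-contained prompt per row; chat history across rows is not supported.
Prompts:
LOAD
    RowID,
    'You are answering for the region ' & Region & '. ' & Question as Question
INLINE [
RowID, Region, Question
1, Nordics, Summarize typical sales seasonality in one sentence.
2, Iberia, Summarize typical sales seasonality in one sentence.
];

// 'OpenAI Chat Rows' is a placeholder - use the name of your connection.
[Chat Responses]:
LOAD * EXTENSION endpoints.ScriptEval('{"RequestType":"endpoint", "endpoint":{"connectionname":"OpenAI Chat Rows"}}', Prompts{RowID, Question});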

Configurable settings for 'OpenAI Chat Completions API (GPT-3.5, GPT-4) - Rows' configuration
Field Description
Select Configuration Under Configuration, select OpenAI Chat Completions API (GPT-3.5, GPT-4) - Rows.
Authorization Enter the OpenAI API Key. For more information, see Generating an OpenAI API key.
OpenAI Request
  • OpenAI Model: The OpenAI model to use for the requests.

  • Temperature: The sampling temperature to use. Higher values make the output more random, lower values make it more focused and deterministic.

  • Max Tokens: The maximum number of tokens to generate, which controls the size of the response. You will almost always need to change the default value for the connection to work properly for your needs.

  • Top P: Adjusts nucleus sampling. This can be altered as an alternative to Temperature sampling.

  • Frequency Penalty: The degree to which the model penalizes new tokens based on how often they already appear in the text so far, reducing verbatim repetition.

  • Presence Penalty: The degree to which the model penalizes new tokens that already appear in the text so far, encouraging the model to introduce new topics.

  • User: An end-user ID, which can help OpenAI monitor for policy violations.

Association

Specify an Association Field, a field from the input data table containing a unique identifier.

This field must be included in the source data when making an endpoint request, so that the results table that is returned can be associated with the source table using a key. The designated field is returned as a field in the response, enabling the response to be associated with the source data in the data model. It can be any field with a unique ID, either from the source data or created as part of the table load process.

Name The name of the connection. The default name is used if you do not enter a name.

OpenAI Embeddings

This configuration sends rows of input text to the OpenAI Embeddings API. OpenAI returns a vector representation of each input, in a form that can be consumed by machine learning models.

For OpenAI's documentation about the API used by this configuration, see Embeddings.
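
For illustration, a minimal load script sketch is shown below, assuming a placeholder connection named OpenAI Embeddings and an input table of text rows with a unique RowID used as the Association Field. The table and field names are illustrative.

// Text rows to embed, with a unique RowID for association
Documents:
LOAD * INLINE [
RowID, Text
1, The quick brown fox jumps over the lazy dog
2, Quarterly revenue grew steadily across all regions
];

// Each row of Text is returned as a vector representation.
// 'OpenAI Embeddings' is a placeholder - use the name of your connection.
Embeddings:
LOAD * EXTENSION endpoints.ScriptEval('{"RequestType":"endpoint", "endpoint":{"connectionname":"OpenAI Embeddings"}}', Documents{RowID, Text});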

Configurable settings for 'OpenAI Embeddings' configuration
Field Description
Select Configuration Under Configuration, select OpenAI Embeddings.
Authorization Enter the OpenAI API Key. For more information, see Generating an OpenAI API key.
OpenAI Request
  • OpenAI Model: The OpenAI model to use for the requests.

  • User: An end-user ID, which can help OpenAI monitor for policy violations.

Association

Specify an Association Field, a field from the input data table containing a unique identifier.

This field must be included in the source data when making an endpoint request, so that the results table that is returned can be associated with the source table using a key. The designated field is returned as a field in the response, enabling the response to be associated with the source data in the data model. It can be any field with a unique ID, either from the source data or created as part of the table load process.

Name The name of the connection. The default name is used if you do not enter a name.

Creating the connection

You can create a connection to the analytic connector from the Analytics activity center, from Data load editor in an existing app, or from Script in an existing script. Follow the steps below to create a connection.

  1. Create a new Qlik Sense app or script. Open Data load editor or Script.

  2. Click Create new connection.

  3. Under Space, select the space where the connection will be located.

  4. Under Analytics sources, click OpenAI.

  5. Choose the Configuration needed. For more information about each available option, see Configurations and configurable settings.

  6. Enter your OpenAI API Key. For more information, see Generating an OpenAI API key.

  7. Under OpenAI Model, select the OpenAI model you want to use.

  8. Adjust any of the other default parameter values as required. These parameters are described in OpenAI's API documentation. For more information, see Chat, Completions, and Embeddings.

    For additional descriptions of the parameters, see Configurations and configurable settings.

    Information note: You will almost always need to alter the Max Tokens parameter value in order for the connection to function properly for your needs. This parameter controls the size of the response that will be generated.
  9. Click Create.

The data connection is saved to the space you selected, so it can be reused in other Qlik Sense apps and scripts. It is also listed under Data connections in Data load editor or Script.

Once you have created the connection, you can use it to load data containing your requests and the platform's responses to them. You can also use it in chart expressions. For more information, see Select and load data from an OpenAI connection and Using OpenAI connections in visualization expressions.
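
As a rough illustration only (the exact syntax and supported functions are described in Using OpenAI connections in visualization expressions), a chart measure might pass a text value built from a field to a connection created with one of the row-based configurations. This sketch assumes a placeholder connection named OpenAI Chat Rows and a hypothetical ProductCategory field:

// Placeholder connection name, prompt text, and field name - adapt to your own connection and data model.
endpoints.ScriptEvalStr('{"RequestType":"endpoint", "endpoint":{"connectionname":"OpenAI Chat Rows"}}', 'Describe the product category ' & ProductCategory & ' in one sentence')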

Detailed examples

For full examples of how to work with the OpenAI analytics connector, see Tutorial – Using the OpenAI analytics connector in Qlik Cloud.
