Amazon Redshift

Version: Current
Last modified: September 23, 2020
Connection Type: ODBC (32- and 64-bit)

Type of Support: Read & Write, In-Database

Validated On: Client version 1.3.7.1000

Alteryx Tools Used to Connect

Standard Workflow Processing
  • Input Data Tool
  • Output Data Tool

In-Database Workflow Processing
  • Connect In-DB Tool
  • Data Stream In Tool

Configure an ODBC Connection

In the ODBC Data Source Administrator...

  1. Select the Redshift driver and select Configure.
  2. Enter your Connection Settings and credentials.
  3. In the Additional Options area, select the Retrieve Entire Results Into Memory option.
  4. Select OK to save the connection.
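Once the DSN is saved, tools connect to it by name. As a minimal sketch, the connection string a DSN-based ODBC connection resolves to can be assembled like this (the DSN name and credentials here are placeholders, not values from this guide):

```python
def build_redshift_odbc_string(dsn, uid=None, pwd=None):
    """Assemble a DSN-based ODBC connection string.

    dsn, uid, and pwd are placeholders; use the values you entered
    in the ODBC Data Source Administrator.
    """
    parts = [f"DSN={dsn}"]
    if uid:
        parts.append(f"UID={uid}")
    if pwd:
        parts.append(f"PWD={pwd}")
    return ";".join(parts)

conn_str = build_redshift_odbc_string("RedshiftDSN", "analyst", "secret")
print(conn_str)  # DSN=RedshiftDSN;UID=analyst;PWD=secret

# With the Redshift ODBC driver configured and pyodbc installed,
# this string could be passed straight to pyodbc.connect(conn_str).
```

This is only an illustration of the DSN convention; Alteryx builds and manages the connection string for you when you pick the Data Source Name in a tool.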

Configure an Amazon Redshift Bulk Connection

To use the bulk connection via the Output Data tool...

  1. Select the Write to File or Database dropdown and select Other Databases > Amazon Redshift Bulk.
  2. Select a Data Source Name (or select ODBC Admin to create one). See ODBC and OLEDB Database Connections.
  3. (Optional) Enter a User Name and Password.
  4. In the Amazon S3 section, enter or paste your AWS Access Key and AWS Secret Key to access the data for upload.
  5. In the Secret Key Encryption dropdown, select an encryption option:
    • Hide: Hide the password using minimal encryption.
    • Encrypt for Machine: Any user on the computer is able to fully use the connection.
    • Encrypt for User: The signed-in user can use the connection on any computer.
  6. In the Endpoint dropdown, select Default to let Amazon determine the endpoint automatically based on the bucket you select. For private S3 deployments, or if you know the bucket's region, you can instead select an endpoint (S3 region), enter a custom endpoint, or choose one of ten previously entered custom endpoints.

    If the Bucket you select is not in the region of the endpoint you specify, this error occurs: "The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint." Select Default to clear the error.
     
  7. (Optional) Select Use Signature V4 for Authentication to use Signature Version 4 instead of the default Signature Version 2. This increases security, but connection speeds might be slower. This option is automatically enabled for regions requiring Signature Version 4.

    Regions That Require Signature Version 4: Regions created after January 30, 2014 support only Signature Version 4. These regions require Signature Version 4 authentication:
    • US East (Ohio) Region
    • Canada (Central) Region
    • Asia Pacific (Mumbai) Region
    • Asia Pacific (Seoul) Region
    • EU (Frankfurt) Region
    • EU (London) Region
    • China (Beijing) Region
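A sketch of the rule above: given a region, decide whether Signature Version 4 must be used. The mapping of the listed region names to AWS region codes is an assumption added for illustration, not part of this guide:

```python
# Assumed region codes for the regions listed above as V4-only
# (regions launched after January 30, 2014 support only Signature V4).
SIGV4_ONLY_REGIONS = {
    "us-east-2",       # US East (Ohio)
    "ca-central-1",    # Canada (Central)
    "ap-south-1",      # Asia Pacific (Mumbai)
    "ap-northeast-2",  # Asia Pacific (Seoul)
    "eu-central-1",    # EU (Frankfurt)
    "eu-west-2",       # EU (London)
    "cn-north-1",      # China (Beijing)
}

def requires_sigv4(region: str) -> bool:
    """True if the bucket's region only accepts Signature Version 4."""
    return region in SIGV4_ONLY_REGIONS
```

For these regions the connector enables Use Signature V4 for Authentication automatically; the checkbox only matters for older regions where Version 2 is still accepted.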
  8. Select a Server-Side Encryption method for uploading to an encrypted Amazon S3 bucket. For more information on Amazon S3 encryption methods, see the Amazon Simple Storage Service Developer Guide.
    • None (Default): No encryption method is used.
    • SSE-KMS: Use server-side encryption with AWS KMS-managed keys. Optionally provide a KMS Key ID. When you select this method, Use Signature V4 for Authentication is enabled by default.
  9. In Bucket Name, enter the name of the AWS bucket in which your data objects are stored.

Optionally select Use Redshift Spectrum to connect to Spectrum tables.

Configure Output Options

You can optionally specify or adjust the following Redshift options. For more information, see the Amazon Redshift Database Developer Guide.

To create Spectrum tables with the Output Data tool, specify both the schema and table name.

spectrum_schema.tablename
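The schema.table convention can be sketched as a small, hypothetical validation helper (not part of Alteryx; the tool does this parsing internally):

```python
def split_spectrum_name(qualified_name: str):
    """Split a 'schema.table' identifier into its two parts.

    Spectrum targets must include both parts, e.g.
    'spectrum_schema.tablename'.
    """
    schema, _, table = qualified_name.partition(".")
    if not schema or not table:
        raise ValueError("Spectrum tables need both a schema and a table name")
    return schema, table
```

A name without the dot-separated schema would be rejected rather than written to a default schema.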

Distribution Key is ignored if 'Key' is not selected for Distribution Style. Sort Key is ignored if 'None' is selected for Sort Style.
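These options map onto clauses of the Redshift CREATE TABLE statement. A minimal sketch of that mapping, including the ignore rules from the note above (the function and its arguments are illustrative, not an Alteryx API):

```python
def redshift_ddl(table, columns, dist_style="EVEN", dist_key=None,
                 sort_style="NONE", sort_keys=None):
    """Build a CREATE TABLE statement from the output options.

    dist_key is ignored unless dist_style is 'KEY'; sort_keys are
    ignored when sort_style is 'NONE', mirroring the note above.
    """
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
    ddl = f"CREATE TABLE {table} ({cols})"
    if dist_style.upper() == "KEY" and dist_key:
        ddl += f" DISTSTYLE KEY DISTKEY({dist_key})"
    else:
        ddl += f" DISTSTYLE {dist_style.upper()}"
    if sort_style.upper() != "NONE" and sort_keys:
        ddl += f" {sort_style.upper()} SORTKEY({', '.join(sort_keys)})"
    return ddl

print(redshift_ddl("sales", [("user_id", "INT"), ("amt", "DECIMAL(10,2)")],
                   dist_style="KEY", dist_key="user_id",
                   sort_style="COMPOUND", sort_keys=["user_id"]))
```

The tool generates the equivalent DDL for you; this sketch only shows which option feeds which clause.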

  • Primary Key: Select columns for the Primary Key and adjust the order of columns.
  • Distribution Style: Select Even, Key, or All.
  • Distribution Key: Select a column for the Distribution Key.
  • Sort Style: Select None, Compound, or Interleaved.
  • Sort Key: Select columns for the Sort Key and adjust the order of columns.
  • Enable Vacuum and Analyze Operations: (Bulk connections only) Enabled by default. When enabled, VACUUM and ANALYZE maintenance commands are executed after a bulk load APPEND to the Redshift database.
  • Size of Bulk Load Chunks (1 MB to 102400 MB): To increase upload performance, large files are split into smaller files of the specified size, an integer number of megabytes. The default value is 128.
  • Enable backslash (\) as escape character: (Bulk connections only) Enabled by default. When enabled, a character that immediately follows a backslash character is loaded as column data, even if that character normally is used for a special purpose (for example, delimiter character, quotation mark, embedded newline character, or escape character).
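The chunking behavior can be sketched as follows: the connector splits a large upload into ceiling(size / chunk size) files (an illustration of the arithmetic only; the actual splitting is done by the bulk loader):

```python
import math

def plan_bulk_chunks(file_size_mb: int, chunk_size_mb: int = 128) -> int:
    """Number of smaller files a bulk load is split into.

    chunk_size_mb mirrors 'Size of Bulk Load Chunks'
    (default 128, valid range 1-102400 MB).
    """
    if not 1 <= chunk_size_mb <= 102400:
        raise ValueError("chunk size must be between 1 and 102400 MB")
    return math.ceil(file_size_mb / chunk_size_mb)

print(plan_bulk_chunks(1000))  # 8 files at the default 128 MB
```

Smaller chunks allow more parallel uploads to S3, at the cost of more objects per load.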