
Introduction

This document proposes a separate database plugin that supports Snowflake-specific features and configurations.

Use-Case

  • Users can choose and install Snowflake source and sink plugins.
  • Users should see the Snowflake logo on the plugin configuration page for a better experience.
  • Users should get relevant information from the tooltips:
    • Each tooltip should accurately describe what the field is used for.
  • Users should not have to specify any redundant configuration.
  • Users should get field-level lineage for the source and sink that is being used.
  • Reference documentation should be updated to account for the changes.
  • The source code for the Snowflake database plugin should be placed in a repository under the data-integrations organization.
  • The data pipeline using the source and sink plugins should run on both the MapReduce and Spark engines.

User Stories

  • Users should be able to install the Snowflake-specific database source and sink plugins from the Hub.
  • Users should have each tooltip accurately describe what its field does.
  • Users should get field-level lineage information for the Snowflake source and sink.
  • Users should be able to set up a pipeline without specifying redundant information.
  • Users should get updated reference documentation for the Snowflake source and sink.
  • Users should be able to read all of the database's data types.

Plugin Type

  • Batch Source
  • Batch Sink 
  • Real-time Source
  • Real-time Sink
  • Action
  • Post-Run Action
  • Aggregate
  • Join
  • Spark Model
  • Spark Compute

Snowflake Overview

Snowflake is an analytic data warehouse provided as Software-as-a-Service (SaaS). Snowflake provides a data warehouse that is faster, easier to use, and far more flexible than traditional data warehouse offerings.

Snowflake’s data warehouse is not built on an existing database or “big data” software platform such as Hadoop. The Snowflake data warehouse uses a new SQL database engine with a unique architecture designed for the cloud. To the user, Snowflake has many similarities to other enterprise data warehouses, but also has additional functionality and unique capabilities.

Snowflake Bulk API

Loading data through JDBC has performance limitations, so Snowflake provides bulk APIs for loading data.

The COPY INTO <table> command loads data from staged files to an existing table. The files must already be staged in one of the following locations:

  • Named internal stage (or table/user stage). Files can be staged using the PUT command. 
  • Named external stage that references an external location (AWS S3, Google Cloud Storage, or Microsoft Azure). 
  • External location (AWS S3, Google Cloud Storage, or Microsoft Azure).

Example:

-- Stages
copy into mytable from '@mystage/path 1/file 1.csv';
copy into mytable from '@%mytable/path 1/file 1.csv';
copy into mytable from '@~/path 1/file 1.csv';

-- S3 bucket
copy into mytable from 's3://mybucket 1/prefix 1/file 1.csv';

-- Azure container
copy into mytable from 'azure://myaccount.blob.core.windows.net/mycontainer/encrypted_files/file 1.csv';

Also, there is an option to use Snowpipe to load data continuously.

The COPY INTO <location> command unloads data from a table (or query) into one or more files in one of the following locations:

  • Named internal stage (or table/user stage). The files can then be downloaded from the stage/location using the GET command.
  • Named external stage that references an external location (AWS S3, Google Cloud Storage, or Microsoft Azure).
  • External location (AWS S3, Google Cloud Storage, or Microsoft Azure).

Example:

-- Stages
copy into '@mystage/path 1/file 1.csv' from mytable;
copy into '@%mytable/path 1/file 1.csv' from mytable;
copy into '@~/path 1/file 1.csv' from mytable;

-- S3 bucket
copy into 's3://mybucket 1/prefix 1/file 1.csv' from mytable;

-- Azure container
copy into 'azure://myaccount.blob.core.windows.net/mycontainer/encrypted_files/file 1.csv' from mytable;


Design Tips

JDBC Driver API Support: https://docs.snowflake.net/manuals/user-guide/jdbc-api.html

Loading Data into Snowflake: https://docs.snowflake.net/manuals/user-guide-data-load.html

Design

The suggestion is to create a new Maven sub-module in the database-plugins repo under the data-integrations organization.

Action Plugins

The proposal is to create a Snowflake Data Loading Action plugin that will utilize the COPY INTO <table> command to load data files into Snowflake, and a Snowflake Data Unloading Action plugin that will utilize the COPY INTO <location> command to unload data from a table (or query) into one or more files.

Also, a Snowflake Action plugin that runs an arbitrary SQL query should be created.
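A minimal sketch of what the action plugin's core logic could boil down to, assuming the standard Snowflake JDBC driver. The variables accountName, username, password, database, and query are illustrative stand-ins for the plugin's configuration properties:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.Properties;

// ...

Properties props = new Properties();
props.put("user", username);
props.put("password", password);
props.put("db", database);

// the account name from the configuration maps to the JDBC URL host
String url = "jdbc:snowflake://" + accountName + ".snowflakecomputing.com/";

// run the user-supplied SQL query and release the connection afterwards
try (Connection connection = DriverManager.getConnection(url, props);
     Statement statement = connection.createStatement()) {
  statement.execute(query);
}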

Batch Source/Sink Plugins

Option 1

The first option is to create a new Maven sub-module in the database-plugins repo under the data-integrations organization for the Batch Source/Sink plugins and implement them using JDBC in the same way as the other database plugins.

Option 2

Using JDBC for loading data has performance limitations, so we can instead utilize Snowflake's bulk APIs.

Source Plugin

One option is for the source plugin to unload data from a table (or query) into one or more files using the COPY INTO <location> command; those files are then read directly by the plugin.

Example that unloads the result of a query into a named internal stage (my_stage) using a folder/filename prefix (result/data_), a named file format (myformat), and gzip compression:

copy into @my_stage/result/data_ from (select * from orderstiny)
   file_format=(format_name='myformat' compression='gzip');

---------------+-------------+--------------+
 rows_unloaded | input_bytes | output_bytes |
---------------+-------------+--------------+
 73            | 8339        | 3322         |
---------------+-------------+--------------+

Data files can then be downloaded directly from an internal stage to a stream:

Connection connection = DriverManager.getConnection(url, prop);

// download a file from the user stage ("~") as a stream,
// decompressing it on the fly (the final argument)
InputStream in = connection.unwrap(SnowflakeConnection.class).downloadStream(
    "~",
    DEST_PREFIX + "/" + TEST_DATA_FILE + ".gz",
    true);

Files can be stored in one of the following locations:

  • Named internal stage (or table/user stage). The files can then be downloaded from the stage/location using the GET command.
  • Named external stage that references an external location (AWS S3, Google Cloud Storage, or Microsoft Azure).
  • External location (AWS S3, Google Cloud Storage, or Microsoft Azure).

The proposal is to use an internal stage, since files can then be downloaded directly using the SnowflakeConnection#downloadStream method, as sketched below.
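A rough sketch of the resulting read path, assuming the standard Snowflake JDBC driver. The stage path, query, and unloaded file name are illustrative placeholders:

import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import net.snowflake.client.jdbc.SnowflakeConnection;

// ...

Connection connection = DriverManager.getConnection(url, props);

// 1. Unload the import query results into the user stage as gzip-compressed CSV files.
try (Statement statement = connection.createStatement()) {
  statement.execute("copy into @~/cdap/data_ from (select * from orderstiny) "
      + "file_format=(type='CSV' compression='gzip')");
}

// 2. Stream an unloaded file directly; the final argument requests decompression.
InputStream records = connection.unwrap(SnowflakeConnection.class)
    .downloadStream("~", "cdap/data_0_0_0.csv.gz", true);

The names of the unloaded files can be discovered with the LIST command (e.g. LIST @~/cdap/) before downloading.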

Sink Plugin

For the sink, it is possible to write data to internal stage files first (according to the File Sizing Recommendations) and then use the COPY INTO <table> command.

Example of loading files from a named internal stage into a table:

copy into mytable
from @my_int_stage;


Data files can be uploaded directly from a stream to an internal stage:

Connection connection = DriverManager.getConnection(url, prop);
File file = new File("/tmp/test.csv");
FileInputStream fileInputStream = new FileInputStream(file);

// upload the file stream to the named stage MYSTAGE under the prefix
// "testUploadStream", compressing it in transit (the final argument)
connection.unwrap(SnowflakeConnection.class).uploadStream("MYSTAGE", "testUploadStream",
   fileInputStream, "destFile.csv", true);

Files can be staged in one of the following locations:

  • Named internal stage (or table/user stage). Files can be staged using the PUT command.
  • Named external stage that references an external location (AWS S3, Google Cloud Storage, or Microsoft Azure).
  • External location (AWS S3, Google Cloud Storage, or Microsoft Azure).

The proposal is to use an internal stage, since files can be uploaded directly using the SnowflakeConnection#uploadStream method, as sketched below.
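A rough sketch of the resulting write path under the same assumptions; the stage, table, prefix, and file names are illustrative placeholders:

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import net.snowflake.client.jdbc.SnowflakeConnection;

// ...

Connection connection = DriverManager.getConnection(url, props);

// 1. Upload a buffered chunk of CSV records to the named internal stage,
//    compressing it in transit (the final argument).
InputStream chunk = new ByteArrayInputStream("1,one\n2,two\n".getBytes(StandardCharsets.UTF_8));
connection.unwrap(SnowflakeConnection.class)
    .uploadStream("MY_INT_STAGE", "cdap", chunk, "part-0.csv", true);

// 2. Load all staged files into the target table.
try (Statement statement = connection.createStatement()) {
  statement.execute("copy into mytable from @MY_INT_STAGE/cdap file_format=(type='CSV')");
}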

Option 3

Although it is possible to create a Batch Sink plugin that accepts file locations and utilizes the COPY INTO <table> command (or Snowpipe) to load data files into Snowflake, this does not seem like a good idea:

  • The COPY INTO <table> command operates on the locations of the raw data files, so it won't be possible to use this sink in the same way as the other database sinks.
  • No transformations can be applied to the actual data in CDAP, since we would operate on file locations rather than the data itself. It is still possible to perform transformations on the Snowflake side using Transformation Parameters.

Source Properties

Option 1

| Section | User Configuration Label | Label Description | Options | Default | Variable | User Widget |
| ------- | ------------------------ | ----------------- | ------- | ------- | -------- | ----------- |
| General | Label | Label for UI. | | | | textbox |
| | Reference Name | Uniquely identifiable name used for lineage. | | | referenceName | textbox |
| | Account Name | Full name of Snowflake account. | | | accountName | textbox |
| | Database | Database name to connect to. | | | database | textbox |
| | Import Query | Query used to import data. | | | importQuery | textarea |
| Credentials | Username | User identity for connecting to the specified database. | | | username | textbox |
| | Password | Password to use to connect to the specified database. | | | password | password |
| Key Pair Authentication | Key Pair Authentication Enabled | If true, plugin will perform Key Pair authentication. | True, False | False | keyPairEnabled | toggle |
| | Key File Path | Path to the private key file. | | | path | textbox |
| OAuth2 | OAuth2 Enabled | If true, plugin will perform OAuth2 authentication. | True, False | False | oauth2Enabled | toggle |
| | Auth URL | Endpoint for the authorization server used to retrieve the authorization code. | | | authUrl | textbox |
| | Token URL | Endpoint for the resource server, which exchanges the authorization code for an access token. | | | tokenUrl | textbox |
| | Client ID | Client identifier obtained during the Application registration process. | | | clientId | textbox |
| | Client Secret | Client secret obtained during the Application registration process. | | | clientSecret | password |
| | Scopes | Scope of the access request, which might have multiple space-separated values. | | | scopes | textbox |
| | Refresh Token | Token used to receive the access token, which is the end product of OAuth2. | | | refreshToken | textbox |
| Advanced | Bounding Query | Bounding Query should return the min and max of the values of the 'splitBy' field. For example, 'SELECT MIN(id),MAX(id) FROM table'. Not required if numSplits is set to one. | | | boundingQuery | textarea |
| | Split-By Field Name | Field name that will be used to generate splits. Not required if numSplits is set to one. | | | splitBy | textbox |
| | Number of Splits to Generate | Number of splits to generate. | | | numSplits | textbox |
| | Connection Arguments | A list of arbitrary string tag/value pairs as connection arguments. See: https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#jdbc-driver-connection-string | | | connectionArguments | keyvalue |
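For reference, a hedged sketch of how these configuration properties could translate into a JDBC connection. The variable names mirror the Variable column above and are illustrative; the URL pattern is the documented Snowflake JDBC form:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Map;
import java.util.Properties;

// ...

// accountName becomes part of the JDBC URL; the rest become driver properties
String url = String.format("jdbc:snowflake://%s.snowflakecomputing.com/", accountName);

Properties props = new Properties();
props.put("user", username);
props.put("password", password);
props.put("db", database);

// arbitrary tag/value pairs from the Connection Arguments widget
for (Map.Entry<String, String> argument : connectionArguments.entrySet()) {
  props.put(argument.getKey(), argument.getValue());
}

Connection connection = DriverManager.getConnection(url, props);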


Option 2

TODO

Source Data Types Mapping

| Snowflake Data Type | CDAP Schema Data Type | Comment |
| ------------------- | --------------------- | ------- |
| NUMBER | decimal | Default precision and scale are (38, 0). |
| DECIMAL | decimal | Synonymous with NUMBER. |
| NUMERIC | decimal | Synonymous with NUMBER. |
| INT, INTEGER, BIGINT, SMALLINT | decimal | Synonymous with NUMBER, except that precision and scale cannot be specified (i.e. always defaults to NUMBER(38, 0)). |
| FLOAT, FLOAT4, FLOAT8 | double | Snowflake uses double-precision (64-bit) IEEE 754 floating-point numbers. |
| DOUBLE | double | Synonymous with FLOAT. |
| DOUBLE PRECISION | double | Synonymous with FLOAT. |
| REAL | double | Synonymous with FLOAT. |
| VARCHAR | string | Default (and maximum) is 16,777,216 bytes. |
| CHAR, CHARACTER | string | Synonymous with VARCHAR, except default length is VARCHAR(1). |
| STRING | string | Synonymous with VARCHAR. |
| TEXT | string | Synonymous with VARCHAR. |
| BINARY | bytes | |
| VARBINARY | bytes | Synonymous with BINARY. |
| BOOLEAN | boolean | |
| DATE | date | |
| DATETIME | timestamp | Alias for TIMESTAMP_NTZ. |
| TIME | time | |
| TIMESTAMP | timestamp/string | Alias for one of the TIMESTAMP variations (TIMESTAMP_NTZ by default). |
| TIMESTAMP_LTZ | timestamp | TIMESTAMP with local time zone; the time zone, if provided, is not stored. |
| TIMESTAMP_NTZ | timestamp | TIMESTAMP with no time zone; the time zone, if provided, is not stored. |
| TIMESTAMP_TZ | string | TIMESTAMP with time zone. |
| VARIANT | string | A tagged universal type, which can store values of any other type, including OBJECT and ARRAY, up to a maximum size of 16 MB compressed. |
| OBJECT | record | |
| ARRAY | array | |
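A hedged sketch of how the mapping above could be implemented, assuming the io.cdap.cdap.api.data.schema.Schema API and the java.sql.Types codes reported by the Snowflake JDBC driver; the method name is hypothetical:

import java.sql.Types;

import io.cdap.cdap.api.data.schema.Schema;

// ...

// map a column's reported JDBC type to a CDAP schema, following the table above
static Schema toCdapSchema(int sqlType, int precision, int scale) {
  switch (sqlType) {
    case Types.DECIMAL:    // NUMBER, DECIMAL, NUMERIC, INT, BIGINT, ...
      return Schema.decimalOf(precision, scale);
    case Types.DOUBLE:     // FLOAT, DOUBLE, DOUBLE PRECISION, REAL
      return Schema.of(Schema.Type.DOUBLE);
    case Types.VARCHAR:    // VARCHAR, STRING, TEXT, VARIANT, TIMESTAMP_TZ
      return Schema.of(Schema.Type.STRING);
    case Types.BINARY:     // BINARY, VARBINARY
      return Schema.of(Schema.Type.BYTES);
    case Types.BOOLEAN:
      return Schema.of(Schema.Type.BOOLEAN);
    case Types.DATE:
      return Schema.of(Schema.LogicalType.DATE);
    case Types.TIME:
      return Schema.of(Schema.LogicalType.TIME_MICROS);
    case Types.TIMESTAMP:  // TIMESTAMP_NTZ (default), TIMESTAMP_LTZ, DATETIME
      return Schema.of(Schema.LogicalType.TIMESTAMP_MICROS);
    default:
      throw new IllegalArgumentException("Unsupported SQL type: " + sqlType);
  }
}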



Sink Properties

Option 1

| Section | User Configuration Label | Label Description | Options | Default | Variable | User Widget |
| ------- | ------------------------ | ----------------- | ------- | ------- | -------- | ----------- |
| General | Label | Label for UI. | | | | textbox |
| | Reference Name | Uniquely identifiable name used for lineage. | | | referenceName | textbox |
| | Account Name | Full name of Snowflake account. | | | accountName | textbox |
| | Database | Database name to connect to. | | | database | textbox |
| | Table Name | Name of a database table to write to. | | | table | textbox |
| Credentials | Username | User identity for connecting to the specified database. | | | username | textbox |
| | Password | Password to use to connect to the specified database. | | | password | password |
| Key Pair Authentication | Key Pair Authentication Enabled | If true, plugin will perform Key Pair authentication. | True, False | False | keyPairEnabled | toggle |
| | Key File Path | Path to the private key file. | | | path | textbox |
| OAuth2 | OAuth2 Enabled | If true, plugin will perform OAuth2 authentication. | True, False | False | oauth2Enabled | toggle |
| | Auth URL | Endpoint for the authorization server used to retrieve the authorization code. | | | authUrl | textbox |
| | Token URL | Endpoint for the resource server, which exchanges the authorization code for an access token. | | | tokenUrl | textbox |
| | Client ID | Client identifier obtained during the Application registration process. | | | clientId | textbox |
| | Client Secret | Client secret obtained during the Application registration process. | | | clientSecret | password |
| | Scopes | Scope of the access request, which might have multiple space-separated values. | | | scopes | textbox |
| | Refresh Token | Token used to receive the access token, which is the end product of OAuth2. | | | refreshToken | textbox |
| Advanced | Connection Arguments | A list of arbitrary string tag/value pairs as connection arguments. See: https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#jdbc-driver-connection-string | | | connectionArguments | keyvalue |

Option 2

TODO

Sink Data Types Mapping

| CDAP Schema Data Type | Snowflake Data Type | Comment |
| --------------------- | ------------------- | ------- |
| boolean | BOOLEAN | |
| bytes | BINARY | |
| date | DATE | |
| double | FLOAT | Snowflake uses double-precision (64-bit) IEEE 754 floating-point numbers. |
| decimal | NUMBER(p, s) | |
| float | FLOAT | |
| int | NUMBER(p, s) | Where p >= 10. It is safe to write primitives as values of the decimal logical type, provided the precision is valid. |
| long | NUMBER(p, s) | Where p >= 19. It is safe to write primitives as values of the decimal logical type, provided the precision is valid. |
| string | VARCHAR | |
| time | TIME | |
| timestamp | TIMESTAMP_NTZ | |
| array | ARRAY | |
| record | OBJECT | |
| enum | VARCHAR | |
| map | OBJECT | |
| union | VARIANT | |

Action Plugin Properties

| Section | User Configuration Label | Label Description | Options | Default | Variable | User Widget |
| ------- | ------------------------ | ----------------- | ------- | ------- | -------- | ----------- |
| General | Label | Label for UI. | | | | textbox |
| | Account Name | Full name of Snowflake account. | | | accountName | textbox |
| | Database | Database name to connect to. | | | database | textbox |
| | Query | SQL query to run. | | | query | textarea |
| Credentials | Username | User identity for connecting to the specified database. | | | username | textbox |
| | Password | Password to use to connect to the specified database. | | | password | password |
| Key Pair Authentication | Key Pair Authentication Enabled | If true, plugin will perform Key Pair authentication. | True, False | False | keyPairEnabled | toggle |
| | Key File Path | Path to the private key file. | | | path | textbox |
| OAuth2 | OAuth2 Enabled | If true, plugin will perform OAuth2 authentication. | True, False | False | oauth2Enabled | toggle |
| | Auth URL | Endpoint for the authorization server used to retrieve the authorization code. | | | authUrl | textbox |
| | Token URL | Endpoint for the resource server, which exchanges the authorization code for an access token. | | | tokenUrl | textbox |
| | Client ID | Client identifier obtained during the Application registration process. | | | clientId | textbox |
| | Client Secret | Client secret obtained during the Application registration process. | | | clientSecret | password |
| | Scopes | Scope of the access request, which might have multiple space-separated values. | | | scopes | textbox |
| | Refresh Token | Token used to receive the access token, which is the end product of OAuth2. | | | refreshToken | textbox |
| Advanced | Connection Arguments | A list of arbitrary string tag/value pairs as connection arguments. See: https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#jdbc-driver-connection-string | | | connectionArguments | keyvalue |

Snowflake Data Loading Action Plugin Properties

| Section | User Configuration Label | Label Description | Options | Default | Variable | User Widget | Comment |
| ------- | ------------------------ | ----------------- | ------- | ------- | -------- | ----------- | ------- |
| General | Label | Label for UI. | | | | textbox | |
| | Account Name | Full name of Snowflake account. | | | accountName | textbox | |
| | From | Internal or external location where the files containing the data to be loaded are staged. | | | location | textbox | |
| | Into | Name of the table into which data is loaded. | | | table | textbox | |
| | Select | Optional SELECT statement used for transformations. Specifies an explicit set of fields/columns (separated by commas) to load from the staged data files. The fields/columns are selected from the files using a standard SQL query. The list must match the sequence of columns in the target table. | | | select | textarea | |
| Credentials | Username | Login name of the user for the connection. | | | username | textbox | |
| | Password | Password for the specified user. | | | password | password | |
| Key Pair Authentication | Key Pair Authentication Enabled | If true, plugin will perform Key Pair authentication. | True, False | False | keyPairEnabled | toggle | |
| | Key File Path | Path to the private key file. | | | path | textbox | Displayed only if Key Pair Authentication Enabled is set to true. |
| OAuth2 | OAuth2 Enabled | If true, plugin will perform OAuth2 authentication. | True, False | False | oauth2Enabled | toggle | |
| | Auth URL | Endpoint for the authorization server used to retrieve the authorization code. | | | authUrl | textbox | Displayed only if OAuth2 Enabled is set to true. |
| | Token URL | Endpoint for the resource server, which exchanges the authorization code for an access token. | | | tokenUrl | textbox | Displayed only if OAuth2 Enabled is set to true. |
| | Client ID | Client identifier obtained during the Application registration process. | | | clientId | textbox | Displayed only if OAuth2 Enabled is set to true. |
| | Client Secret | Client secret obtained during the Application registration process. | | | clientSecret | password | Displayed only if OAuth2 Enabled is set to true. |
| | Scopes | Scope of the access request, which might have multiple space-separated values. | | | scopes | textbox | Displayed only if OAuth2 Enabled is set to true. |
| | Refresh Token | Token used to receive the access token, which is the end product of OAuth2. | | | refreshToken | textbox | Displayed only if OAuth2 Enabled is set to true. |
| Cloud Provider Parameters | Use Cloud Provider Parameters | If true, plugin will use the specified Cloud Provider Parameters. | True, False | False | useCloudProviderParameters | toggle | |
| | Cloud Provider | | GCP, AWS, Microsoft Azure | GCP | cloudProvider | radio-group | Displayed only if Use Cloud Provider Parameters is set to true. |
| | Storage Integration | Name of the storage integration used to delegate authentication responsibility for external cloud storage to a Snowflake identity and access management (IAM) entity. For more details, see CREATE STORAGE INTEGRATION. | | | storageIntegration | textbox | Displayed only if the GCP or AWS Cloud Provider is selected. |
| | Key Id | Key Id for connecting to AWS and accessing the private/protected S3 bucket where the files to load are staged. For more information, see Configuring Secure Access to AWS S3. | | | keyId | textbox | Displayed only if the AWS Cloud Provider is selected. |
| | Secret Key | Secret Key for connecting to AWS and accessing the private/protected S3 bucket where the files to load are staged. For more information, see Configuring Secure Access to AWS S3. | | | secretKey | password | Displayed only if the AWS Cloud Provider is selected. |
| | Token | Token for connecting to AWS and accessing the private/protected S3 bucket where the files to load are staged. For more information, see Configuring Secure Access to AWS S3. | | | token | textbox | Displayed only if the AWS Cloud Provider is selected. |
| | SAS Token | Shared access signature token for connecting to Azure and accessing the private/protected container where the files containing data are staged. Credentials are generated by Azure. | | | sasToken | textbox | Displayed only if the Microsoft Azure Cloud Provider is selected. |
| | Files Encrypted | If true, plugin will perform loading from encrypted files. | True, False | False | filesEncrypted | toggle | Displayed only if the AWS or Microsoft Azure Cloud Provider is selected. |
| | Encryption Type | Encryption type used. | For AWS: NONE, AWS_CSE, AWS_SSE_S3, AWS_SSE_KMS. For Azure: NONE, AZURE_CSE. | | encryptionType | select | Displayed only if Files Encrypted is set to true. |
| | Master Key | Client-side master key that was used to encrypt the files in the bucket. The master key must be a 128-bit or 256-bit key in Base64-encoded form. Snowflake requires this key to decrypt encrypted files in the bucket and extract data for loading. | | | masterKey | textbox | Displayed only if the AWS_CSE or AZURE_CSE Encryption Type is selected. |
| | Master Key Id | AWS Master Key ID. | | | masterKeyId | textbox | Displayed only if the AWS_SSE_KMS Encryption Type is selected. |
| File Format | File Format | Optional parameter to specify the format of the data files to load. | Undefined, By Name, By Type | Undefined | fileFormat | toggle | |
| | Format Name | Existing named file format to use for loading data into the table. The named file format determines the format type (CSV, JSON, etc.), as well as any other format options, for the data files. For more information, see CREATE FILE FORMAT. | | | fileFormatName | textbox | Displayed only if the 'By Name' File Format is selected. |
| | Format Type | Type of files to load into the table. If a format type is specified, then additional format-specific options can be specified. For more details, see Format Type Options. | CSV, JSON, AVRO, ORC, PARQUET, XML | | fileFormatType | select | Displayed only if the 'By Type' File Format is selected. |
| | Format Type Options | Format-specific options separated by blank spaces, commas, or new lines. | | | typeOptions | textarea | Displayed only if the CSV, JSON, AVRO, PARQUET, or XML Format Type is selected. ORC does not support any format type options. |
| Advanced | Files | List of one or more file names (separated by commas) to be loaded. The files must already have been staged in either the Snowflake internal location or the external location specified in the command. The maximum number of file names that can be specified is 1000. | | | files | textbox | |
| | Pattern | Regular expression pattern string, enclosed in single quotes, specifying the file names and/or paths to match. For the best performance, try to avoid applying patterns that filter on a large number of files. | | | pattern | textbox | |
| | Copy Options | One or more copy options separated by blank spaces, commas, or new lines. | | | copyOptions | textarea | |
| | Connection Arguments | A list of arbitrary string tag/value pairs as connection arguments. See: https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#jdbc-driver-connection-string | | | connectionArguments | keyvalue | |
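For illustration, the command the plugin could assemble from the properties above might look as follows; all identifiers and values are placeholders:

copy into mytable
from 's3://mybucket/prefix/'
credentials=(aws_key_id='<keyId>' aws_secret_key='<secretKey>' aws_token='<token>')
encryption=(type='AWS_CSE' master_key='<masterKey>')
files=('data_1.csv', 'data_2.csv')
file_format=(format_name='myformat')
on_error=skip_file;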


Snowflake Data Unloading Action Plugin Properties


| Section | User Configuration Label | Label Description | Options | Default | Variable | User Widget | Comment |
| ------- | ------------------------ | ----------------- | ------- | ------- | -------- | ----------- | ------- |
| General | Label | Label for UI. | | | | textbox | |
| | Account Name | Full name of Snowflake account. | | | accountName | textbox | |
| | Into | Internal or external location where the unloaded data files are written. | | | location | textbox | |
| | From | Source of the data to be unloaded, which can either be a table or a query. | | | source | textarea | |
| Credentials | Username | Login name of the user for the connection. | | | username | textbox | |
| | Password | Password for the specified user. | | | password | password | |
| Key Pair Authentication | Key Pair Authentication Enabled | If true, plugin will perform Key Pair authentication. | True, False | False | keyPairEnabled | toggle | |
| | Key File Path | Path to the private key file. | | | path | textbox | Displayed only if Key Pair Authentication Enabled is set to true. |
| OAuth2 | OAuth2 Enabled | If true, plugin will perform OAuth2 authentication. | True, False | False | oauth2Enabled | toggle | |
| | Auth URL | Endpoint for the authorization server used to retrieve the authorization code. | | | authUrl | textbox | Displayed only if OAuth2 Enabled is set to true. |
| | Token URL | Endpoint for the resource server, which exchanges the authorization code for an access token. | | | tokenUrl | textbox | Displayed only if OAuth2 Enabled is set to true. |
| | Client ID | Client identifier obtained during the Application registration process. | | | clientId | textbox | Displayed only if OAuth2 Enabled is set to true. |
| | Client Secret | Client secret obtained during the Application registration process. | | | clientSecret | password | Displayed only if OAuth2 Enabled is set to true. |
| | Scopes | Scope of the access request, which might have multiple space-separated values. | | | scopes | textbox | Displayed only if OAuth2 Enabled is set to true. |
| | Refresh Token | Token used to receive the access token, which is the end product of OAuth2. | | | refreshToken | textbox | Displayed only if OAuth2 Enabled is set to true. |
| Cloud Provider Parameters | Use Cloud Provider Parameters | If true, plugin will use the specified Cloud Provider Parameters. | True, False | False | useCloudProviderParameters | toggle | |
| | Cloud Provider | | GCP, AWS, Microsoft Azure | GCP | cloudProvider | radio-group | Displayed only if Use Cloud Provider Parameters is set to true. |
| | Storage Integration | Name of the storage integration used to delegate authentication responsibility for external cloud storage to a Snowflake identity and access management (IAM) entity. For more details, see CREATE STORAGE INTEGRATION. | | | storageIntegration | textbox | Displayed only if the GCP or AWS Cloud Provider is selected. |
| | Key Id | Key Id for connecting to AWS and accessing the private/protected S3 bucket where the files are unloaded. For more information, see Configuring Secure Access to AWS S3. | | | keyId | textbox | Displayed only if the AWS Cloud Provider is selected. |
| | Secret Key | Secret Key for connecting to AWS and accessing the private/protected S3 bucket where the files are unloaded. For more information, see Configuring Secure Access to AWS S3. | | | secretKey | password | Displayed only if the AWS Cloud Provider is selected. |
| | Token | Token for connecting to AWS and accessing the private/protected S3 bucket where the files are unloaded. For more information, see Configuring Secure Access to AWS S3. | | | token | textbox | Displayed only if the AWS Cloud Provider is selected. |
| | SAS Token | Shared access signature token for connecting to Azure and accessing the private/protected container where the files are unloaded. Credentials are generated by Azure. | | | sasToken | textbox | Displayed only if the Microsoft Azure Cloud Provider is selected. |
| | Files Encrypted | If true, plugin will unload into encrypted files. | True, False | False | filesEncrypted | toggle | Displayed only if the AWS or Microsoft Azure Cloud Provider is selected. |
| | Encryption Type | Encryption type used. | For AWS: NONE, AWS_CSE, AWS_SSE_S3, AWS_SSE_KMS. For Azure: NONE, AZURE_CSE. | | encryptionType | select | Displayed only if Files Encrypted is set to true. |
| | Master Key | Client-side master key used to encrypt the unloaded files. The master key must be a 128-bit or 256-bit key in Base64-encoded form. | | | masterKey | textbox | Displayed only if the AWS_CSE or AZURE_CSE Encryption Type is selected. |
| | Master Key Id | AWS Master Key ID. | | | masterKeyId | textbox | Displayed only if the AWS_SSE_KMS Encryption Type is selected. |
| File Format | File Format | Optional parameter to specify the format of the unloaded data files. | Undefined, By Name, By Type | Undefined | fileFormat | toggle | |
| | Format Name | Existing named file format to use for unloading data from the table. The named file format determines the format type (CSV, JSON, etc.), as well as any other format options, for the data files. For more information, see CREATE FILE FORMAT. | | | fileFormatName | textbox | Displayed only if the 'By Name' File Format is selected. |
| | Format Type | Type of files to unload from the table. If a format type is specified, then additional format-specific options can be specified. For more details, see Format Type Options. | CSV, JSON, PARQUET | | fileFormatType | select | Displayed only if the 'By Type' File Format is selected. |
| | Format Type Options | Format-specific options separated by blank spaces, commas, or new lines. | | | typeOptions | textarea | Displayed only if the 'By Type' File Format is selected. |
| Advanced | Copy Options | One or more copy options separated by blank spaces, commas, or new lines. | | | copyOptions | textarea | |
| | Include Header | Specifies whether to include the table column headings in the output files. For more details, see Optional Parameters. | True, False | False | includeHeader | toggle | |
| | Connection Arguments | A list of arbitrary string tag/value pairs as connection arguments. See: https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#jdbc-driver-connection-string | | | connectionArguments | keyvalue | |
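Similarly, the unloading plugin could assemble a command like the following; all identifiers and values are placeholders:

copy into @my_int_stage/result/data_
from (select * from mytable)
file_format=(type='CSV' compression='gzip')
header=true
overwrite=true;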

Approach

Create a new Maven project in its own repository.

Pipeline Samples


Releases

Release X.Y.Z

Related Work

Database plugin enhancements
