The target-snowflake Meltano loader sends data into Snowflake after it has been pulled from a source using an extractor.
Other Available Variants
- datamill-co
- meltano
- transferwise (default)
Getting Started
Prerequisites
If you haven't already, follow the initial steps of the Getting Started guide:
Dependencies
target-snowflake requires the libpq library to be available on your system. If you've installed PostgreSQL, you should already have it, but you can also install it on its own using the libpq-dev package on Ubuntu/Debian or the libpq Homebrew formula on macOS.
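For example, using the package names given above (the exact command depends on your platform, so these are illustrative rather than verified against every distribution):

```shell
# Ubuntu/Debian: installs the libpq client library and headers
sudo apt-get install -y libpq-dev

# macOS (Homebrew)
brew install libpq
```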
Installation and configuration
- Add the target-snowflake loader to your project using meltano add
- Configure the target-snowflake settings using meltano config
meltano add loader target-snowflake --variant datamill-co
meltano config target-snowflake set --interactive
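After these steps, a plugin entry along the following lines will appear in your meltano.yml (a sketch only; the config values shown are hypothetical placeholders, and the exact fields Meltano writes may differ):

```yaml
plugins:
  loaders:
  - name: target-snowflake
    variant: datamill-co
    config:
      snowflake_account: xy12345.east-us-2.azure  # hypothetical account locator
      snowflake_username: LOADER_USER             # hypothetical user
      snowflake_database: RAW                     # hypothetical database
```

Secrets such as snowflake_password are better supplied through environment variables than stored in meltano.yml.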
Next steps
Follow the remaining steps of the Getting Started guide:
If you run into any issues, learn how to get help.
Troubleshooting
Error: pg_config executable not found
or libpq-fe.h: No such file or directory
This error message indicates that the libpq
dependency is missing.
To resolve this, refer to the "Dependencies" section above.
Capabilities
The current capabilities fortarget-snowflake
may have been automatically set when originally added to the Hub. Please review the
capabilities when using this loader. If you find they are out of date, please
consider updating them by making a pull request to the YAML file that defines the
capabilities for this loader.This plugin has the following capabilities:
You can override these capabilities or specify additional ones in your meltano.yml by adding the capabilities key.
Settings
The target-snowflake settings that are known to Meltano are documented below. To quickly find the setting you're looking for, click on any setting name from the list:
snowflake_account
snowflake_username
snowflake_password
snowflake_role
snowflake_database
snowflake_authenticator
snowflake_warehouse
invalid_records_detect
invalid_records_threshold
disable_collection
logging_level
persist_empty_tables
snowflake_schema
state_support
target_s3.bucket
target_s3.key_prefix
target_s3.aws_access_key_id
target_s3.aws_secret_access_key
You can override these settings or specify additional ones in your meltano.yml by adding the settings key.
Please consider adding any settings you have defined locally to this definition on MeltanoHub by making a pull request to the YAML file that defines the settings for this plugin.
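The environment-variable names documented below all follow one pattern: the TARGET_SNOWFLAKE_ prefix plus the upper-cased setting name, with dots (used in nested settings such as target_s3.bucket) replaced by underscores. A small sketch of that mapping — the helper function is illustrative, not part of Meltano:

```python
def setting_to_env_var(setting_name: str, prefix: str = "TARGET_SNOWFLAKE") -> str:
    """Map a Meltano setting name to its environment variable name.

    Dots become underscores and the whole name is upper-cased,
    matching the variable names listed in this section.
    """
    return f"{prefix}_{setting_name.replace('.', '_').upper()}"

print(setting_to_env_var("snowflake_account"))  # TARGET_SNOWFLAKE_SNOWFLAKE_ACCOUNT
print(setting_to_env_var("target_s3.bucket"))   # TARGET_SNOWFLAKE_TARGET_S3_BUCKET
```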
Account (snowflake_account)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_ACCOUNT

ACCOUNT might require the region and cloud platform where your account is located, in the form of: <your_account_name>.<region_id>.<cloud> (e.g. xy12345.east-us-2.azure). Refer to Snowflake's documentation about Accounts.
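For example, the account can be supplied through its environment variable (the locator shown is the hypothetical example from above, not a real account):

```shell
# Hypothetical account locator including region and cloud platform
export TARGET_SNOWFLAKE_SNOWFLAKE_ACCOUNT="xy12345.east-us-2.azure"
echo "$TARGET_SNOWFLAKE_SNOWFLAKE_ACCOUNT"
```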
Username (snowflake_username)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_USERNAME
Password (snowflake_password)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_PASSWORD
Role (snowflake_role)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_ROLE
If not specified, Snowflake will use the user's default role.
Snowflake Database (snowflake_database)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_DATABASE
Snowflake Authenticator (snowflake_authenticator)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_AUTHENTICATOR
- Default Value: snowflake

Specifies the authentication provider for Snowflake to use. Valid options are the internal one ("snowflake"), a browser session ("externalbrowser"), or an Okta URL ("https://…").
Warehouse (snowflake_warehouse)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_WAREHOUSE
Invalid Records Detect (invalid_records_detect)
- Environment variable: TARGET_SNOWFLAKE_INVALID_RECORDS_DETECT
- Default Value: true

Include false in your config to disable crashing on invalid records.
Invalid Records Threshold (invalid_records_threshold)
- Environment variable: TARGET_SNOWFLAKE_INVALID_RECORDS_THRESHOLD
- Default Value: 0

Include a positive value n in your config to allow at most n invalid records per stream before giving up.
Disable Collection (disable_collection)
- Environment variable: TARGET_SNOWFLAKE_DISABLE_COLLECTION
- Default Value: false

Include true in your config to disable Singer Usage Logging: https://github.com/datamill-co/target-snowflake#usage-logging
Logging Level (logging_level)
- Environment variable: TARGET_SNOWFLAKE_LOGGING_LEVEL
- Default Value: INFO

The level for logging. Set to DEBUG to get things like queries executed, timing of those queries, etc. See Python's Logger Levels for information about valid values.
Persist Empty Tables (persist_empty_tables)
- Environment variable: TARGET_SNOWFLAKE_PERSIST_EMPTY_TABLES
- Default Value: false

Whether the target should create tables even when they have no records present in the remote source.
Snowflake Schema (snowflake_schema)
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_SCHEMA
- Default Value: $MELTANO_EXTRACT__LOAD_SCHEMA

Note: $MELTANO_EXTRACT__LOAD_SCHEMA will expand to the value of the load_schema extra for the extractor used in the pipeline, which defaults to the extractor's namespace, e.g. tap_gitlab for tap-gitlab. Values are automatically converted to uppercase before they're passed on to the plugin, so tap_gitlab becomes TAP_GITLAB.
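To pin the schema instead of relying on the expansion described above, you can set it explicitly in meltano.yml (the schema name here is a hypothetical placeholder):

```yaml
plugins:
  loaders:
  - name: target-snowflake
    variant: datamill-co
    config:
      snowflake_schema: ANALYTICS  # hypothetical schema name
```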
State Support (state_support)
- Environment variable: TARGET_SNOWFLAKE_STATE_SUPPORT
- Default Value: true

Whether the target should emit STATE messages to stdout for further consumption. In this mode, which is on by default, STATE messages are buffered in memory until all the records that occurred before them are flushed, according to the batch flushing schedule the target is configured with.
Target S3 Bucket (target_s3.bucket)
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3_BUCKET

When included, S3 is used to stage files; this is the bucket that staging files are uploaded to.
Target S3 Key Prefix (target_s3.key_prefix)
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3_KEY_PREFIX

Prefix for staging-file uploads, to allow for better delineation of temporary files.
Target S3 AWS Access Key ID (target_s3.aws_access_key_id)
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3_AWS_ACCESS_KEY_ID
Target S3 AWS Secret Access Key (target_s3.aws_secret_access_key)
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3_AWS_SECRET_ACCESS_KEY
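Taken together, the four target_s3.* settings might be configured like this (the bucket and prefix are hypothetical; the AWS credentials are left to environment variables so they stay out of meltano.yml):

```yaml
plugins:
  loaders:
  - name: target-snowflake
    variant: datamill-co
    config:
      target_s3:
        bucket: my-staging-bucket     # hypothetical bucket name
        key_prefix: meltano/staging/  # hypothetical prefix
```

The credentials would then be supplied via TARGET_SNOWFLAKE_TARGET_S3_AWS_ACCESS_KEY_ID and TARGET_SNOWFLAKE_TARGET_S3_AWS_SECRET_ACCESS_KEY.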
Something missing?
This page is generated from a YAML file that you can contribute changes to. Edit it on GitHub!

Looking for help?
Try the #plugins-general channel in the Meltano Slack.