Snowflake

target-snowflake from datamill-co

Snowflake database loader

The target-snowflake Meltano loader sends data into a Snowflake database after it was pulled from a source using an extractor.


Getting Started

Prerequisites

If you haven't already, follow the initial steps of the Getting Started guide:

  1. Install Meltano
  2. Create your Meltano project

Dependencies

target-snowflake requires the libpq library to be available on your system. If you've installed PostgreSQL, you should already have it; otherwise, you can install it on its own using the libpq-dev package on Ubuntu/Debian (apt install libpq-dev) or the libpq Homebrew formula on macOS (brew install libpq).

Installation and configuration

  1. Add the target-snowflake loader to your project using meltano add:

     meltano add loader target-snowflake --variant datamill-co

  2. Configure the target-snowflake settings using meltano config:

     meltano config target-snowflake set --interactive
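
Once added, the plugin is registered in your project's meltano.yml. A minimal sketch of what the entry might look like after configuration (all values below are placeholders for illustration; secrets such as the password belong in environment variables or .env, not in the file):

```yaml
plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      config:
        # Placeholder values for illustration only.
        snowflake_account: xy12345.east-us-2.azure
        snowflake_username: LOADER_USER
        snowflake_database: ANALYTICS
        snowflake_warehouse: LOADING
        # snowflake_password is intentionally omitted: supply it via the
        # TARGET_SNOWFLAKE_SNOWFLAKE_PASSWORD environment variable instead.
```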

Next steps

If you run into any issues, learn how to get help.

Troubleshooting

Error: pg_config executable not found or libpq-fe.h: No such file or directory

This error message indicates that the libpq dependency is missing.

To resolve this, refer to the "Dependencies" section above.

Capabilities

The current capabilities for target-snowflake may have been automatically set when it was originally added to the Hub. Please review the capabilities when using this loader. If you find they are out of date, please consider updating them by making a pull request to the YAML file that defines the capabilities for this loader.

This plugin has the following capabilities:

    You can override these capabilities or specify additional ones in your meltano.yml by adding the capabilities key.

    Settings

    The target-snowflake settings that are known to Meltano are documented below. To quickly find the setting you're looking for, click on any setting name from the list:

    You can override these settings or specify additional ones in your meltano.yml by adding the settings key.

    Please consider adding any settings you have defined locally to this definition on MeltanoHub by making a pull request to the YAML file that defines the settings for this plugin.

    Account (snowflake_account)

    • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_ACCOUNT

    ACCOUNT might require the region and cloud platform where your account is located, in the form of: <your_account_name>.<region_id>.<cloud> (e.g. xy12345.east-us-2.azure)

    Refer to Snowflake's documentation about Accounts.

    Username (snowflake_username)

    • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_USERNAME
    [No description provided.]

    Password (snowflake_password)

    • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_PASSWORD
    [No description provided.]

    Role (snowflake_role)

    • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_ROLE

    If not specified, Snowflake will use the user's default role.

    Snowflake Database (snowflake_database)

    • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_DATABASE
    [No description provided.]

    Snowflake Authenticator (snowflake_authenticator)

    • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_AUTHENTICATOR
    • Default Value: snowflake

    Specifies the authentication provider for Snowflake to use. Valid options are the internal one ("snowflake"), a browser session ("externalbrowser"), or Okta ("https://.okta.com"). See the Snowflake docs for more details.
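
As an illustration, switching from the default internal authenticator to a browser-based SSO session could look like this in your meltano.yml config (a sketch using the setting name documented above):

```yaml
config:
  snowflake_authenticator: externalbrowser  # opens a browser window for SSO login
```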

    Warehouse (snowflake_warehouse)

    • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_WAREHOUSE
    [No description provided.]

    Invalid Records Detect (invalid_records_detect)

    • Environment variable: TARGET_SNOWFLAKE_INVALID_RECORDS_DETECT
    • Default Value: true

    Include false in your config to disable crashing on invalid records.

    Invalid Records Threshold (invalid_records_threshold)

    • Environment variable: TARGET_SNOWFLAKE_INVALID_RECORDS_THRESHOLD
    • Default Value: 0

    Include a positive value n in your config to allow at most n invalid records per stream before giving up.
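
The two invalid-record settings are designed to be used together. A hypothetical sketch that tolerates up to 5 invalid records per stream before the target gives up:

```yaml
config:
  invalid_records_detect: true     # keep crashing on invalid records enabled...
  invalid_records_threshold: 5     # ...but only after 5 invalid records per stream
```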

    Disable Collection (disable_collection)

    • Environment variable: TARGET_SNOWFLAKE_DISABLE_COLLECTION
    • Default Value: false

    Include true in your config to disable Singer Usage Logging: https://github.com/datamill-co/target-snowflake#usage-logging

    Logging Level (logging_level)

    • Environment variable: TARGET_SNOWFLAKE_LOGGING_LEVEL
    • Default Value: INFO

    The level for logging. Set to DEBUG to get things like queries executed, timing of those queries, etc. See Python's Logger Levels for information about valid values.

    Persist Empty Tables (persist_empty_tables)

    • Environment variable: TARGET_SNOWFLAKE_PERSIST_EMPTY_TABLES
    • Default Value: false

    Whether the target should create tables that have no records present in the remote source.

    Snowflake Schema (snowflake_schema)

    • Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_SCHEMA
    • Default Value: $MELTANO_EXTRACT__LOAD_SCHEMA

    Note $MELTANO_EXTRACT__LOAD_SCHEMA will expand to the value of the load_schema extra for the extractor used in the pipeline, which defaults to the extractor's namespace, e.g. tap_gitlab for tap-gitlab. Values are automatically converted to uppercase before they're passed on to the plugin, so tap_gitlab becomes TAP_GITLAB.
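
To load into a fixed schema instead of one derived from the extractor, you can override the default. A sketch (RAW_GITLAB is a placeholder schema name):

```yaml
config:
  snowflake_schema: RAW_GITLAB  # placeholder; replaces $MELTANO_EXTRACT__LOAD_SCHEMA
```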

    State Support (state_support)

    • Environment variable: TARGET_SNOWFLAKE_STATE_SUPPORT
    • Default Value: true

    Whether the Target should emit STATE messages to stdout for further consumption. In this mode, which is on by default, STATE messages are buffered in memory until all the records that occurred before them are flushed according to the batch flushing schedule the target is configured with.

    Target S3 Bucket (target_s3.bucket)

    • Environment variable: TARGET_SNOWFLAKE_TARGET_S3_BUCKET

    When included, S3 is used to stage files. The bucket to which staging files should be uploaded.

    Target S3 Key Prefix (target_s3.key_prefix)

    • Environment variable: TARGET_SNOWFLAKE_TARGET_S3_KEY_PREFIX

    Prefix for staging file uploads, to allow for better delineation of temporary files.

    Target S3 AWS Access Key ID (target_s3.aws_access_key_id)

    • Environment variable: TARGET_SNOWFLAKE_TARGET_S3_AWS_ACCESS_KEY_ID
    [No description provided.]

    Target S3 AWS Secret Access Key (target_s3.aws_secret_access_key)

    • Environment variable: TARGET_SNOWFLAKE_TARGET_S3_AWS_SECRET_ACCESS_KEY
    [No description provided.]
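
The four target_s3.* settings work together to enable S3 staging. A sketch (the bucket name and key prefix are placeholders; the AWS credentials are better supplied through the environment variables listed above than in meltano.yml):

```yaml
config:
  target_s3:
    bucket: my-staging-bucket      # placeholder bucket name
    key_prefix: target-snowflake/  # placeholder prefix for temporary staging files
    # aws_access_key_id / aws_secret_access_key omitted on purpose: set
    # TARGET_SNOWFLAKE_TARGET_S3_AWS_ACCESS_KEY_ID and
    # TARGET_SNOWFLAKE_TARGET_S3_AWS_SECRET_ACCESS_KEY in the environment.
```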


    Looking for help?

    If you're having trouble getting the target-snowflake loader to work, look for an existing issue in its repository, file a new issue, or join the Meltano Slack community and ask for help in the #plugins-general channel.



    Maintainer

    • Data Mill


    Keywords

    • database