Set up a Snowflake data destination to export data from Singular automatically to a Snowflake database. Note that data destinations are a premium feature (learn more).

NEW: We've recently announced a Snowflake Data Exchange integration! It provides a seamless way to push Singular data into your Snowflake warehouse. See the Snowflake Data Exchange Destination article for more details.

 

Setup Instructions

1. Log In as an Account Admin

Make sure you are logged in as a user with admin access. If your default role is not ACCOUNTADMIN, switch to it using the following command:

USE ROLE ACCOUNTADMIN;

2. Create a Snowflake Warehouse and Database

To create a warehouse, run the following command. Replace <singular_warehouse> with a warehouse name of your choice.

CREATE WAREHOUSE <singular_warehouse> 
WITH AUTO_RESUME = TRUE
WAREHOUSE_SIZE = 'XSMALL'
AUTO_SUSPEND = 600;
AUTO_RESUME: Whether the warehouse is automatically resumed when a SQL statement is submitted to it. We recommend setting this to TRUE.
WAREHOUSE_SIZE: We recommend starting with XSMALL and increasing the size if necessary.
AUTO_SUSPEND: The number of seconds of inactivity before the warehouse is suspended. We recommend 600-900 seconds (10-15 minutes). If you set it to NULL, the warehouse keeps running and consuming Snowflake credits.

Then create a database with the following command. Replace <singular_database> with a database name of your choice.

CREATE DATABASE <singular_database>;

3. Create a Database User

Now you need to create a user for Singular that has access to the warehouse and database you just created.

First, create a new role:

CREATE ROLE <singular_role> COMMENT = 'Role for Singular ETL Integration';

Next, grant it permissions to the warehouse and database:

GRANT ALL ON WAREHOUSE <singular_warehouse> TO ROLE <singular_role>; 
GRANT ALL ON DATABASE <singular_database> TO ROLE <singular_role>;

Finally, create a user as follows:

CREATE USER <singular_user> 
PASSWORD='<password>'
COMMENT='User for Singular ETL Integration'
DEFAULT_ROLE=<singular_role>
DEFAULT_WAREHOUSE=<singular_warehouse>;

GRANT ROLE <singular_role> TO USER <singular_user>;

Note: This gives Singular access to a dedicated Snowflake warehouse that contains only Singular data. Singular does not get access to any other warehouses you own.
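Before moving on, you can optionally sanity-check the new user's access by logging in as that user and confirming the role, warehouse, and database are all usable. This is just a suggested check, using the placeholder names from the steps above:

USE ROLE <singular_role>;
USE WAREHOUSE <singular_warehouse>;
USE DATABASE <singular_database>;
SELECT CURRENT_ROLE(), CURRENT_WAREHOUSE(), CURRENT_DATABASE();

If any of these statements fails with a permissions error, re-check the GRANT statements from earlier in this step.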

4. Whitelist Singular Server IPs

If you have a network policy set up in your Snowflake instance, you need to whitelist Singular's server IPs so that Singular can connect:

  1. Log into your Snowflake account (you need to have account admin privileges).
  2. If needed, switch to the ACCOUNTADMIN role:

    USE ROLE ACCOUNTADMIN;

  3. Go to Account > Policies and choose the currently active network policy.

  4. Click Edit and add the following IPs under Allowed IP Addresses:
    • 54.183.135.179/32
    • 54.183.113.72/32
    • 13.52.189.144/32
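
Alternatively, you can update the policy in SQL instead of the UI using Snowflake's ALTER NETWORK POLICY command. Note that SET ALLOWED_IP_LIST replaces the existing list, so include the IPs already in your policy as well. The policy name and existing-IP entry below are placeholders:

ALTER NETWORK POLICY <your_policy> SET ALLOWED_IP_LIST = (
  '<your_existing_ip>/32',
  '54.183.135.179/32',
  '54.183.113.72/32',
  '13.52.189.144/32'
);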

5. Add a Snowflake Data Destination

In Singular, add a data destination for Snowflake as follows:

  1. In your Singular account, go to Settings > Data Destinations and click Add a new destination.
  2. Search for and select either "Snowflake Destination" (to export aggregated marketing data) or "Snowflake User-Level Destination" (to export user-level data).

  3. In the window that opens, fill in the relevant details:

    Username / Password: The username and password of the Snowflake user you created in Step 3.
    Snowflake Account: Your Snowflake account name. You can find it in your database URL, e.g., if the URL is https://myaccount.snowflakecomputing.com/, the account name is myaccount.
    Warehouse Name / Database Name: The names of the warehouse and database you created in Step 2.
    Schema Name: Enter a name for the schema that Singular is going to create in the database. The default value is "singular".
    Table Name: Enter a name for the table that Singular is going to write to. The default value is "marketing_data".
    Data Schema: The schema of the data loaded into the destination. For your schema options, see Data Destinations: Aggregated Marketing Data Schemas and Data Destinations: User-Level Data Schemas.
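
Once the destination has run for the first time, you can verify that data is arriving by querying the table Singular created. The query below assumes the default schema and table names ("singular" and "marketing_data") and the placeholder database name from Step 2; substitute your own values:

SELECT COUNT(*) FROM <singular_database>.singular.marketing_data;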