BigQuery Data Destination

Set up a BigQuery data destination to export data from Singular automatically to a BigQuery data warehouse.

Note: Data destinations are an enterprise feature (learn more).

Best Practices

We recommend the following best practices with BigQuery:

  1. Grant the Owner role when configuring BigQuery service accounts.
  2. Enable BigQuery API and BigQuery Data Transfer API in the project.
  3. When configuring schemas, use different table names but the same Service Account, Project ID, and JSON Key.
  4. Avoid using special characters or spaces when naming Datasets and Tables. They may cause an "Invalid Credentials" error in the UI (an "Invalid table ID" error internally).
  5. BigQuery timestamps are in UTC. When comparing to Singular Reports, remember to apply timezone conversion in the query.
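
The timezone tip above can be sketched in code. A minimal Python example of the UTC-to-local conversion you would apply before comparing against a Singular report; the "America/Los_Angeles" timezone is an illustrative assumption, not something the source specifies:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# A timestamp as stored in BigQuery (always UTC).
utc_ts = datetime(2024, 3, 1, 2, 30, tzinfo=timezone.utc)

# Convert to the timezone your Singular reports use
# ("America/Los_Angeles" is an assumed example).
local_ts = utc_ts.astimezone(ZoneInfo("America/Los_Angeles"))

print(local_ts.isoformat())  # 2024-02-29T18:30:00-08:00
```

The SQL-side equivalent inside a BigQuery query is `DATETIME(timestamp_col, "America/Los_Angeles")`, which shifts a TIMESTAMP into the given timezone.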

Setup Instructions

1. Select a Google Cloud Project

Select or create a Google Cloud project and enable the BigQuery API for the project (see Google's instructions).

Write down the project ID.

Note: The instructions below grant Singular access to all datasets in the Google Cloud project. Singular will only use its own dataset, but if you want more controlled access, consider creating a dedicated project for Singular.

2. Grant Access

Singular loads data into BigQuery using Google Cloud service accounts. We support two methods.

Option #1: Create a Service Account (Recommended)

  1. In the Google Cloud platform, go to IAM & Admin > Service Accounts and click Create Service Account.
  2. Enter a name, an ID, and a description for the service account, then click Create.

  3. In the Service Account Permissions window, give the new account the following permissions:
    • BigQuery Data Owner - allows Singular to create and manage the dataset and tables.
    • BigQuery Job User - allows Singular to create load jobs into the dataset.
  4. To create a JSON key for the account, click Create Key, choose a JSON key type, and click Create.

  5. Download the key file and save it in a safe location; you will need to upload it to Singular in Step #3.
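
Before uploading the key to Singular, you can sanity-check that the downloaded file really is a service-account key. The field names below are the standard ones Google places in these JSON key files; the sample values are placeholders, and this helper itself is an illustrative sketch, not part of Singular's setup:

```python
import json

# Fields every Google service-account JSON key contains.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_service_account_key(raw: str) -> str:
    """Return the key's project_id if the JSON looks like a
    Google service-account key; raise ValueError otherwise."""
    key = json.loads(raw)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError(f"unexpected key type: {key['type']}")
    return key["project_id"]

# Dummy key for illustration (placeholder values, not real credentials):
sample = json.dumps({
    "type": "service_account",
    "project_id": "my-singular-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",
    "client_email": "singular-writer@my-singular-project.iam.gserviceaccount.com",
})
print(check_service_account_key(sample))  # my-singular-project
```

A key that fails this check (wrong file type, truncated download) would also fail when uploaded to Singular, so catching it locally saves a round trip.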

Option #2: Grant Access to Singular's Pre-made Service Account

  1. In the Google Cloud platform, go to IAM & Admin > IAM and click Add.
  2. Enter "singular-etl@singular-etl.iam.gserviceaccount.com" and add the following roles:

    • BigQuery Data Owner - allows Singular to create and manage the dataset and tables.
    • BigQuery Job User - allows Singular to create load jobs into the dataset.

3. Add a BigQuery Data Destination

To add a BigQuery data destination in Singular:

  1. In your Singular account, go to Settings > Data Destinations and click Add a new destination.
  2. Select either "BigQuery Destination" (to export aggregated marketing data) or "BigQuery User-Level Destination" (to export user-level data).

  3. In the window that opens, fill in the relevant details:

    • Service Account Type: Choose the appropriate service account type (user-created or Singular's pre-made) based on Step #2.
    • Credentials File: Upload the JSON key file you created in Step #2. This field is not shown if you chose to grant access to Singular's pre-made service account.
    • Project ID: Enter the Project ID from Step #1.
    • Dataset Name: Enter a name for the dataset that Singular will write to (default: "singular"). If the dataset does not exist yet, it will be created.
    • Table Name: Enter a name for the table that Singular will write to (default: "marketing_data"). If the table does not exist yet, it will be created.
    • Dataset Location: Set your BigQuery dataset location (US is BigQuery's default location).
    • Data Schema: The schema of the data loaded into the destination. For your schema options, see Data Destinations: Aggregated Marketing Data Schemas and Data Destinations: User-Level Data Schemas.
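
Once the destination is configured, the exported table is addressed as project.dataset.table. The sketch below assembles that reference and enforces the naming advice from the best practices above; the letters/digits/underscores pattern is an assumption derived from the "no special characters or spaces" rule, and "my-project" is an illustrative project ID:

```python
import re

# Conservative pattern based on best practice #4 (assumption):
# no spaces or special characters in dataset and table names.
NAME_OK = re.compile(r"^[A-Za-z0-9_]+$")

def table_reference(project_id: str, dataset: str, table: str) -> str:
    """Build a fully-qualified BigQuery table reference, rejecting
    dataset/table names that would trigger an "Invalid table ID" error.
    Project IDs are not checked: they legitimately contain hyphens."""
    for name in (dataset, table):
        if not NAME_OK.match(name):
            raise ValueError(f"invalid dataset/table name: {name!r}")
    return f"{project_id}.{dataset}.{table}"

# Using the default names from the fields above:
print(table_reference("my-project", "singular", "marketing_data"))
# my-project.singular.marketing_data
```

Running names through a check like this before typing them into the destination form avoids the "Invalid Credentials" error described in the best practices.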