Data Push Guide

Learn how to use our data push feature

Written by Claire
Updated over 9 months ago

Getting all of your social media data delivered daily to your preferred database, whether that is Google BigQuery or AWS S3, can be automated in a matter of minutes.

Each Data Push task represents a single automation based on the selected data source and its columns.

The following steps cover the whole procedure, from setup to delivery, for creating an automated data flow of all accessible data from Facelift Data Studio to your desired database destination.

  1. First, we need to visit the Data Push Tasks section and select “Add new”.

  2. In this section, we can select one data source and any number of its columns.

Datasource: Here we select the data source we want to push data from.

Columns: Each task can push multiple columns. Here we select any or all of the columns we would like to push from the selected data source.

Date range, Time zone, and Interval: Here we select the dynamic time range over which the data will be pushed. As an example, if Last 7 days, UTC, and Daily are selected, each data push execution will push the past 7 days as a rolling window.

Push time column: In this section, we can name the column that will hold a DATETIME value recording when each data push execution was triggered.

Schedule time: Create a rule for how often we would like this data push task to be triggered, using the cron expression builder (see the example below).
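
For example, assuming the builder follows standard five-field cron syntax (minute, hour, day of month, month, day of week), the expression 0 6 * * * would trigger the push once a day at 06:00 in the selected time zone.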

We can now click “Next”.

  3. In this section, we can select whether we want our data to be pushed to Google BigQuery or AWS S3. The options that follow differ according to the destination chosen.

Google BigQuery

Project id, Dataset id, Table id: Here we enter the names used when the Google BigQuery table we want to push data to was created. Alternatively, provide the names you would like the table to be created with; the table can then be created on your behalf once the JSON file uploaded in the next step grants the ‘Create Table’ authorization.

And finally, upload the JSON file that was generated when the table was first created in Google BigQuery. This will allow Facelift Data Studio to push the data to your tables.
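
For reference, here is a minimal sketch, assuming the google-cloud-bigquery Python client and hypothetical project, dataset, table, and column names, of how such a target table could be created up front using the same JSON key file:

    # Minimal sketch; the project, dataset, table and column names below are
    # placeholders for the values you enter in the Data Push task.
    from google.cloud import bigquery

    # Authenticate with the same JSON key file that is uploaded in the task.
    client = bigquery.Client.from_service_account_json("service-account.json")

    # Project id and Dataset id, combined as "project.dataset".
    client.create_dataset(bigquery.Dataset("my-project.social_media"), exists_ok=True)

    # Table id, with a schema matching the columns selected in the task,
    # plus the Push time column named in the previous step.
    schema = [
        bigquery.SchemaField("profile_name", "STRING"),
        bigquery.SchemaField("followers", "INTEGER"),
        bigquery.SchemaField("push_time", "DATETIME"),
    ]
    client.create_table(bigquery.Table("my-project.social_media.data_push", schema=schema), exists_ok=True)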

AWS S3

Access key, Secret access key: These fields contain the necessary information for Facelift Data Studio to be able to push the data to your S3 bucket.

Region: The region your AWS S3 bucket is located in, e.g. "us-east-1".

Bucket: The bucket that will contain the data we are pushing.

File prefix: This can hold a prefix that keeps files organized when multiple data push tasks write to the same bucket (see the sketch below).

Format: Lastly, we can select the preferred format for the data push task: CSV or JSON.
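
As a reference, here is a minimal sketch, assuming the boto3 Python client and placeholder credential, bucket, and prefix values, of how the pushed files could be inspected once the task has run:

    # Minimal sketch; the credentials, region, bucket and prefix below are
    # placeholders for the values configured in the Data Push task.
    import boto3

    s3 = boto3.client(
        "s3",
        aws_access_key_id="AKIA...",      # Access key
        aws_secret_access_key="...",      # Secret access key
        region_name="us-east-1",          # Region
    )

    # List the objects written under the configured file prefix.
    response = s3.list_objects_v2(Bucket="my-social-data", Prefix="facelift/daily/")
    for obj in response.get("Contents", []):
        print(obj["Key"], obj["Size"])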

If you would like to enable this feature in your account, please reach out to your account manager or support@facelift-bbt.com.
