Google BigQuery Output Tool
Use the Google BigQuery Output Tool to write data from Designer to tables in Google BigQuery. See Google BigQuery for information about known limitations. More information about Google BigQuery can be found on the Google BigQuery documentation site.
This tool is not automatically installed with Designer.
Log in to Google BigQuery
You can log in to Google BigQuery with service-to-service or end user credentials.
Use Service-to-Service authentication over End user authentication whenever possible. Managing distinct keys for each application that authenticates with the same service account lets you revoke access for a specific application without requiring the other applications to update their credentials.
Contact your Google Cloud Administrator for help with your Google BigQuery credentials.
Service-to-service authentication
Service-to-service authentication requires a Google Service Account File, also known as a service account JSON key file.
You can obtain a service account JSON key file from the Google Cloud Console, or you can create a new key for an existing service account. For details, see Creating and Managing Service Account Keys on the Google BigQuery documentation site.
- In Authentication mode, select Service-to-Service.
- Click Select a file to select your Google service account file.
- In Open File, browse to and select the JSON file that contains your Google service account key. Selecting the file populates your Google Cloud Service account key ID and Service account email, and then displays the BigQuery Table Input Config window. If your key file is invalid, an error displays.
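Designer reads the key ID and service account email directly from the key file, so you never type them by hand. As an illustration only, here is a minimal stdlib-Python sketch of pulling those two fields out of a key file; `SAMPLE_KEY`, its values, and `read_key_file` are hypothetical names for this example, though the field layout follows the standard key format downloaded from the Google Cloud Console.

```python
import json

# Hypothetical example key, following the standard layout of a
# service account JSON key downloaded from the Google Cloud Console.
SAMPLE_KEY = {
    "type": "service_account",
    "project_id": "my-project",  # assumed example value
    "private_key_id": "abc123",  # the key ID Designer displays
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_email": "my-sa@my-project.iam.gserviceaccount.com",
}

def read_key_file(path):
    """Return the (key ID, service account email) pair from a key file."""
    with open(path) as f:
        key = json.load(f)
    # A valid key file for this flow must describe a service account.
    if key.get("type") != "service_account":
        raise ValueError("not a service account key file")
    return key["private_key_id"], key["client_email"]
```

If the file is missing either field or is not a service account key, the sketch raises an error, which mirrors the invalid-key error Designer shows.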
End user authentication
- In Authentication mode, select End user.
- Enter your Client ID and Client Secret.
- Click Log in.
- In Sign in with Google, enter your Google email and password, then click Allow to approve the connector's permission to access Google BigQuery on your behalf.
- Click Next.
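Behind the Log in button, the connector runs a standard OAuth 2.0 authorization-code flow with the Client ID and Client Secret you entered. As a sketch of the first leg of that flow only, the snippet below builds the Google sign-in URL a user would be sent to; `CLIENT_ID` and `REDIRECT_URI` are placeholder values, and the exact parameters the connector sends may differ.

```python
from urllib.parse import urlencode

# Google's OAuth 2.0 authorization endpoint and the BigQuery scope.
AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"
BIGQUERY_SCOPE = "https://www.googleapis.com/auth/bigquery"

CLIENT_ID = "1234.apps.googleusercontent.com"    # assumed example value
REDIRECT_URI = "http://localhost:8080/callback"  # assumed example value

def authorization_url(client_id, redirect_uri, scope=BIGQUERY_SCOPE):
    """Build the consent-screen URL opened when the user clicks Log in."""
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",   # authorization-code grant
        "scope": scope,
        "access_type": "offline",  # also request a refresh token
    }
    return AUTH_ENDPOINT + "?" + urlencode(params)
```

After the user clicks Allow, Google redirects back with a one-time code that the connector exchanges (together with the Client Secret) for access and refresh tokens; that exchange is omitted here.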
- In BigQuery Table Input Config, click Select table. Designer displays the tables you have access to.
- Select a project, expand a dataset, and then select a BigQuery table.
- To refresh the metadata, click the refresh icon.
- Once you select a table, Designer displays the table's fully qualified identifier. Click Change table to select a different table.
- In Insert batch size, select a batch size. This number specifies how many rows are written to the Google BigQuery table in each call to the insert API. Google sets limits on the number of API requests per second, the number of rows that can be inserted per second and per request, the size of the rows to be inserted, and the size of HTTP requests.
- More information about limits and quotas can be found on the Google BigQuery Quotas and Limits documentation site. You can also review the Streaming Data into BigQuery documentation to understand how your data is inserted into the Google BigQuery table.
- Run the workflow.
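Designer performs the batching itself, but the arithmetic behind Insert batch size is simple: with N rows and a batch size of B, roughly ceil(N / B) insert API calls are made. A minimal sketch of that chunking (the `batches` helper is hypothetical, not part of Designer or the BigQuery client):

```python
def batches(rows, batch_size):
    """Yield rows in chunks of at most batch_size; each chunk would
    correspond to one call to the insert API."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

# With 10 rows and a batch size of 4, three insert calls would be made:
# two full batches of 4 rows and a final batch of 2 rows.
calls = list(batches(list(range(10)), 4))
```

A larger batch size means fewer API calls (which helps stay under the requests-per-second quota) but larger HTTP requests, so the right value depends on the per-request row and size limits above.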
Outputting data from your Designer workflow to Google BigQuery streams new rows to the table in BigQuery. Data is held in a temporary streaming buffer for up to 90 minutes while processes in Google BigQuery convert the row-based data to columnar storage. In most cases, the data in the streaming buffer is available for querying within seconds. However, you can't see the inserted data in the Preview tab of the Google BigQuery console until the extraction processes are complete. Learn more about the streaming buffer in this Google blog post. More information about the availability of streamed data can be found on the Google BigQuery documentation site.