Export data with Tables to Storage
Learn how to export data located in a BigQuery table into CSV/JSON files using a Table to Storage operation.
A Table to Storage (TTS) data pipeline operation allows you to export data from a BigQuery table to a CSV or JSON file in a Google Cloud Storage bucket, so you can leverage the exported files with other tools, such as a warehouse management system.
[Diagram: data flows from a Google BigQuery table to a CSV or JSON file in Google Cloud Storage.]
When the Table to Storage Tailer workflow is triggered by an event (usually a BigQuery table update):
The SQL query you specify will be executed to extract the relevant data from the source BigQuery table.
The data will be exported to a CSV file or a JSON file located in the GCS bucket you specified.
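For example, the SQL query that the workflow executes can be a simple SELECT statement that filters the source table. The project, dataset, table, and column names in the sketch below are illustrative assumptions, not values taken from this tutorial.

```sql
-- Illustrative extraction query (project, dataset, table, and column names are assumptions).
-- It selects only the order lines updated today, so that each run exports fresh data.
SELECT
  order_id,
  product_sku,
  quantity,
  updated_at
FROM `my-gcp-project.sales_dataset.order_lines`
WHERE DATE(updated_at) = CURRENT_DATE()
```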
To create your data operation configuration:

1. Access your tailer folder (created during installation).
2. Create a working folder of your choice, and create a JSON file for your data operation inside it.
3. Create a SQL query to determine what data to extract.
4. Prepare your JSON configuration file. Refer to the Table to Storage configuration documentation to learn about all the available parameters. An illustrative sketch of such a file is shown after this list.
5. Create a Workflow configuration file that will define how to trigger the data operation.
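To give a rough idea of what the data operation file can contain, here is an illustrative sketch. Every key and value below is an assumption made for this example; the authoritative list of parameters is given in the Table to Storage configuration documentation, so adapt the keys to what is documented there.

```json
{
  "configuration_type": "table-to-storage",
  "configuration_id": "000099-export-orders-to-gcs",
  "environment": "PROD",
  "account": "000099",
  "gcp_project_id": "my-gcp-project",
  "gcs_dest_bucket": "my-export-bucket",
  "gcs_dest_prefix": "exports/orders",
  "destination_format": "CSV",
  "field_delimiter": ",",
  "print_header": true,
  "sql_file": "extract_orders.sql"
}
```

Whatever the exact parameter names, the file essentially ties together the SQL file to execute, the destination GCS bucket, and the output format (CSV or JSON).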
To deploy and check the data operation:

1. Access your working folder from the command line.
2. Deploy the data operation by running the Tailer deployment command on your JSON configuration file. An illustrative example of both commands is shown after this list.
3. Log in to Tailer Studio to check the status and details of your data operation.
4. Access the GCS bucket to check your output file (CSV or JSON).
5. For your workflow to be executed, you either need to run the data operation that is set to trigger it in your Workflow data operation (the previous step in your data pipeline), or to launch it manually from Tailer Studio.
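The commands referenced in steps 1 and 2 typically look like the following. The folder path and file names are assumptions, and the exact syntax of the Tailer SDK deploy command may differ from this sketch, so check the SDK documentation for your version.

```bash
# Move into the working folder that contains your JSON configuration file (path is illustrative).
cd ~/tailer/export-orders

# Deploy the data operation described in the configuration file
# (command syntax is an assumption; verify it against the Tailer SDK documentation).
tailer deploy configuration 000099-export-orders-to-gcs.json
```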