Table to Storage: SQL file
To run a Table to Storage data operation, you first need to prepare a SQL query that will extract the data to export.
The SQL file must contain a BigQuery standard SQL query. You can write it directly in the BigQuery query editor and then save it as a .sql file.
This query will be executed when the data operation is launched, and the result will be stored in the JSON file specified in your configuration.
For a GBQ to Firestore data pipeline, your query must select at least a firestore_path column.
📋 Global SQL pattern
For the Python script to be able to load the data into Firestore, the SQL query that extracts it must follow a specific pattern.
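A minimal sketch of that pattern is shown below. The table and column names are hypothetical; the key point is the firestore_path column, which tells the loader where each row should be written (alternating collection/document segments):

```sql
-- firestore_path determines where each row lands in Firestore,
-- e.g. 'users/user123' = document user123 in collection users.
-- All other selected columns become fields of that document.
SELECT
  CONCAT('users/', user_id) AS firestore_path,
  first_name,
  last_name,
  email
FROM `my_project.my_dataset.users`
```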
You will get a BigQuery result like this:
After loading it into Firestore (see the next pages for those steps), collections and documents are created as specified in the firestore_path column, and the data looks like this in Firestore:
You can also create a list in the document by adding a "data" column, built with BigQuery's ARRAY_AGG and STRUCT functions.
The SQL is more complex. Here is an example:
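A minimal sketch of such a query, again with hypothetical table and column names: each user becomes one document, and the "data" column aggregates that user's orders into an array field on the document.

```sql
-- One Firestore document per user; the "data" column becomes
-- an array field, with one element per aggregated order.
SELECT
  CONCAT('users/', user_id) AS firestore_path,
  ANY_VALUE(email) AS email,
  ARRAY_AGG(STRUCT(order_id, order_date, amount)) AS data
FROM `my_project.my_dataset.orders`
GROUP BY user_id
```

Because the query aggregates rows, every non-grouped column must either be wrapped in an aggregate function (here ANY_VALUE and ARRAY_AGG) or appear in the GROUP BY clause.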
The result looks like this in Firestore: