I have tried several CSV formats (different escape characters, quoting, and other settings) to export data from MySQL and import it into BigQuery, but I could not find a solution that works in every case.
The Google Cloud SQL documentation recommends the following statement for importing/exporting from/to MySQL. Although Cloud SQL is not BigQuery, it is a good starting point:
SELECT * INTO OUTFILE 'filename.csv' CHARACTER SET 'utf8'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '' FROM table
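For context, here is a minimal sketch of how that statement might be run from the shell (host, credentials, database, table, and output path are placeholders). Note that INTO OUTFILE writes the file on the MySQL server itself, so the path must be writable by mysqld and the connecting user needs the FILE privilege:

# placeholder connection details; the heredoc keeps the SQL quoting intact
mysql -h db-host -u exporter -p mydb <<'SQL'
SELECT * INTO OUTFILE '/tmp/filename.csv' CHARACTER SET 'utf8'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY ''
  FROM mytable;
SQL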
I am currently using the following command to load the compressed CSV into BigQuery:

bq --nosync load -F "," --null_marker "NULL" --format=csv PROJECT:DATASET.tableName gs://bucket/data.csv.gz table_schema.json
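Before that command runs, the exported file is compressed and copied to Cloud Storage, roughly like this (paths and bucket name are placeholders):

# compress the server-side export and stage it in Cloud Storage
gzip /tmp/filename.csv
gsutil cp /tmp/filename.csv.gz gs://bucket/data.csv.gz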
On the one hand, the bq command does not allow an escape character to be set (a " escaped by another ", which seems to be a well-defined CSV format). On the other hand, using \" as the escape character for the MySQL export results in "N as the NULL value, which does not work either:
CSV table references column position 34, but line starting at position:0 contains only 34 columns. (error code: invalid)
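For reference, this is a sketch of the CSV-related flags that bq load does expose, as far as I can tell (project, dataset, bucket, and schema file are placeholders); none of them sets an escape character:

# CSV parsing options available on the load side; values are defaults or placeholders
bq load --source_format=CSV \
    --field_delimiter="," \
    --quote='"' \
    --allow_quoted_newlines \
    --null_marker="NULL" \
    PROJECT:DATASET.tableName gs://bucket/data.csv.gz table_schema.json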
Therefore, my question is: how do I write a (table-independent) export command for MySQL so that the generated file can be loaded into BigQuery? Which escape character should be used, and how should NULL values be handled/set?