Atomic inserts in a BigQuery load job

When I load more than one CSV file in a single job, how does BigQuery handle errors?

bq load --max_bad_records=30 dbname.finalsep20xyz gs://sep20new/abc.csv.gz,gs://sep20new/xyzcsv.gz

Several files in the batch may fail to load because their number of columns won't match what the table expects. I still want the remaining files to be loaded. If abc.csv fails to load, will xyz.csv still be loaded? Or will the whole job fail and no records be inserted?

I experimented with dummy data, but couldn't work out how errors are handled across multiple files.

1 answer

Load jobs in BigQuery are atomic: either the whole job succeeds and all of the data is committed, or the job fails and nothing is inserted. If one file in the job fails, the entire job fails, and no records from any file appear in the table. max_bad_records is the total number of bad records tolerated across the whole job, not per file.

If you want each file to succeed or fail independently, submit a separate load job per file. That way, if abc.csv fails, xyz.csv is still loaded, because each job commits or rolls back on its own.
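For example, a minimal shell sketch that issues one load job per file, so a bad file only fails its own job (it reuses the table and bucket names from the question; the loop itself is my addition, not from the original answer):

# one bq load invocation per file: each job commits or fails independently
for f in abc.csv.gz xyzcsv.gz; do
  bq load --max_bad_records=30 dbname.finalsep20xyz "gs://sep20new/$f" \
    || echo "$f failed to load; the other files are unaffected"
done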

Note that when you load into BigQuery, everything you pass to a single bq load invocation (one or more URIs, wildcards, etc.) is treated as one job; if the combined number of bad records across all files exceeds max_bad_records, the whole job fails and none of the rows are inserted, even those from the clean files.
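If a job does fail, the per-record errors are recorded on the job itself, so you can see which file and rows were rejected. A sketch using standard bq commands (the job ID below is illustrative, not from the question):

bq ls -j -n 10   # list your most recent jobs and their states
bq show -j bqjob_r1234_0001   # print one job's status, including its error details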


Source: https://habr.com/ru/post/1538971/

