Getting "Too Big Query" in BigQuery

I store event data in BigQuery, sharded by day: one table per day. The following query failed:

select count(distinct event) from TABLE_DATE_RANGE(my_dataset.my_dataset_events_, SEC_TO_TIMESTAMP(1391212800), SEC_TO_TIMESTAMP(1393631999)) 

Each table has a size of about 8 GB.

Has anyone else run into this error? It doesn't seem to be about the amount of data scanned, since I limited the query to a single column. With a shorter time range the query works, but the whole point of using BigQuery was to handle large datasets.
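For reference, the two Unix timestamps in the query above are 2014-02-01 00:00:00 UTC and 2014-02-28 23:59:59 UTC, i.e. all 28 daily tables for February 2014. A shorter-range variant of the same query, the kind that reportedly succeeds, would look like the sketch below; the table prefix is taken from the question, while the one-week range is only an illustration:

select count(distinct event)
from TABLE_DATE_RANGE(my_dataset.my_dataset_events_,
                      TIMESTAMP('2014-02-01'),
                      TIMESTAMP('2014-02-07'))

TABLE_DATE_RANGE expands the prefix into one table reference per day in the range, so this version touches 7 tables instead of 28.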

1 answer

"Query too large" in this case means that TABLE_DATE_RANGE expands internally into too many tables, generating an internal query that is too large to process.

There are two workarounds:

  • Query fewer tables (can you consolidate the daily tables into larger ones? See the sketch after this list).
  • Wait for the BigQuery team to fix this. Then you should be able to run this query unchanged, with no workaround needed. Just not today :).
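A minimal sketch of the first workaround, under assumptions that are not in the answer: consolidate the daily tables into a single monthly table by querying them in smaller chunks (which, per the question, succeed) and appending each chunk's result to one destination table. The destination table name my_dataset.my_dataset_events_2014_02 and the half-month split are hypothetical:

select *
from TABLE_DATE_RANGE(my_dataset.my_dataset_events_,
                      TIMESTAMP('2014-02-01'),
                      TIMESTAMP('2014-02-14'))

select *
from TABLE_DATE_RANGE(my_dataset.my_dataset_events_,
                      TIMESTAMP('2014-02-15'),
                      TIMESTAMP('2014-02-28'))

Each chunk query is run with the same destination table, the WRITE_APPEND write disposition, and "Allow Large Results" enabled (all standard BigQuery query options; if the event tables contain repeated fields, result flattening would also need to be turned off). Afterwards, the original count(distinct event) query can target that single consolidated table and no longer needs TABLE_DATE_RANGE at all.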

Source: https://habr.com/ru/post/971554/

