We have a weekly backup process that exports our production Google App Engine Datastore to Google Cloud Storage, and from there into Google BigQuery. Each week, we create a new dataset named YYYY_MM_DD that contains a copy of the production tables as of that day. Over time we have accumulated many datasets, such as 2014_05_10, 2014_05_17, etc. I want to create a dataset Latest_Production_Data that contains a view for each of the tables in the most recent YYYY_MM_DD dataset. That way we can write our downstream reports once and always pull the latest data.
To do this, I have code that fetches the most recent dataset, and the names of all the tables that dataset contains, from the BigQuery API. Then, for each of those tables, I issue a tables.insert call to create a view that is a SELECT * from the table I want to reference.
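For concreteness, here is a sketch of what that code looks like. This assumes the google-api-python-client BigQuery v2 service; the helper names (latest_dataset, view_body) are mine, not from the actual code.

```python
import re

def latest_dataset(dataset_ids):
    """Pick the most recent YYYY_MM_DD dataset id.

    For this fixed-width date format, lexicographic order matches
    chronological order, so max() on the matching ids is enough.
    """
    dated = [d for d in dataset_ids if re.match(r'^\d{4}_\d{2}_\d{2}$', d)]
    return max(dated)

def view_body(project_id, source_dataset, table_id):
    """Build the tables.insert request body for a SELECT * view."""
    return {
        'tableReference': {
            'projectId': project_id,
            'tableId': table_id,
            'datasetId': 'Latest_Production_Data',
        },
        'view': {
            'query': 'SELECT * FROM [%s.%s]' % (source_dataset, table_id),
        },
    }

# The actual API calls (require an authorized `service` object):
#
# datasets = service.datasets().list(projectId=PROJECT).execute()
# latest = latest_dataset(
#     d['datasetReference']['datasetId'] for d in datasets['datasets'])
# tables = service.tables().list(projectId=PROJECT, datasetId=latest).execute()
# for t in tables.get('tables', []):
#     tid = t['tableReference']['tableId']
#     service.tables().insert(
#         projectId=PROJECT,
#         datasetId='Latest_Production_Data',
#         body=view_body(PROJECT, latest, tid)).execute()
```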
This fails for tables that contain a RECORD field, due to what looks like a fairly benign column-naming rule.
For example, I have this table, whose schema includes a RECORD field, __key__:

When I make this API call:
{
  'tableReference': {
    'projectId': 'redacted',
    'tableId': u'AccountDeletionRequest',
    'datasetId': 'Latest_Production_Data'
  },
  'view': {
    'query': u'SELECT * FROM [2014_05_17.AccountDeletionRequest]'
  },
}
This results in the following error:
HttpError: https://www.googleapis.com/bigquery/v2/projects//datasets/Latest_Production_Data/tables?alt=json returned "Invalid field name "__key__.namespace". Fields must contain only letters, numbers, and underscores, start with a letter or underscore, and be at most 128 characters long."
It appears that when BigQuery defines the view, it flattens the RECORD field into dotted column names like __key__.namespace, and the API then rejects those names as invalid fields.

Is there anything I can do to work around this? The tables and their schemas change from week to week, so I'd rather not hand-maintain an explicit column list for each view. Is there another way to express a SELECT * view that avoids this problem?
I know I could fall back to a table copy instead of a view, but I'd rather not duplicate the data every week.
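For reference, the table-copy fallback would be a copy job against the same API. This is only a sketch, again assuming google-api-python-client; the helper name and write disposition choice are mine.

```python
def copy_job_body(project_id, source_dataset, dest_dataset, table_id):
    """Build a jobs.insert body that copies one table into the destination dataset."""
    return {
        'configuration': {
            'copy': {
                'sourceTable': {
                    'projectId': project_id,
                    'datasetId': source_dataset,
                    'tableId': table_id,
                },
                'destinationTable': {
                    'projectId': project_id,
                    'datasetId': dest_dataset,
                    'tableId': table_id,
                },
                # Overwrite last week's copy rather than appending to it.
                'writeDisposition': 'WRITE_TRUNCATE',
            }
        }
    }

# service.jobs().insert(
#     projectId=PROJECT,
#     body=copy_job_body(PROJECT, '2014_05_17',
#                        'Latest_Production_Data',
#                        'AccountDeletionRequest')).execute()
```

Unlike a view, though, this duplicates storage and only reflects the data as of the copy, which is why I'd prefer the view approach if it can be made to work.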