When loading data from Datastore into BigQuery with projection_fields, not all subfields are included

Posted 2019-07-17 03:39

While using the command line tool to load from Datastore into BigQuery I've noticed the following strange behaviour. When I specify what fields to include using the option projection_fields, there is one rather complex nested field whose subfields are not all included. I can determine no pattern in the selection of subfields. Strangely, if I don't specify projection_fields (i.e. include all fields), all subfields are included. (At least I have to assume so, because one of these subfields is actually causing an error, see this previous question.)

I've not been able to find any explanation of projection_fields except that it can only be used on top-level fields. Is there some design behind this behaviour or is it a bug?

2 Answers
一夜七次
Answer 2 · 2019-07-17 04:19

I believe the excluded subfields are simply those that are null in every entity. The error referred to in the question has a different cause, and does not imply that these subfields would have been loaded had projection_fields not been set.

Root(大扎)
Answer 3 · 2019-07-17 04:37

The answer to your question is in the official documentation for Jobs config (scroll down to "configuration.load.projectionFields"). It says the following:

"If sourceFormat is set to "DATASTORE_BACKUP", indicates which entity properties to load into BigQuery from a Cloud Datastore backup. Property names are case sensitive and must be top-level properties. If no properties are specified, BigQuery loads all properties. If any named property isn't found in the Cloud Datastore backup, an invalid error is returned in the job result."

So, to answer your question, it is indeed by design.
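For reference, a load invocation that restricts the job to specific top-level properties might look like the sketch below. The dataset, table, bucket path, and property names are illustrative, not taken from the question:

```shell
# Load a Cloud Datastore export into BigQuery, keeping only the named
# top-level entity properties. Property names are case sensitive, and
# nested subfields (e.g. "address.city") cannot be listed here --
# projection_fields only accepts top-level properties.
# Dataset, table, bucket, and property names below are hypothetical.
bq load \
  --source_format=DATASTORE_BACKUP \
  --projection_fields="name,address" \
  mydataset.mytable \
  gs://mybucket/datastore_export/default_namespace/kind_MyKind/default_namespace_kind_MyKind.export_metadata
```

Listing a property that does not exist in the backup fails the job with an invalid error, per the documentation quoted above; selecting a nested path is simply not supported.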
