While using the command-line tool to load from Datastore into BigQuery, I've noticed the following strange behaviour. When I specify which fields to include using the projection_fields option, there is one rather complex nested field whose subfields are not all included, and I can discern no pattern in which subfields are selected. Strangely, if I don't specify projection_fields (i.e. include all fields), all subfields are included. (At least I have to assume so, because one of those subfields actually causes an error; see this previous question.)
I've not been able to find any explanation of projection_fields except that it can only be used on top-level fields. Is there some design behind this behaviour or is it a bug?
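For reference, the load was invoked roughly as follows (the bucket, dataset, table, and property names below are placeholders, not the actual values):

```shell
# Load a Cloud Datastore backup into BigQuery, keeping only the named
# top-level properties. All identifiers here are placeholders.
bq load \
  --source_format=DATASTORE_BACKUP \
  --projection_fields="name,address,complex_nested_field" \
  mydataset.mytable \
  gs://mybucket/backup/MyKind.backup_info
```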
I believe the subfields excluded are simply those that are null everywhere. The error referred to in the question has a different cause, and does not imply that those subfields would have been loaded if projection_fields had not been set.

The answer to your question is in the official documentation for the Jobs configuration (scroll down to "configuration.load.projectionFields"). It does indeed say the following (emphasis mine):
"If sourceFormat is set to "DATASTORE_BACKUP", indicates which entity properties to load into BigQuery from a Cloud Datastore backup. Property names are case sensitive and *must be top-level properties*. If no properties are specified, BigQuery loads all properties. If any named property isn't found in the Cloud Datastore backup, an invalid error is returned in the job result."
So, to answer your question, it is indeed by design.
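Concretely, this means you can only select whole top-level properties; you cannot project an individual subfield of a nested property. A sketch of the difference (all names are placeholders):

```shell
# Works: projects the whole nested property, which brings along
# all of its subfields.
bq load --source_format=DATASTORE_BACKUP \
  --projection_fields="complex_field" \
  mydataset.mytable gs://mybucket/MyKind.backup_info

# Fails: "complex_field.subfield" is not a top-level property, so
# per the documentation quoted above the job should return an
# invalid error rather than a partial projection.
bq load --source_format=DATASTORE_BACKUP \
  --projection_fields="complex_field.subfield" \
  mydataset.mytable gs://mybucket/MyKind.backup_info
```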