I'm trying to write to Datastore from Dataflow using com.google.cloud.datastore.
My code looks like this (inspired by the examples in [1]):
public void processElement(ProcessContext c) {
  LocalDatastoreHelper HELPER = LocalDatastoreHelper.create(1.0);
  Datastore datastore = HELPER.options().toBuilder().namespace("ghijklmnop").build().service();
  Key taskKey = datastore.newKeyFactory()
      .ancestors(PathElement.of("TaskList", "default"))
      .kind("Task")
      .newKey("sampleTask");
  Entity task = Entity.builder(taskKey)
      .set("category", "Personal")
      .set("done", false)
      .set("priority", 4)
      .set("description", "Learn Cloud Datastore")
      .build();
  datastore.put(task);
}
I'm getting this error:
exception: "java.lang.RuntimeException: com.google.cloud.dataflow.sdk.util.UserCodeException: com.google.cloud.datastore.DatastoreException: I/O error
at com.google.cloud.dataflow.sdk.runners.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:162)
at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase$DoFnContext.sideOutputWindowedValue(DoFnRunnerBase.java:314)
at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase$DoFnProcessContext.sideOutput(DoFnRunnerBase.java:470)
at com.google.cloud.dataflow.sdk.transforms.Partition$PartitionDoFn.processElement(Partition.java:172)
I have tried to use the DatastoreIO sink, but it looks like it is not currently supported by the streaming runner.
How can I avoid this error? Or what is the best way to write from Dataflow to Datastore?
Following @Sam McVeety's advice, I tried to isolate my Datastore code outside of Dataflow, and I indeed got the same error!
This also allowed me to see the cause of the exception, which I couldn't see in the Dataflow logs.
The clue is in this import line that I was using:
com.google.cloud.datastore.testing.LocalDatastoreHelper
It's a test helper that basically mocks the Datastore API locally. Oops.
So this is the code that I've got now after some local debugging:
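(Minimal sketch of that DoFn, assuming the plain DatastoreOptions builder from the same com.google.cloud.datastore package; the namespace, kind, and fields are carried over from the snippet above, and the project and credentials come from the worker's defaults.)

import com.google.cloud.datastore.Datastore;
import com.google.cloud.datastore.DatastoreOptions;
import com.google.cloud.datastore.Entity;
import com.google.cloud.datastore.Key;
import com.google.cloud.datastore.PathElement;

public void processElement(ProcessContext c) {
  // Build a client against the real Cloud Datastore service
  // (no LocalDatastoreHelper), using the default project and credentials.
  Datastore datastore = DatastoreOptions.builder()
      .namespace("ghijklmnop")
      .build()
      .service();
  Key taskKey = datastore.newKeyFactory()
      .ancestors(PathElement.of("TaskList", "default"))
      .kind("Task")
      .newKey("sampleTask");
  Entity task = Entity.builder(taskKey)
      .set("category", "Personal")
      .set("done", false)
      .set("priority", 4)
      .set("description", "Learn Cloud Datastore")
      .build();
  datastore.put(task);
}

(In practice it is probably better to build the Datastore client once per worker or bundle rather than once per element, but that is unrelated to the error.)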
The main difference is the line that constructs the Datastore client, as shown below.
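As a before/after sketch (the replacement line assumes the standard DatastoreOptions builder; the exact final form may differ), this:

LocalDatastoreHelper HELPER = LocalDatastoreHelper.create(1.0);
Datastore datastore = HELPER.options().toBuilder().namespace("ghijklmnop").build().service();

becomes:

Datastore datastore = DatastoreOptions.builder().namespace("ghijklmnop").build().service();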