I would like to use Memcache in a Dataflow ParDo — any ideas how? I can't use the existing memcache libraries, as they belong to App Engine and are not serializable. Rohit
My guess is that you have a private variable of type `MemcacheServiceImpl` in your `DoFn` (if my guess is wrong, please edit your question to include the code of your `DoFn`). Indeed, Dataflow serializes your `DoFn`s when you submit the pipeline and deserializes them on the workers. The proper way to handle this is to make the variable `transient` and initialize it lazily. Also note that to access App Engine APIs, including Memcache, from a non-App Engine environment, you should use the Remote API.
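The transient + lazy-initialization pattern could be sketched like this in plain Java, independent of the Beam SDK. `ExpensiveClient` is a hypothetical stand-in for your non-serializable memcache client, and `LazyHolder` stands in for the `DoFn`:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical stand-in for a non-serializable client (sockets, caches, etc.).
class ExpensiveClient {
}

public class LazyHolder implements Serializable {
    // transient: excluded from Java serialization, so submitting the
    // pipeline does not try to serialize the client.
    private transient ExpensiveClient client;

    // Lazily create the client on first use (i.e. on the worker after
    // deserialization), instead of in the constructor.
    ExpensiveClient client() {
        if (client == null) {
            client = new ExpensiveClient();
        }
        return client;
    }

    public static void main(String[] args) throws Exception {
        LazyHolder holder = new LazyHolder();
        holder.client(); // force initialization before serializing

        // Round-trip through Java serialization, as Dataflow does with DoFns.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new ObjectOutputStream(bos).writeObject(holder);
        LazyHolder copy = (LazyHolder) new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray())).readObject();

        // The transient field did not survive the round-trip...
        if (copy.client != null) throw new AssertionError("expected null");
        // ...but lazy initialization recreates it on demand on the "worker".
        if (copy.client() == null) throw new AssertionError("expected client");
        System.out.println("ok");
    }
}
```

In a real `DoFn` the lazy accessor would typically be called from `processElement` (or the client created in a `@Setup`/`startBundle` method), so each worker builds its own client after deserialization.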