How do I launch a Cloud Dataflow job from a Google Cloud Function? I'd like to use Google Cloud Functions as a mechanism to enable cross-service composition.
I've included a very basic example based on the WordCount sample below. Please note that you'll need to include a copy of the java binary in your Cloud Function deployment, since the default Cloud Functions environment does not provide a Java runtime. Likewise, you'll need to package your pipeline's deploy jar with your Cloud Function as well.
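Here is a minimal sketch of that idea, not a drop-in implementation: a Node.js HTTP Cloud Function that shells out to the packaged JRE to submit the WordCount example. The JRE path, jar name, project ID, and bucket names are placeholders you would replace with your own.

```js
// Minimal sketch: launch the Dataflow WordCount example from a Cloud Function
// by spawning the java binary packaged with the deployment.
const { spawn } = require('child_process');

exports.startWordCount = (req, res) => {
  // 'jre1.8/bin/java' assumes a JRE was unzipped into the deployment package;
  // the jar name, project ID, and bucket paths below are placeholders.
  const job = spawn('jre1.8/bin/java', [
    '-cp', 'dataflow-wordcount-bundled.jar',
    'com.google.cloud.dataflow.examples.WordCount',
    '--project=my-project-id',
    '--runner=BlockingDataflowPipelineRunner', // blocks until the job finishes
    '--stagingLocation=gs://my-bucket/staging',
    '--inputFile=gs://dataflow-samples/shakespeare/kinglear.txt',
    '--output=gs://my-bucket/output/wordcount'
  ]);

  // Forward the launcher's output into the Cloud Function logs.
  job.stdout.on('data', (data) => console.log(`stdout: ${data}`));
  job.stderr.on('data', (data) => console.error(`stderr: ${data}`));

  // Respond once the launcher process exits.
  job.on('close', (code) => {
    res.status(code === 0 ? 200 : 500)
       .send(`Dataflow launcher exited with code ${code}`);
  });
};
```

Because the blocking runner keeps the function alive for the lifetime of the Dataflow job, it can run up against the Cloud Function's execution timeout for anything but small jobs, which motivates the enhancement below.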
You could further enhance this example by using the non-blocking runner and having the function return the Job ID, so that you can poll for job completion separately. This pattern should be valid for other SDKs as well, so long as their dependencies can be packaged into the Cloud Function.
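As a sketch of the polling side of that pattern, assuming the launcher was switched to a non-blocking runner and the job ID was captured from its output: the status check can go through the Dataflow REST API (v1b3) via the googleapis Node.js client. The project ID and function name here are illustrative.

```js
// Sketch: poll a Dataflow job's status by ID using the googleapis client
// with application-default credentials.
const { google } = require('googleapis');

exports.checkWordCountJob = async (req, res) => {
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  const dataflow = google.dataflow({ version: 'v1b3', auth });

  // 'my-project-id' is a placeholder; the caller supplies the job ID
  // returned by the launching function.
  const { data } = await dataflow.projects.jobs.get({
    projectId: 'my-project-id',
    jobId: req.query.jobId,
  });

  // currentState will be e.g. JOB_STATE_RUNNING or JOB_STATE_DONE.
  res.status(200).send(data.currentState);
};
```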