I don't want to write the processed KStream to another topic; I want to write the enriched KStream directly to a database. How should I proceed?
You can implement a custom Processor that opens a DB connection, and apply it via KStream#process(). Cf. https://docs.confluent.io/current/streams/developer-guide/dsl-api.html#applying-processors-and-transformers-processor-api-integration
Note, you will need to do sync writes into your DB to guard against data loss.
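As a minimal sketch (not a definitive implementation), such a processor could look like the class below. It uses the older Processor<K, V> interface, assumes String keys and values, and the JDBC URL, credentials, and enriched_events table are placeholders you would replace with your own:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

import org.apache.kafka.streams.processor.Processor;
import org.apache.kafka.streams.processor.ProcessorContext;

// Hypothetical sink processor that writes each enriched record into a database.
public class DbSinkProcessor implements Processor<String, String> {

    private Connection connection;
    private PreparedStatement insert;

    @Override
    public void init(final ProcessorContext context) {
        try {
            // Placeholder connection details -- adjust to your database.
            connection = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
            insert = connection.prepareStatement(
                "INSERT INTO enriched_events (event_key, event_value) VALUES (?, ?)");
        } catch (final SQLException e) {
            throw new RuntimeException("Could not open DB connection", e);
        }
    }

    @Override
    public void process(final String key, final String value) {
        try {
            insert.setString(1, key);
            insert.setString(2, value);
            // Synchronous write: executeUpdate() returns only after the database
            // has acknowledged the insert, which guards against data loss.
            insert.executeUpdate();
        } catch (final SQLException e) {
            throw new RuntimeException("DB write failed for key " + key, e);
        }
    }

    @Override
    public void close() {
        try {
            if (insert != null) insert.close();
            if (connection != null) connection.close();
        } catch (final SQLException e) {
            // best effort on shutdown
        }
    }
}
```

You would attach it to your enriched stream with enrichedStream.process(DbSinkProcessor::new);.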
Not writing back to a topic thus has multiple disadvantages: the synchronous per-record writes limit your throughput, and an unavailable database stalls or fails your Streams application.
Therefore, it's recommended to write the results back into a topic and use the Connect API to get the data into your database.
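With the recommended approach, the Streams application only writes to Kafka, and a Connect sink connector (for example the Confluent JDBC sink connector) moves the topic into the database asynchronously. A minimal sketch, again assuming String serdes and hypothetical topic names input-events and enriched-events:

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class EnrichAndForward {
    public static void main(final String[] args) {
        final StreamsBuilder builder = new StreamsBuilder();

        final KStream<String, String> enriched = builder
            .stream("input-events", Consumed.with(Serdes.String(), Serdes.String()))
            .mapValues(value -> value + "-enriched"); // placeholder enrichment step

        // Write the enriched result back to Kafka instead of the database ...
        enriched.to("enriched-events", Produced.with(Serdes.String(), Serdes.String()));
        // ... and let a Connect sink connector configured for topic
        // "enriched-events" write the records into the database.

        final Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "enrich-and-forward");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        final KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
    }
}
```

This keeps the Streams topology fast and fault-tolerant, while the connector handles retries and batching of the database writes.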