Is kafka consumer 0.9 backward compatible?

Posted 2019-02-21 18:13

Is the upcoming Kafka consumer 0.9.x going to be compatible with 0.8 brokers?

In other words, is it possible to switch only to the new consumer implementation, without touching anything else?

5 Answers
Anthone
2019-02-21 18:27

It looks like backward compatibility is built into Kafka 0.9.0. See http://kafka.apache.org/documentation.html#upgrade

Quoting the documentation:

0.9.0.0 has potential breaking changes (please review before upgrading) and an inter-broker protocol change from previous versions. For a rolling upgrade:

  • Update server.properties file on all brokers and add the following property: inter.broker.protocol.version=0.8.2.X
  • Upgrade the brokers. This can be done a broker at a time by simply bringing it down, updating the code, and restarting it.
  • Once the entire cluster is upgraded, bump the protocol version by editing inter.broker.protocol.version and setting it to 0.9.0.0.
  • Restart the brokers one by one for the new protocol version to take effect
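The two-phase protocol bump in the steps above amounts to editing one property in each broker's server.properties (a sketch; the version values follow the quoted upgrade notes):

```properties
# Phase 1: set on every broker BEFORE upgrading the binaries,
# so upgraded brokers keep speaking the old inter-broker protocol
inter.broker.protocol.version=0.8.2.X

# Phase 2: once ALL brokers run 0.9.0.0 code, bump the version
# and restart the brokers one at a time
inter.broker.protocol.version=0.9.0.0
```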
成全新的幸福
2019-02-21 18:29

I recently faced a similar issue: in my application I had to read from Kafka 0.9 and then write back to Kafka 0.8. I used the Kafka 0.9 client in the following way.

Consumer Config

    Properties props = new Properties();
    props.put("bootstrap.servers", "broker host:port pairs as comma-separated values");
    props.put("group.id", "your group id");
    props.put("key.deserializer", StringDeserializer.class.getName());
    props.put("value.deserializer", StringDeserializer.class.getName());
    props.put("enable.auto.commit", "true");
    props.put("auto.commit.interval.ms", 1000);
    props.put("session.timeout.ms", 30000);
    KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
    // subscribe() takes a collection of topic names, not a single String
    consumer.subscribe(Arrays.asList("topic1", "topic2"));
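The snippet stops at subscribe(); with the 0.9 consumer, records are only fetched once you drive a poll loop. A minimal sketch, assuming the String deserializers configured above (it cannot run without a reachable broker):

```java
// Minimal 0.9-consumer poll loop (sketch); runs until the thread is interrupted
while (true) {
    ConsumerRecords<String, String> records = consumer.poll(100); // timeout in ms
    for (ConsumerRecord<String, String> record : records) {
        System.out.printf("offset=%d, key=%s, value=%s%n",
                record.offset(), record.key(), record.value());
    }
}
```

With enable.auto.commit=true as configured above, offsets are committed periodically in the background, so the loop body does not need to commit explicitly.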

Producer Config

        // This uses the old (0.8, Scala) producer API, which can talk to 0.8 brokers.
        // New-producer properties (bootstrap.servers, acks, retries, batch.size,
        // linger.ms, buffer.memory) are ignored by this API and are omitted here.
        Properties props = new Properties();
        props.put("metadata.broker.list", "list of broker host:port pairs");
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        // old-producer equivalent of acks=all
        props.put("request.required.acks", "-1");
        ProducerConfig config = new ProducerConfig(props);
        Producer<String, String> producer = new Producer<String, String>(config);
        String message = "hello world";
        KeyedMessage<String, String> data = new KeyedMessage<String, String>(topic_name, message);
        producer.send(data);
        producer.close();

Hope this helps.
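Because the old (0.8) and new (0.9) producers use different property names, it is easy to mix them up as above. A small stdlib-only sketch of the main correspondences, compiled from the respective configuration docs (names not listed are unchanged or have no direct counterpart):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ProducerPropNames {
    // Old (0.8 Scala) producer property -> new (0.9 Java) producer equivalent
    public static Map<String, String> oldToNew() {
        Map<String, String> m = new LinkedHashMap<>();
        m.put("metadata.broker.list", "bootstrap.servers");
        m.put("serializer.class", "key.serializer / value.serializer");
        m.put("request.required.acks", "acks"); // -1 in the old producer == "all" in the new one
        return m;
    }

    public static void main(String[] args) {
        oldToNew().forEach((o, n) -> System.out.println(o + " -> " + n));
    }
}
```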

走好不送
2019-02-21 18:34

Based on the Consumer Client Re-design wiki page, which says:

This would involve some significant changes to the consumer APIs*, so we would like to collect feedback on the proposal from our community. Since the list of changes is not small, we would like to understand if some features are preferred over others, and more importantly, if some features are not required at all.

*Emphasis mine.

I didn't find anything specifically stating there is no compatibility. But given that quote, and the fact that the 0.8 producer was not compatible with the 0.7 producer, I'm assuming they are not compatible.

淡お忘
2019-02-21 18:40

No. In general it's recommended to upgrade brokers before clients, since brokers target backward compatibility. The 0.9 broker will work with both the 0.8 and 0.9 consumer APIs, but not the other way around.

男人必须洒脱
2019-02-21 18:43

According to the Kafka 0.9.0 documentation, you cannot use the new consumer to read data from 0.8.x brokers. The reason is the following:

0.9.0.0 has an inter-broker protocol change from previous versions.
