Kafka 2.1.0 Java Consumer vs Scala Consumer



























I would like to upgrade my team's Kafka clusters from version 0.10.1.1 to version 2.1.0. However, the official Kafka documentation says the following:

Note that the older Scala clients, which are no longer maintained, do not support the message format introduced in 0.11, so to avoid conversion costs (or to take advantage of exactly once semantics), the newer Java clients must be used.

I do not understand that sentence well. Our team currently uses a Kafka consumer application written in Scala. Should we rewrite it in Java? I do not know exactly what disadvantages keeping the Scala application would have.






























  • The Scala Kafka clients are no longer supported; you should use the Java implementation from your Scala code. – Yuval Itzchakov, Jan 3 at 8:53

  • Thanks for your comment. Does "no longer supported" mean "I can use it, but it will cause performance problems and other problems"? – Dogil, Jan 3 at 8:57

  • Using it means that if you update your Kafka brokers from 0.10.1.1 to 2.1.0, your clients will no longer "speak" the same protocol and will probably not work. – Yuval Itzchakov, Jan 3 at 8:58

  • @Dogil, you don't need to change your Scala code to Java. Since Scala runs on the JVM too, it is completely interoperable with Java. You only need to change the parts of your code that talk directly to Kafka to use the Java API instead (still in Scala code). – Luis Miguel Mejía Suárez, Jan 3 at 12:41

  • @Dogil I have no idea, sorry. I have not used Kafka; I just point out that you can call Java from Scala. – Luis Miguel Mejía Suárez, Jan 7 at 3:14




















Tags: java, scala, apache-kafka






edited Jan 3 at 8:58 by Mohamed Anees A
asked Jan 3 at 8:51 by Dogil




1 Answer
I think you're confusing the old Scala-based kafka.consumer and kafka.producer packages, which were in the Kafka core module, with the new kafka-clients dependency, which is implemented in Java.



If your imports are the following, you will be fine: you don't need to use different classes, and you might only need to rewrite a few method-call parameters after the upgrade.



org.apache.kafka.clients.consumer.KafkaConsumer
org.apache.kafka.clients.producer.KafkaProducer
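For illustration, here is a minimal sketch of a consumer written in Scala against the Java kafka-clients API. It assumes the kafka-clients 2.x jar is on the classpath; the broker address, group id, and topic name are hypothetical placeholders, and the collection converters shown are the Scala 2.13 ones (Scala 2.12 would use scala.collection.JavaConverters instead):

```scala
import java.time.Duration
import java.util.Properties
import scala.jdk.CollectionConverters._

// The Java client classes, used directly from Scala
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.StringDeserializer

object ConsumerSketch extends App {
  val props = new Properties()
  props.put("bootstrap.servers", "localhost:9092") // placeholder broker
  props.put("group.id", "my-group")                // placeholder group id
  props.put("key.deserializer", classOf[StringDeserializer].getName)
  props.put("value.deserializer", classOf[StringDeserializer].getName)

  val consumer = new KafkaConsumer[String, String](props)
  consumer.subscribe(List("my-topic").asJava)      // placeholder topic

  // poll(Duration) is the kafka-clients 2.x signature; the old
  // poll(Long) overload is deprecated
  val records = consumer.poll(Duration.ofSeconds(1))
  records.asScala.foreach(r => println(s"${r.key} -> ${r.value}"))
  consumer.close()
}
```

The point is that nothing here is Java source code: the Java client is just another JVM library that Scala can call, with only small conversions (asJava/asScala) at the boundary.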



should we turn this into Java? If we use the application written in Scala at present, I do not know exactly what disadvantages it can have




Java is more verbose and doesn't have as nice a type system as Scala. You're welcome to write the same code in Scala, Kotlin, Clojure, etc. At the end of the day, it all runs on the JVM.






  • Thanks @cricket_007. The sentence "The Scala consumers, which have been deprecated since 0.11.0.0, have been removed. The Java consumer has been the recommended option since 0.10.0.0. Note that the Scala consumers in 1.1.0 (and older) will continue to work even if the brokers are upgraded to 2.0.0." in the official Kafka documentation seems to mean that I should not use Scala. – Dogil, Jan 7 at 3:11

  • @Dogil No. It is saying not to use the client APIs based on Scala. You can use the new Java-based API in any JVM language. – cricket_007, Jan 7 at 3:41

  • For example, the Scala Kafka Streams API was added in Kafka 2.0, and it is actually recommended for those using Scala because it is easier to use than the Java API. – cricket_007, Jan 7 at 3:43

  • Thanks for the explanation. I get the picture now. – Dogil, Jan 7 at 3:57

  • Welcome. Feel free to use the checkmark next to the post to accept the answer. – cricket_007, Jan 7 at 4:01












edited Jan 7 at 3:42
answered Jan 3 at 17:06 by cricket_007












