Why do I need to create a Kafka Consumer to connect to Schema Registry?












Previous note: I am fairly new to Kafka.

I am trying to get all schemas from the Schema Registry, but I am not able to do so with only a schema registry client.
It only works if, prior to that, I instantiate a KafkaConsumer.

I can't understand why.
Here's the code (with the consumer in place).

ConsumerConfig is just a class with all the needed configuration, including the Schema Registry URL.

Consumer<String, String> consumer = new KafkaConsumer<String, String>(ConsumerConfig.get());
CachedSchemaRegistryClient client = new CachedSchemaRegistryClient(ConsumerConfig.getSchemaRegistryURL(), 30);
Collection<String> listOfSubjects = client.getAllSubjects();
consumer.close();

Without the consumer, I get:

io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException:
No content to map due to end-of-input

With the consumer, everything works fine.
I would appreciate it if someone could shed some light on why this happens, as I see no reason to connect to the actual Kafka cluster via a consumer in order to access the Schema Registry, which is on another endpoint.










  • Creating a consumer will not get you all the schemas, it'll only be possible to get the schemas for the topic you've assigned to the consumer

    – cricket_007
    Jan 3 at 17:10











  • Are there any errors in the schema registry log? Do you get a correct response when calling the schema registry with curl, e.g. curl http://localhost:8081/subjects?

    – belo
    Jan 3 at 21:23











  • @cricket_007 I am able to get every schema information without any further coding regarding consumer instantiation.

    – LeYAUable
    Jan 6 at 15:10











  • @belo Yes, I tested it in Postman and it works just fine.

    – LeYAUable
    Jan 6 at 15:10











  • Sorry, I don't understand. The Consumer doesn't communicate with the registry; the deserializer does. Again, the Consumer isn't needed to get the subjects or schemas (as shown below). Besides, you have not mentioned what version of the clients or Kafka you are using. If you think this is really an error, feel free to post the issue on the Schema Registry GitHub.

    – cricket_007
    Jan 7 at 3:00


















java apache-kafka confluent-schema-registry

asked Jan 2 at 17:33 by LeYAUable, edited Jan 3 at 17:10 by cricket_007

1 Answer
You don't have to create a KafkaConsumer instance at all. The two are totally independent.

If you just want to get all the subjects and schemas from the Schema Registry, just create an instance of CachedSchemaRegistryClient and call the related operation.

Here is a working example:

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

import java.util.Collection;

public class GetAllSubjects {

    protected static SchemaRegistryClient schemaRegistryClient;

    public static void main(String[] args) {
        String registryUrl = "http://localhost:8081";
        try {
            // The second argument is the client's cache capacity for schemas
            schemaRegistryClient = new CachedSchemaRegistryClient(registryUrl, 30);
            Collection<String> subjects = schemaRegistryClient.getAllSubjects();
            System.out.println(subjects);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
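The broker-independence of the client can also be demonstrated with MockSchemaRegistryClient, an in-memory implementation that ships with the same client library. The sketch below is for illustration only: the subject name and Avro schema are made up, and no broker or registry process is involved at any point.

```java
import io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
import org.apache.avro.Schema;

import java.util.Collection;

public class MockRegistryDemo {

    public static void main(String[] args) throws Exception {
        // In-memory registry from the same client library: no broker, no consumer involved.
        SchemaRegistryClient client = new MockSchemaRegistryClient();

        // Hypothetical subject and schema, registered purely for illustration.
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"User\","
                + "\"fields\":[{\"name\":\"id\",\"type\":\"long\"}]}");
        client.register("user-value", schema);

        // getAllSubjects() works the same way it does against a real registry.
        Collection<String> subjects = client.getAllSubjects();
        System.out.println(subjects);
    }
}
```

This is also a convenient way to unit-test code that takes a SchemaRegistryClient, since it exercises the same interface without any network access.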





  • Well, that was also my understanding, and it is what I was doing in the code I posted (if you remove the instantiation and closing of the KafkaConsumer). But I get the error I referred to.

    – LeYAUable
    Jan 3 at 10:25












answered Jan 2 at 21:10 by Nishu Tayal, edited Jan 3 at 17:11 by cricket_007