Read large files and send to a JMS queue using Spring Batch
I have a scenario where I need to read a huge file and send its records to a JMS queue for further processing.
The file can be in CSV or FIX format. I am planning to use Spring Batch for this.
I have read about the MultiResourcePartitioner for reading huge files with Spring Batch.
Is Spring Batch a good fit for this scenario, or should I use plain Java code to read the huge file? Is there a better approach?
spring-batch batch-processing
asked Jan 1 at 18:54
kattoor
1 Answer
I think Spring Batch is a good choice for your use case for a couple of reasons:
- You can use the FlatFileItemReader and JmsItemWriter out of the box (in comparison to writing this code yourself if you use plain Java); see the sketch after this list
- You will have several scaling options (see below)
- The chunk-oriented processing model is suitable for huge data sets like in your use case
- And many other features for free (transaction management, restartability, etc.)
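For illustration, here is a minimal sketch of such a chunk-oriented step in the Spring Batch 4 Java-config style. The Trade type, input path, column names and chunk size are illustrative assumptions, not taken from the question:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.batch.item.jms.JmsItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;
import org.springframework.jms.core.JmsTemplate;

// Minimal sketch: read a delimited file and push each record to a JMS queue.
// Trade is assumed to be a simple POJO with id/symbol/quantity/price fields plus getters and setters.
@Configuration
@EnableBatchProcessing
public class FileToJmsJobConfig {

    @Bean
    public FlatFileItemReader<Trade> tradeReader() {
        return new FlatFileItemReaderBuilder<Trade>()
                .name("tradeReader")
                .resource(new FileSystemResource("/data/trades.csv")) // assumed input location
                .delimited()
                .names(new String[] {"id", "symbol", "quantity", "price"}) // assumed column layout
                .targetType(Trade.class)
                .build();
    }

    @Bean
    public JmsItemWriter<Trade> tradeWriter(JmsTemplate jmsTemplate) {
        // The JmsTemplate must be configured with a default destination (the target queue).
        JmsItemWriter<Trade> writer = new JmsItemWriter<>();
        writer.setJmsTemplate(jmsTemplate);
        return writer;
    }

    @Bean
    public Step fileToJmsStep(StepBuilderFactory steps,
                              FlatFileItemReader<Trade> reader,
                              JmsItemWriter<Trade> writer) {
        return steps.get("fileToJmsStep")
                .<Trade, Trade>chunk(1000) // read and send 1000 records per transaction
                .reader(reader)
                .writer(writer)
                .build();
    }

    @Bean
    public Job fileToJmsJob(JobBuilderFactory jobs, Step fileToJmsStep) {
        return jobs.get("fileToJmsJob").start(fileToJmsStep).build();
    }
}

The chunk size controls how many records are read and sent per transaction, and is usually the first knob to tune for throughput.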
Physically partitioning the input file into multiple resources and using the MultiResourcePartitioner is indeed a good option for your use case (a sketch of this partitioned approach follows the list below). However, this is not the only way to scale a chunk-oriented step in Spring Batch; you can also use:
- A multi-threaded step, where each chunk is processed in a separate thread
- A combination of the AsyncItemProcessor and AsyncItemWriter (useful if you have some heavy processing to do on items before writing them to the queue)
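Here is a rough sketch of the partitioned approach, assuming the large file has already been split into part files. The file pattern and bean names are illustrative; workerStep would be a chunk-oriented step like the one sketched earlier, but using the step-scoped reader below:

import java.io.IOException;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.partition.support.MultiResourcePartitioner;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

// Sketch: one partition per part file, each handled by its own worker step execution.
@Bean
public Partitioner filePartitioner() throws IOException {
    MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
    partitioner.setResources(new PathMatchingResourcePatternResolver()
            .getResources("file:/data/parts/trades-*.csv")); // assumed location of the split files
    return partitioner;
}

@Bean
@StepScope
public FlatFileItemReader<Trade> partitionedReader(
        @Value("#{stepExecutionContext['fileName']}") Resource resource) {
    // MultiResourcePartitioner stores each partition's resource URL under the "fileName" key.
    return new FlatFileItemReaderBuilder<Trade>()
            .name("partitionedReader")
            .resource(resource)
            .delimited()
            .names(new String[] {"id", "symbol", "quantity", "price"}) // assumed column layout
            .targetType(Trade.class)
            .build();
}

@Bean
public Step masterStep(StepBuilderFactory steps, Partitioner filePartitioner, Step workerStep) {
    return steps.get("masterStep")
            .partitioner("workerStep", filePartitioner)
            .step(workerStep) // the chunk step from the earlier sketch, wired to partitionedReader
            .taskExecutor(new SimpleAsyncTaskExecutor()) // run partitions in parallel threads
            .build();
}

By contrast, a multi-threaded step needs no partitioner at all: adding .taskExecutor(...) to the chunk step shown earlier lets several chunks run concurrently, though note that FlatFileItemReader is not thread-safe, so that option needs extra care.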
All three of these scaling techniques run within a single JVM. There are other options to scale a batch job across multiple JVMs, such as remote chunking and remote partitioning (but those are not required IMO for your use case).
You can find a talk about all these scaling techniques with code examples here: https://www.youtube.com/watch?v=J6IPlfm7N6w
Hope this helps.
answered Jan 1 at 23:51
Mahmoud Ben Hassine
Appreciate your quick response. Let me try using the same approach!
– kattoor
Jan 2 at 16:10