How to read each row and insert it into a database, while removing unwanted white space and multiple commas
I've gone through the csv-parser (https://www.npmjs.com/package/csv-parse) documentation but haven't been able to find any solution; I've also read the whole README but still haven't found anything related to this situation.
[ Row {
    bankcustomers_id: '154491',
    customerid: ' 154491',
    title: 'MR ',
    firstname: 'Santa ',
    lastname: 'Clause ',
    dob: '25-Dec-30',
    mobileno: '07900 1234567 ',
    emailid: 'santa-clause@northpole ' } ]
The above is the output produced by the CSV parser. My problem is: is there any provision to remove the white space at the start and end of each value, as well as consecutive commas (,,)? And how can I get a single column value? Can anybody help me remove all of this?
Moreover, I want to insert each row into a database, so how can I read each row in this function?
.on('data', (data) => {
  // console.log(data)
  results.push(data)
})
node.js csv express
1 Answer
Have you tried the ltrim and rtrim options? These will trim whitespace at either end of each field (if the field is not quoted).
const parse = require('csv-parse')
const input = `bankcustomers_id,customerid,title,firstname,lastname,dob,mobileno,emailid
154491, 154491,MR ,Santa ,Clause ,25-Dec-30,07900 1234567 ,santa-clause@northpole `;
const options = { delimiter: ",", cast: true, columns: true, ltrim: true, rtrim: true };
parse(input, options, function(err, output){
  console.log(output);
});
The output I get is:
[ { bankcustomers_id: 154491,
customerid: 154491,
title: 'MR',
firstname: 'Santa',
lastname: 'Clause',
dob: '25-Dec-30',
mobileno: '07900 1234567',
emailid: 'santa-clause@northpole'
} ]
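For the database-insertion part of the question, the same trim options also work with the parser's stream interface, so each row can be handled inside the .on('data', ...) handler as it is parsed. The following is only a rough sketch, not code from the original answer: insertCustomer is a hypothetical stand-in for whatever database call you actually use, and since each row arrives as a plain object keyed by the header names, a single column is read with e.g. row.firstname.

const fs = require('fs');
const parse = require('csv-parse');

fs.createReadStream('customers.csv')
  // columns: true keys each row by the header line; ltrim/rtrim strip the stray spaces
  .pipe(parse({ delimiter: ',', columns: true, ltrim: true, rtrim: true }))
  .on('data', (row) => {
    // e.g. row.firstname === 'Santa', row.mobileno === '07900 1234567'
    insertCustomer(row); // hypothetical database insert
  })
  .on('end', () => console.log('import finished'))
  .on('error', (err) => console.error(err));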
Thanks for your reply, I will try it tomorrow. I have one more concern: will this parser handle lakhs of records? If yes, what configuration do I need to do?
– bipin
Jan 3 at 17:38
Cool, by lack of records, do you mean empty records? Like ,,,,,,?
– Terry Lennox
Jan 3 at 18:36
Thanks for the help. My question is: will this parser handle huge CSV files? If yes, what extra things do I need to do?
– bipin
Jan 4 at 3:52
This parser will definitely handle large CSV files; it is used for this purpose by a lot of people! The stream API will ensure scalability, see the docs at csv.js.org/parse/api.
– Terry Lennox
Jan 4 at 8:22
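As a further sketch (again an assumption, not code from this thread): on recent Node versions the parser stream can also be consumed with for await...of, which applies backpressure automatically, so even files with lakhs of records are processed one row at a time instead of being buffered in memory. insertCustomer is the same hypothetical database helper as above.

const fs = require('fs');
const parse = require('csv-parse');

async function importCsv(path) {
  const parser = fs.createReadStream(path)
    .pipe(parse({ columns: true, ltrim: true, rtrim: true }));

  // Awaiting each insert before pulling the next row keeps memory usage flat.
  for await (const row of parser) {
    await insertCustomer(row); // hypothetical async database insert
  }
}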