Reading a large CSV file that contains commas with PHP

I have a CSV file that I need to read with PHP. I have used two methods, but there is a problem with each of them.

The methods I used are:
1. file_get_contents
2. fgetcsv

Let me explain the CSV file: the problem is that some fields contain commas, and since the comma is also the field delimiter, this breaks the parsing.
The first method is fast, but commas inside fields make it split incorrectly, for example the thousands separator in 14,200. I fixed that case with a function named fixnumbers, but there is still free-form text that contains commas and doesn't follow any rule I could correct for.
The second method is very slow on a large CSV, and I can't get any output to confirm that it is working.
The code for the first method is:



$myFile = file_get_contents($file);
$lines  = explode("\r\n", $myFile); // split the whole file into an array of lines

$counter     = 0;
$datacounter = 0;
while ($counter < count($lines)) {
    $data = $lines[$counter];

    $tmp = fixnumbers($data);          // custom helper that patches numbers such as 14,200
    $tmp = str_replace('"', '', $tmp); // eregi_replace() was removed in PHP 7
    $tmp = explode(',', $tmp);         // breaks whenever a quoted field itself contains a comma

    if (count($tmp) > 0) {
        $newdata[$datacounter] = $tmp;
        $datacounter++;
    }
    $counter++;
}
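
Splitting each line on "," by hand cannot tell the delimiter apart from a comma inside a quoted field, which is exactly the free-text case described above. A minimal sketch that keeps the read-the-whole-file approach but lets PHP's own CSV parser handle the quoting (the str_getcsv route that also comes up later in the comments); it assumes the file fits in memory and that no quoted field spans multiple lines:

<?php
// Sketch only: parse each line with str_getcsv() so quoted fields that
// contain commas (e.g. "text, with a comma") are not split apart.
// $file is the CSV path, as in the snippets above.
$rows = array();
foreach (file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    $rows[] = str_getcsv($line); // respects CSV quoting, unlike explode(',')
}
// Equivalent one-liner the asker later reports using:
// $rows = array_map('str_getcsv', file($file));
echo count($rows), " rows parsed\n";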


The code for the second method is:



$handle = fopen($file, "r");
$row    = 1;
$mydata = array();
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    $num = count($data);

    for ($c = 0; $c < $num; $c++) {
        $mydata[$row][$c] = $data[$c] . "<br />\n";
    }
    $row++;
}

print '<div class="longList"><pre>';
print_r($mydata);
print "</pre></div>";
fclose($handle); // close the handle returned by fopen(), not the file name
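
A likely reason this version feels slow is not fgetcsv() itself but what the loop does with every row: each field gets "<br />\n" appended, the whole file is buffered in $mydata, and print_r() only dumps it at the very end, so nothing shows up until the script finishes (or hits max_execution_time / memory_limit first). A minimal streaming sketch, not part of the original post, that handles each record as it is read and keeps memory flat; handleRecord() is a hypothetical stand-in for whatever the real script does with a record:

<?php
// Sketch only: stream the CSV row by row instead of buffering everything.
// $file is the CSV path as above; handleRecord() is a hypothetical callback.
set_time_limit(0);                 // large files can exceed the default 30 s limit
if (($handle = fopen($file, "r")) !== FALSE) {
    $rowNumber = 0;
    while (($record = fgetcsv($handle, 0, ",")) !== FALSE) { // length 0 = no line limit
        $rowNumber++;
        handleRecord($rowNumber, $record);
    }
    fclose($handle);
    echo "processed $rowNumber rows\n";
}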
php csv






edited Jan 1 at 16:36 by Funk Forty Niner
asked Jan 1 at 16:33 by hsoft
  • Can you post a few lines of your csv file? If a numeric field has a comma in it, then it should be a text field instead and quoted, which fgetcsv should handle properly.

    – ivanivan
    Jan 1 at 17:22

  • Here is a sample file, but it's not in English: rapidgator.net/file/ef4f99844b702a8fa3871cc7acaee4af/…

    – hsoft
    Jan 1 at 18:14

  • And fgetcsv is much slower than file_get_contents in my experience.

    – hsoft
    Jan 1 at 18:20

  • In the few lines I looked at, it seemed like the quotes were properly applied. fgetcsv should be able to deal with that for you.

    – ivanivan
    Jan 1 at 18:24

  • I ran the second code with fgetcsv on this file, but it takes more than 10 minutes and there is still no output, or I hit the max PHP execution time error.

    – hsoft
    Jan 1 at 18:39
1 Answer






So I waited the minute to download the file, grabbed the first 5 records, and used a copy/paste of the fgetcsv example in the PHP manual.



First 5 records - https://termbin.com/23ti - saved as "sm_file.csv"



<?php
if (($handle = fopen("sm_file.csv", "r")) !== FALSE) {
    $data = array();
    $num  = 0;
    while (($row = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $data[$num] = $row;   // collect each parsed record
        $num++;
    }
    fclose($handle);
    print_r($data);
}
?>



[0] => Array
(
[0] => از تاريخ وصل 01/07/1397 - با برنامه
[1] => تاريخ گزارش: 29/09/1397
[2] => شماره گزارش: (3-5)
[3] => صفحه 1
[4] => گزارش قطع و وصل فيدرهاي فشار متوسط (نمونه 3)
[5] => ملاحظات
[6] => شرايط جوي
[7] => عملكرد ريكلوزر
[8] => رله عامل
[9] => خاموشي (MWh)
[10] => بار فيدر (A)
[11] => مدت قطع
[12] => زمان وصل
[13] => تاريخ وصل
[14] => زمان قطع
[15] => تاريخ قطع
[16] => نوع اشكال بوجود آمده
[17] => فيدر فشار متوسط
[18] => پست فوق توزيع
[19] => شماره پرونده
[20] => رديف
[21] => ناحيه اسالم
[22] =>
[23] => آفتابي
[24] => ندارد
[25] => ندارد
[26] => 0.21
[27] => 3
[28] => 132
[29] => 11:30
[30] => 1397/07/04
[31] => 09:18
[32] => 1397/07/04
[33] => جهت كار در حريم شبكه
[34] => گيسوم
[35] => اسا لم
[36] => 96,042,429,972
[37] => 1
[38] => 61292.56
[39] => جمع کل بار فيدر:
[40] => 393.85
[41] => جمع کل خاموشي:
[42] => 92,725
[43] => جمع مدت قطع:
)


Looks like data element 36 is the one you are having issues with. As you can see, fgetcsv handles it fine; you just need to convert it from a string to a number as you process the data. Just strip the commas.



<?php
if (($handle = fopen("sm_file.csv", "r")) !== FALSE) {
    $data = array();
    $num  = 0;
    while (($row = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // strip the thousands separators from field 36 before storing the record
        $row[36] = str_replace(",", "", $row[36]);
        $data[$num] = $row;
        $num++;
    }
    fclose($handle);
    print_r($data);
}
?>


Which gives



[36] => 96042429972


As for how long it takes, your full file of 2k records



User time (seconds): 0.12
System time (seconds): 0.09
Percent of CPU this job got: 43%
Elapsed (wall clock) time (h:mm:ss or m:ss): 0:00.52
Average shared text size (kbytes): 0
Average unshared data size (kbytes): 0
Average stack size (kbytes): 0
Average total size (kbytes): 0
Maximum resident set size (kbytes): 41820
Average resident set size (kbytes): 0
Major (requiring I/O) page faults: 0
Minor (reclaiming a frame) page faults: 2448
Voluntary context switches: 18
Involuntary context switches: 55
Swaps: 0
File system inputs: 0
File system outputs: 0
Socket messages sent: 0
Socket messages received: 0
Signals delivered: 0
Page size (bytes): 4096
Exit status: 0


on a modest i5 w/ 8gb ram. Not seeing any issues.
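
If more columns than element 36 use comma grouping (the sample output above also shows 92,725 in element 42), the same str_replace idea can be generalised. A small sketch, not part of the original answer, that strips the separators from any field matching a digits-grouped-by-commas pattern and leaves ordinary text alone:

<?php
// Sketch only: normalise every field that looks like a comma-grouped number
// (e.g. "96,042,429,972" or "92,725") while leaving free text untouched.
function normalizeNumbers(array $record) {
    foreach ($record as $i => $field) {
        if (preg_match('/^\d{1,3}(,\d{3})+$/', (string) $field)) {
            $record[$i] = str_replace(',', '', $field);
        }
    }
    return $record;
}

if (($handle = fopen("sm_file.csv", "r")) !== FALSE) {
    while (($row = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $row = normalizeNumbers($row);
        // ...use $row here...
    }
    fclose($handle);
}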






answered Jan 1 at 19:32 by ivanivan
  • Thanks for the reply. I solved the numbers-with-commas problem, but there is some text with commas that I cannot handle with the file_get_contents method. I also don't know why fgetcsv takes so long before I get any output. Lately I have been using $array = array_map('str_getcsv', file($file)); and it seems to work fine, but I need to test it on many CSVs to be sure. Anyway, thank you again.

    – hsoft
    Jan 2 at 3:08

  • I modified the script and ran it with about 1,500 records, plus inserting into a MySQL database; it takes about 2 minutes. I'm worried about CSVs with more than 10,000 records. My laptop is an i5 with 4 GB of memory, and I increased the PHP memory limit to 512 MB. Can you suggest a way to decrease the run time, or to do it in pieces without splitting the CSV manually? The work PC where I want to run this script is only dual core, but the run time is the same as on my laptop, so I don't think it is CPU-bound. I just don't know how to speed up the processing and the database inserts (see the sketch after these comments).

    – hsoft
    Jan 6 at 9:54

  • If I print the output, it takes even longer.

    – hsoft
    Jan 6 at 10:01
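
On the database-speed follow-up above: in a row-by-row import the usual bottleneck is committing one INSERT per record rather than the CSV parsing itself. A minimal PDO sketch, based on assumptions rather than anything in the original post, that reuses a prepared statement and commits once per file; the DSN, credentials, table name my_table, its columns and the field indexes are all placeholders:

<?php
// Sketch only: batch the inserts into one transaction with a prepared statement.
// DSN, credentials, table name, columns and field indexes are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass',
               array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));
$stmt = $pdo->prepare('INSERT INTO my_table (col_a, col_b, col_c) VALUES (?, ?, ?)');

if (($handle = fopen($file, "r")) !== FALSE) {
    $pdo->beginTransaction();                     // one commit for the whole file
    while (($row = fgetcsv($handle, 0, ",")) !== FALSE) {
        $stmt->execute(array($row[0], $row[1], $row[2]));
    }
    $pdo->commit();
    fclose($handle);
}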










