PHP memory limit with SQL result





I'm facing a problem that shouldn't be hard to solve, but somehow I can't crack it. Here is the scenario:

I want to fetch the results of a SQL statement into a PHP array; seems simple, right?

The problem is that it fails with a memory error: Fatal error: Out of memory (allocated 408944640) (tried to allocate 805306376 bytes).

Here is my code, which works with statements that return fewer results (the statement I need to use returns roughly 4,000 records).



$connect = @new mysqli($host, $db_user, $db_password, $db_name);

if ($connect->connect_errno != 0) {
    echo "Error: " . $connect->connect_errno . " Why: " . $connect->connect_error;
} else {
    $query = "SELECT meta_value, post_date
              FROM wp_postmeta
              INNER JOIN wp_posts ON wp_posts.ID = wp_postmeta.post_id
              WHERE wp_posts.post_type = 'shop_order'
                AND wp_postmeta.meta_key = '_order_total'";
    $sql_query = mysqli_query($connect, $query);
    if ($sql_query->num_rows > 0) {
        while (($row = mysqli_fetch_assoc($sql_query)) !== false) {
            $result = $row;
        }
    } else {
        echo "0 results";
    }
}

echo json_encode($result);


Not really sure why it uses 0.8 GB of memory; that seems like a lot for a simple statement. To give it a try, I changed memory_limit in php.ini to memory_limit=8192M, but it didn't help at all (I checked that the new memory_limit was applied by calling phpinfo()).



Thank you guys!










  • It is not the statement, it is the data from the table stored in array $result
    – Quasimodo's clone, Jan 3 at 13:01

  • Do you really need an INNER JOIN for this?
    – Gobbin, Jan 3 at 13:01

  • Your mysqli_fetch_assoc is already an array. So why not just echo that and skip the while loop?
    – Gobbin, Jan 3 at 13:03

  • Do you really need those 4000 rows at the same time?
    – jarlh, Jan 3 at 13:03

  • What aim do you really want to achieve? This might be an XY problem.
    – Quasimodo's clone, Jan 3 at 13:09














php sql






asked Jan 3 at 12:58









Sinner 295
2 Answers
Do not hold full result tables in memory, or your server becomes an easy target for DoS attacks; it has to be able to answer many requests simultaneously. Process one row at a time and release the allocated memory.

You should output the JSON chunks directly, comma-separated.



echo '[';

$sep = '';
while (($row = mysqli_fetch_assoc($sql_query)) !== null) {
    echo $sep . json_encode($row);
    $sep = ',';
}

echo ']';
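One more way to keep memory flat (not from the original answer, but standard mysqli): by default mysqli_query buffers the entire result set client-side, which is where the huge allocation occurs. The MYSQLI_USE_RESULT mode streams rows from the server instead. A minimal sketch, assuming the $connect and $query variables from the question:

```php
<?php
// Sketch, assuming $connect and $query from the question.
// MYSQLI_USE_RESULT streams rows from the server instead of buffering the
// whole result set in client memory (the default, MYSQLI_STORE_RESULT).
$sql_query = mysqli_query($connect, $query, MYSQLI_USE_RESULT);

echo '[';
$sep = '';
while (($row = mysqli_fetch_assoc($sql_query)) !== null) {
    echo $sep . json_encode($row);
    $sep = ',';
}
echo ']';

mysqli_free_result($sql_query); // required before issuing further queries
```

Note that $sql_query->num_rows is not available in unbuffered mode until all rows have been read, so the num_rows check from the question has to go.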


Alternatively you could cache it in a temporary file.



On newer MySQL versions you could try:



SELECT JSON_OBJECT('meta_value', meta_value, 'post_date', post_date)
FROM wp_postmeta
INNER JOIN wp_posts ON wp_posts.ID=wp_postmeta.post_id
WHERE wp_posts.post_type ='shop_order'
AND wp_postmeta.meta_key = '_order_total'
INTO OUTFILE '/tmp/mysql-temp-GENERATED_UUID';


This generates a file with one JSON object per line, but not comma-separated.
As documented on mysqlserverteam.com, the aggregate function JSON_ARRAYAGG should fix that. (My installation unfortunately has third-party repository conflicts, so I cannot test it.)



SELECT JSON_ARRAYAGG(JSON_OBJECT('meta_value', meta_value, 'post_date', post_date))
FROM wp_postmeta
INNER JOIN wp_posts ON wp_posts.ID=wp_postmeta.post_id
WHERE wp_posts.post_type ='shop_order'
AND wp_postmeta.meta_key = '_order_total'
INTO OUTFILE '/tmp/mysql-temp-GENERATED_UUID';


If JSON_ARRAYAGG causes problems for some reason, a workaround (at least on unix-like systems) would be:



<?php
$guid = bin2hex(openssl_random_pseudo_bytes(16));
$tmp_filename = "/tmp/mysql-json-{$guid}.json";
/* let MySQL generate the tempfile, then wrap its lines into a JSON array */

passthru('sed \'1 s/^/[/; $! s/$/,/; $ s/$/]/\' ' . escapeshellarg($tmp_filename));
?>
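Alternatively (a sketch, assuming MySQL 5.7.22+ where JSON_ARRAYAGG is available), the server can build the complete JSON array and hand it back as a single string, skipping both INTO OUTFILE and the sed post-processing. This reuses the $connect handle from the question:

```php
<?php
// Sketch, assuming MySQL >= 5.7.22 and the $connect handle from the question.
// The server aggregates the whole JSON array; PHP only ever holds one string.
$query = "SELECT JSON_ARRAYAGG(
              JSON_OBJECT('meta_value', meta_value, 'post_date', post_date)
          ) AS json
          FROM wp_postmeta
          INNER JOIN wp_posts ON wp_posts.ID = wp_postmeta.post_id
          WHERE wp_posts.post_type = 'shop_order'
            AND wp_postmeta.meta_key = '_order_total'";

$res = mysqli_query($connect, $query);
$row = mysqli_fetch_assoc($res);
echo $row['json']; // already valid JSON
```

The string still has to fit in memory once, but without the per-row PHP array overhead that blew up the original script.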





answered Jan 3 at 13:29, edited Jan 4 at 19:52 – Quasimodo's clone
  • result is [{},{}{}]
    – Artem Ilchenko, Jan 3 at 13:31

  • now [{},{},] :)
    – Artem Ilchenko, Jan 3 at 13:33

  • Yes, edited. Should sleep before coding on the fly. :)
    – Quasimodo's clone, Jan 3 at 13:33

  • No, $sep is the prefix which is set after the first iteration.
    – Quasimodo's clone, Jan 3 at 13:34

  • you're right :)
    – Artem Ilchenko, Jan 3 at 13:35



















Why not use WordPress to query and paginate, like this:



$args = array(
    "post_type"      => "shop_order",
    "posts_per_page" => 1000,
    "post_status"    => array("any"),
    "paged"          => $paged, // pass your pagination here from javascript
    "meta_query"     => array(
        array( // meta_query takes an array of clause arrays
            "key"     => "_order_total",
            "compare" => "EXISTS"
        )
    )
);
$custom_posts = get_posts($args);

echo json_encode($custom_posts);
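To illustrate how a client could consume those pages (a sketch with a hypothetical endpoint URL, not part of the original answer): one request per page keeps each request well under the memory limit, and the loop stops when a page comes back empty.

```php
<?php
// Sketch with a hypothetical endpoint URL; adjust to your own setup.
// Each request returns at most 1000 posts (one "paged" chunk), so no
// single request comes anywhere near memory_limit.
$page = 1;
do {
    $json  = file_get_contents("https://example.com/orders.php?paged={$page}");
    $posts = json_decode($json, true);
    // ...feed $posts into the chart, then let them go out of scope...
    $page++;
} while (!empty($posts));
```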





answered Jan 3 at 13:26, edited Jan 3 at 13:28 – Tim Hallman
  • The user needs the result to create a chart. Paginating the result is not what he needs.
    – Patrick Simard, Jan 3 at 13:28

  • I just need a simple application to analyse data and show it on charts, so there is no point in using WordPress, I think.
    – Sinner, Jan 3 at 13:29

  • Sure it is. He's OOM. Paginating will allow him to execute his query in chunks without running out of memory.
    – Tim Hallman, Jan 3 at 13:29

  • I would then use AJAX or the API to make the queries and populate your chart. This would allow you to process an infinite amount of records without running out of memory.
    – Tim Hallman, Jan 3 at 13:32

  • @TimHallman That might be a good idea, I might try it, but I have to read more about the AJAX and API solution.
    – Sinner, Jan 3 at 13:41











