I wrote some code to update my database from a CSV file, but the file is too large for the server to handle in one request, so it exceeds the maximum execution time. My question is: how can I process the file 1,000 lines at a time, post each batch to the server, and then repeat the same step with a function so the default time limit is never exceeded? For example, my file has 20,000 records. I know there are other methods, like the set_time_limit() function or changing the time limit in php.ini, but I'm not going to do that, as it would mess up the server. Can any expert here help me out? Thanks for looking into this for me.
Here is my original code that results in a Max Time Limit fatal error:
$query = "UPDATE testdb SET user='".$user."', userid='".$userid."', pwd='".$pwd."',
email='".$email."' WHERE user='".$user."' LIMIT 1";
echo $query."<hr>";
mysql_query($query) or die(mysql_error());
The only way of splitting this process up I could think of is by using multiple AJAX calls. But if somebody has a better idea, I would sure like to know, because I need something similar…
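Something along these lines on the PHP side, for example: each AJAX call hits the script with an offset, the script processes the next chunk and replies with where it stopped so the next call can continue. This is only a rough sketch, not tested code, and the file name, chunk size and CSV layout are placeholders.
<?php
// Each AJAX call processes one chunk of the CSV, starting at the row offset
// passed in, and returns where it stopped as JSON for the next call.
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$chunkSize = 1000; // placeholder chunk size

$handle = fopen("test.csv", "r");
// Skip the rows handled by earlier calls
for ($i = 0; $i < $offset; $i++) {
    fgetcsv($handle, 20000, ",");
}

$processed = 0;
while ($processed < $chunkSize && ($data = fgetcsv($handle, 20000, ",")) !== FALSE) {
    // ... run your UPDATE for $data here ...
    $processed++;
}
fclose($handle);

// Tell the caller how far we got; "done" when the chunk came back short
echo json_encode(array(
    'next' => $offset + $processed,
    'done' => $processed < $chunkSize,
));
?>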
Try this. This script reads your CSV data and, every 10 rows, redirects to itself, passing along the row it reached so you won't get timed out. It's not fully tested, it's just an idea. Try it and tell me if it works.
<?php
set_time_limit(1); // deliberately tight so the chunk-and-redirect approach is exercised

$handle = fopen("test.csv", "r");
fgetcsv($handle, 20000, ","); // skip the header row

$row = !isset($_GET['row']) ? 1 : (int) $_GET['row'];
$maximum_results_in_DB = 1419; // the number of rows you exported into your CSV

// Skip the rows already handled on previous requests
for ($i = 1; $i < $row; $i++) {
    fgetcsv($handle, 20000, ",");
}

$file = fopen("file.txt", "a");

while (($data = fgetcsv($handle, 20000, ",")) !== FALSE)
{
    $row++;
    fwrite($file, $data[0] . "\n");

    if ($row % 10 == 0 && $row <= $maximum_results_in_DB)
    {
        fwrite($file, "end of the line baby\n");
        fclose($file);
        fclose($handle);
        header("Location: index.php?row=" . $row);
        exit; // stop here; the redirected request picks up where we left off
    }
}
fclose($file);
fclose($handle);
?>
I tried to implement your code into mine, but it doesn't give the result I want. What I really want is code that loops through every 1,000 entries, updates the database, then continues with the next 1,000 entries, and so on for the rest of the records. It should read from the CSV file and then do an UPDATE to MySQL. Any idea? Thanks.
Here is my code:
<?php
set_time_limit(1);
$handle = fopen("test.csv", "r");
fgetcsv($handle, 20000, ","); // skip the header row
$row = !isset($_GET['row']) ? 1 : $_GET['row'];
$maximum_results_in_DB = 10000; // the number of rows you exported into your CSV
while (($data = fgetcsv($handle, 20000, ",")) !== FALSE)
{
    $serial = str_replace(",", "", $data[0]);
    $uid = str_replace(",", "", $data[1]);
    $reloadDate = str_replace(",", "", $data[2]);
    $status = str_replace(",", "", $data[3]);
    $row++;
    if( $row%10 == 0 && $row <= $maximum_results_in_DB )
    {
        $query = "UPDATE testdb SET user='".$user."', userid='".$userid."', pwd='".$pwd."',
        email='".$email."' WHERE user='".$user."' LIMIT 1";
        echo $query."<hr>";
        mysql_query($query) or die(mysql_error());
    }
}
fclose($handle);
?>
Thank you, actionAction, for your suggestion. It's hard for me to break a large CSV into smaller chunks, so I made a modification to my previous code by setting the time limit to 1000 instead of 1, and it can now update some of the records to the database. For example:
set_time_limit(1000);
Can anyone here briefly explain to me what these two lines mean:
$row = !isset($_GET['row']) ? 1 : $_GET['row'];
and also
$row%10 == 0 && $row <= $maximum_results_in_DB
Thanks. If there is any example code, that would be really helpful
$row = !isset($_GET['row']) ? 1 : $_GET['row'];
It's the ternary operator, a shorthand for an if/else conditional. This is the same as:
if(!isset($_GET['row']))//if no 'row' parameter was passed in the URL
{
$row = 1;
}else{
$row = $_GET['row'];
}
$row%10 == 0 && $row <= $maximum_results_in_DB
This is saying: if there is no remainder when dividing the $row variable by 10 (i.e. it is a multiple of ten) AND the value is less than or equal to the $maximum_results_in_DB variable, do whatever action follows.
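Since you asked for example code: here is a tiny standalone demo of both lines together, so you can see exactly when that condition fires. The 50 is just an arbitrary maximum for the demo.
<?php
// Standalone demo of the two lines explained above.
$maximum_results_in_DB = 50; // arbitrary number, just for the demo
$row = !isset($_GET['row']) ? 1 : (int) $_GET['row']; // start at 1 unless ?row= is given

for (; $row <= $maximum_results_in_DB; $row++) {
    if ($row % 10 == 0 && $row <= $maximum_results_in_DB) {
        echo "batch boundary at row " . $row . "\n"; // prints 10, 20, 30, 40, 50
    }
}
?>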
OK, here I rewrote the code to include the sleep() function. All the records can be displayed without any timeout, but when they are updated to MySQL it takes too long and results in a Max time limit error, no matter how high I set the limit. Any idea how to solve it? Thanks.
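For reference, the general shape of that rewrite is something like this, heavily simplified; the table and column names here are placeholders, not the real schema.
<?php
// Rough outline of the sleep() version: run the UPDATEs, pause after every
// 1000 rows, then carry on. Table and column names are placeholders only.
set_time_limit(1000);

$handle = fopen("test.csv", "r");
fgetcsv($handle, 20000, ","); // skip the header row

$row = 0;
while (($data = fgetcsv($handle, 20000, ",")) !== FALSE) {
    $serial     = $data[0];
    $uid        = $data[1];
    $reloadDate = $data[2];
    $status     = $data[3];

    $query = "UPDATE testdb SET reloadDate='".$reloadDate."', status='".$status."'
              WHERE serial='".$serial."' AND uid='".$uid."' LIMIT 1";
    mysql_query($query) or die(mysql_error());

    $row++;
    if ($row % 1000 == 0) {
        sleep(1); // pause after every 1000 updates before the next batch
    }
}
fclose($handle);
?>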