I’m trying to split a file with PHP and insert it into a LONGBLOB column in chunks.
Is there some kind of MySQL limit I’m hitting? (The chunking should get around the max_allowed_packet problem.)
For some reason it doesn’t work with larger files (roughly 10 MB or more).
What should I do!?
$docdata = $_FILES['docdata']['tmp_name']; // temp path of the uploaded file
$docsize = filesize($docdata);
$data = addslashes(file_get_contents($docdata)); // <- RAW DATA, escaped
$oneMeg = 1040000; // roughly
if ($docsize > $oneMeg) {
    $numMegs = ceil($docsize / $oneMeg);  // the uppermost number of megs in this file
    $numChars = strlen($data);            // the number of characters in the escaped data string
    $len = floor($numChars / $numMegs);   // how many characters per section
    $fd_temp = $data;                     // working copy
    $section = array();                   // holds the sections
    $i = 0;
    while (strlen($fd_temp) > 0) {
        $section[$i] = substr($fd_temp, 0, $len); // takes the first $len characters
        $fd_temp = substr($fd_temp, $len);        // shortens it by $len characters for the next loop
        $i++; // increment the array index
    }
    $query = "INSERT INTO `documents` (`docname`,`docclass`,`doctype`,`docdata`,`docsize`) VALUES ('$docname','$docclass','$doctype','{$section[0]}','$docsize')"; // $docname, $docclass, $doctype are defined earlier
    $result = mysql_query($query);
    $theID = mysql_insert_id();
    if (!$result) {
        $err[] = "Error: the first part of the file could not be inserted.";
    }
    else if (!$theID) {
        $err[] = "Error: the last insert ID could not be retrieved.";
    }
    else { // only run the next parts if the first part went in OK
        for ($i = 1; $i < count($section); $i++) {
            $query = "UPDATE `documents` SET `docdata`=CONCAT(`docdata`,'{$section[$i]}') WHERE `docid`=$theID";
            $result = mysql_query($query);
            if (!$result) {
                $err[] = "Error inserting a part of the file. Please delete the incomplete file and try again.";
                break;
            } // end if
        } // end for
    } // end else
} // end if (large files)
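In case the chunk math itself is the problem, here’s a standalone version of just the splitting step, with a fake 10 MB string standing in for the escaped upload (the str_repeat size and $oneMeg value are just my guesses at the boundary):

```php
<?php
// Standalone sketch of the chunking above -- no database or upload needed.
$data   = str_repeat('x', 10 * 1048576); // stand-in for ~10 MB of escaped data
$oneMeg = 1040000;                       // same rough "one meg" as above

$numMegs = ceil(strlen($data) / $oneMeg);   // intended number of sections
$len     = floor(strlen($data) / $numMegs); // characters per section

$section = array();
for ($off = 0; $off < strlen($data); $off += $len) {
    $section[] = substr($data, $off, $len);
}

echo "sections: ", count($section), "\n";
echo "largest:  ", max(array_map('strlen', $section)), " bytes\n";
```

One thing I noticed running this: because floor() rounds down, the loop can produce one extra tiny section beyond $numMegs (it does with these numbers), though every piece still stays well under a megabyte.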
Thanks so much in advance… Even if you don’t know what’s wrong but made it this far, I appreciate your effort.
-Karl