Posts posted by sirdavidoff
-
Hi frost110 - you're right, it does sound like a PHP memory limit problem, and indeed the default memory_limit for my version of PHP (4.4.4) is 16MB.
However, I have tried increasing this and it doesn't seem to make any difference. Interestingly, the memory_limit variable doesn't appear in phpinfo() at all - does that mean there is no limit set by PHP?
Thanks for your help
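(For anyone else hitting this: one way to probe the limit at runtime is sketched below. It assumes the PHP 4 detail that memory-limit support was a compile-time option; the '64M' value is just an arbitrary example.)

```php
<?php
// If PHP 4 was built without --enable-memory-limit, the memory_limit
// directive is absent from phpinfo() and ini_get() returns false or ''
// - i.e. PHP itself enforces no limit.
var_dump(ini_get('memory_limit'));

// Try raising it at runtime; this is silently ineffective on builds
// without memory-limit support. '64M' is an arbitrary example value.
ini_set('memory_limit', '64M');
var_dump(ini_get('memory_limit'));
```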
-
Unfortunately this is part of a complex application, and it's not really feasible to change the way the files are stored now. I have found other posts on the internet from people who ran into similar problems and ended up giving up... someone must know the answer!
-
Hmm... I don't think it's a timeout issue; it seems more like the PHP-MySQL interface can't handle the data. I'm testing this on my local machine, and a 15MB file takes less than a second, whereas a 16.5MB file hangs the server for a minute or so.
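(Worth noting: a sharp cutoff right around 16MB is exactly the symptom a server-side packet ceiling would produce, so it may be worth double-checking max_allowed_packet directly before ruling it out. A sketch, run in the mysql client; the 64M figure is just an example:)

```sql
-- See what the server currently allows per packet/query:
SHOW VARIABLES LIKE 'max_allowed_packet';

-- If it reports ~16777216 (16MB), raise it in my.cnf and restart, e.g.:
-- [mysqld]
-- max_allowed_packet = 64M
```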
-
Hi guys,
I have a MySQL database in which I want to store some files as longblobs, and I'm using PEAR to connect to the DB. So far so good... until the files get over 16MB: the server takes ages to do the query and finally breaks without actually performing it.
I'm reading the file into a variable in memory and interpolating that variable into the SQL statement, e.g.
"INSERT INTO table VALUES('$fileContents');"
What's weird (and what makes me think it's not a max_allowed_packet problem) is that if I use MySQL's LOAD_FILE() function it works fine:
"INSERT INTO table VALUES(LOAD_FILE('$pathToFile'));"
Any ideas?
David
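(A workaround people commonly use when a single oversized INSERT won't go through is to insert an empty row and append the blob in chunks that stay under the packet limit, using CONCAT. A sketch only - the table name "files", its columns, and $dbConnection/$pathToFile are made up for illustration:)

```php
<?php
// Chunked insert sketch: start with an empty blob row, then append
// pieces small enough to stay safely under max_allowed_packet.
$chunkSize = 1024 * 1024; // 1MB per statement - arbitrary choice
$dbConnection->query("INSERT INTO files (id, data) VALUES (1, '')");

$fp = fopen($pathToFile, 'rb');
while (!feof($fp)) {
    // addslashes escapes quotes, backslashes and NUL bytes in the chunk
    $chunk = addslashes(fread($fp, $chunkSize));
    $dbConnection->query(
        "UPDATE files SET data = CONCAT(data, '$chunk') WHERE id = 1"
    );
}
fclose($fp);
```

Each UPDATE is then a small query, so neither PHP's string handling nor the server's packet limit has to cope with the whole file at once.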
Inserting large strings into a MySQL DB
in PHP Coding Help
Come to think of it, the problem doesn't occur when I read the file into memory; it happens later, when I execute the INSERT statement using $dbConnection->query($sql).
Doesn't that suggest that it's not a PHP memory problem?
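(One way to confirm where the cost lands is to bracket the two phases with memory_get_usage(). A sketch - memory_get_usage() is only available when PHP 4 was compiled with --enable-memory-limit, and $pathToFile/$dbConnection are the variables from the posts above:)

```php
<?php
// Measure memory after the read and after the query separately.
$before = memory_get_usage();
$fileContents = addslashes(file_get_contents($pathToFile));
echo 'after read: ', memory_get_usage() - $before, " bytes\n";

$dbConnection->query("INSERT INTO table VALUES('$fileContents')");
echo 'after query: ', memory_get_usage() - $before, " bytes\n";
```

If the second figure barely moves while the query still hangs, the bottleneck is in the client-server transfer rather than in PHP's memory handling.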