michaelham Posted August 4, 2009

I have built a script that parses a tab-delimited file and inserts records into a SQL Server database one at a time. For each record it calls sqlsrv_query to run an INSERT statement of about 200 columns. I am having a problem with memory: the memory used on each call of the query function does not appear to be freed, so I run out of memory around record 15,000. I am currently on the default 128M limit. I know it is this function, because when I comment it out and just let the script build the insert statements, the memory usage just fluctuates between 0.1 and 0.3 MB.

Currently my sqlsrv_query call does not return into a variable. I have tried assigning its return value to a variable and unsetting the variable on each iteration, but that doesn't work either. If I am not approaching this the best way, please let me know.

I have also tried a bulk insert but cannot seem to get it to work. I think this is because my database columns are ordered 1, 2, 3, 4, 5 while the tab-delimited file is ordered 3, 5, 2, 4, 1. Reordering the tab-delimited file isn't an option, as it is a return pulled from querying a RETS server.
Bjom Posted August 4, 2009

It would help to see the code... The order of the columns (i.e. fields) can be changed for the insert like so:

    INSERT INTO myTbl (field3, field5, field2, field4, field1) VALUES (...)
michaelham Posted August 4, 2009 (Author)

    if (file_exists($file)) {
        echo "File Exists...Opening $file\n";
        $f = fopen($file, "r");
    } else {
        die("There was an error opening file $file");
    }

    $insert = "INSERT INTO $tableName (";
    $columns = "";
    $values = "";
    $count = 1;
    $params = array(2, 709);
    $rows_affected = 0;

    while ($array = fgetcsv($f, 0, "\t")) {
        $values = "";
        foreach ($array as $value) {
            if ($count == 1) {
                // first line of the file: collect the column names
                $columns .= "$value, ";
                /*if ($value == "ListingNumber") {
                    $listingNumberIndex = key($array) - 1;
                    echo $listingNumberIndex;
                }*/
            } else {
                $value = str_replace("'", "''", $value);    // escape single quotes for SQL
                $value = str_replace('&amp;', '&', $value); // decode ampersands from the feed
                $values .= "'$value', ";
            }
        }
        if ($count == 1) {
            $columns = substr_replace($columns, "", -2);    // trim trailing ", "
            $count = 2;
        } else {
            $sql = $insert . $columns . ") VALUES (" . substr_replace($values, "", -2) . ")";
            echo "CURRENT MEMORY USAGE = " . memory_get_usage() / 1048576 . "\n";
            sqlsrv_query($conn, $sql, $params);
        } // end else (data row)
    } // end while (next line)

I know I can change the order of the fields in the insert, but I would then have to hardcode a particular order, which I don't necessarily want to do, because columns can be added and the order can change in the future. The only thing I know about the file's contents is that it will always have all of the columns fed from the RETS server, which should match all of the columns in my database, but not necessarily in the same order. I also don't want to hardcode them because there are about 10 tables that all have 200+ columns.

MH
Bjom Posted August 4, 2009

That has nothing to do with "hardcoding" the query... You retrieve the field names from the first line of the CSV file? Well, then you should end up with a query that has exactly the right order in both the field list and the value list. Did you echo the query to inspect it? It should have the field list in the order of the CSV.

What you actually do with sqlsrv_query is prepare a query and then execute it... over and over again. For this kind of work there is the prepare/execute way of running queries, where you separate the two steps and prepare the query only once (with parameters for the values, which you then fill in). You use sqlsrv_prepare and sqlsrv_execute for this. It saves a lot of traffic and may well alleviate your memory problems; see the sketch at the end of this post. If that alone won't do the trick, you can try sqlsrv_free_stmt to clean things up.

And another thing (just a suggestion, but a good one): never use "die". It can be replaced with "trigger_error", which has, among other advantages, the big one of respecting the error reporting level that you set...
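Here is that prepare/execute idea as an untested sketch only. It assumes $conn, $f and $tableName from your script, and takes the column names from the file's first line:

    // Build the INSERT once, with one ? placeholder per column from the header row.
    $columns      = fgetcsv($f, 0, "\t");
    $fieldList    = implode(', ', $columns);
    $placeholders = implode(', ', array_fill(0, count($columns), '?'));
    $sql = "INSERT INTO $tableName ($fieldList) VALUES ($placeholders)";

    // Bind one reusable row buffer by reference, so each execute picks up fresh values.
    $row    = array_fill(0, count($columns), null);
    $params = array();
    foreach (array_keys($row) as $i) {
        $params[] = &$row[$i];
    }

    $stmt = sqlsrv_prepare($conn, $sql, $params);
    if ($stmt === false) {
        trigger_error(print_r(sqlsrv_errors(), true), E_USER_ERROR);
    }

    while (($line = fgetcsv($f, 0, "\t")) !== false) {
        foreach ($line as $i => $value) {
            $row[$i] = $value;           // update the bound values in place
        }
        if (sqlsrv_execute($stmt) === false) {
            trigger_error(print_r(sqlsrv_errors(), true), E_USER_WARNING);
        }
    }

    sqlsrv_free_stmt($stmt);             // release the statement when done

A nice side effect: with parameterized values you no longer need the manual quote escaping.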
michaelham Posted August 4, 2009 (Author)

Thank you. The queries I'm putting together are in the order the fields come in the tab file, just not in the same order as the database, which is why I think I'm having problems with bulk insert. I will give prepare and execute a try. Thanks for the tip on die() and trigger_error(). I will let you know how it works out.

Thanks again,
MH
Bjom Posted August 4, 2009

There seems to be some way with bulk insert to pick up the header info, but... maybe it would be simpler to import into a temporary or staging table and transfer the data from there? A rough sketch is below. If you do find out how to implement bulk insert directly... let me know.
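Something along these lines, as an untested sketch. StagingTbl would be a table whose columns are declared in the same order as the tab file, the file path is just a placeholder, and it must be a path the SQL Server instance itself can read:

    // Load the whole file server-side into the staging table, skipping the header row.
    $bulk = "BULK INSERT StagingTbl
             FROM 'C:\\data\\listings.txt'
             WITH (FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n', FIRSTROW = 2)";
    sqlsrv_query($conn, $bulk);

    // Then copy across with an explicit column list, so the staging order
    // doesn't have to match the real table's column order.
    $copy = "INSERT INTO RealTbl (field1, field2, field3, field4, field5)
             SELECT field1, field2, field3, field4, field5 FROM StagingTbl";
    sqlsrv_query($conn, $copy);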