
sammermpc

Members · 16 posts · Last visited: Never

Everything posted by sammermpc

  1. Apologies, you are absolutely right, PHP does support multidimensional arrays (I use them too).  I think they use somewhat more memory than regular arrays, though I am not sure, so they could possibly make the memory issue worse.  I don't know what it means or whether it is significant; I just read it off this bug report (the only reference I could find) [url=http://bugs.php.net/bug.php?id=13598]http://bugs.php.net/bug.php?id=13598[/url], though it may be quite outdated.
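The overhead is easy to measure directly with memory_get_usage(). A rough sketch (the sample line and row count are made up), comparing a flat array of line strings with the same data exploded into sub-arrays:

```php
<?php
// Compare memory used by a flat array of lines versus the same data
// exploded into a nested (multidimensional) array. Exact numbers vary
// by PHP version; the nested form is consistently larger.

$rows = 1000;
$line = "1153231200 36.223 118.123 0 0 0 0";  // hypothetical 7-field row

$before = memory_get_usage();
$flat = array();
for ($i = 0; $i < $rows; $i++) {
    $flat[] = $line . " " . $i;               // force distinct strings
}
$flatMem = memory_get_usage() - $before;

$before = memory_get_usage();
$nested = array();
for ($i = 0; $i < $rows; $i++) {
    $nested[] = explode(" ", $line . " " . $i); // one sub-array per row
}
$nestedMem = memory_get_usage() - $before;

echo "flat: $flatMem bytes, nested: $nestedMem bytes\n";
```

Each sub-array carries its own hashtable plus one variable per field, which is why the nested version costs several times the raw file size.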
  2. Thanks for the responses, gentlemen. I think I've hit on the problem -- simply put, I spoke too soon.  One or two of the files are as large as 4.7mb, just a couple out of hundreds.  I read their contents into a multi-dimensional array (which php does not have native support for), which balloons (I'm conjecturing here) the array's size to many times the filesize.  That is, a multi-dimensional array of a file's contents takes up many times more space than the file itself (which should be obvious, I suppose).  Of course, PHP memory management is smart enough not to keep arrays around that have already been used.  It was one massive array that bombed the whole thing.

Of course, the best way to do it would be to not read in the entire file at once.  This is possible, and I in fact wrote a different script that did just that: since I am searching for certain minimum and maximum values within the files, I simply did so on a line-by-line basis.  But I ran into some problems.  First, the files are delimited by carriage returns, so I was forced to use stream_get_line(), which, as of PHP 5.0.4 at least, does not work as documented.  So then I was forced to use fgetc().  If you can imagine, that took so long that simply reading one file took over 30 seconds.

I'll post the entirety of the code below (both options).  If anyone has suggestions, I'd be glad to hear them, though I have temporarily got over the hump by bumping the memory_limit up to a whopping 64mb.

Here is what I am currently using:

[code]
function getfile_whole() {
    // Grab the entire file in a single string
    $file_str = stream_get_contents($this->fs);
    if (strlen($file_str) == 0)
        throw new Exception("$this->fname: filesize of zero");
    // Strip the trailing carriage return (to avoid a trailing empty element)
    $file_str = rtrim($file_str);
    // Replace all carriage returns with delimiting commas
    $file_str = strtr($file_str, "\r", ",");
    // Split on the delimiting commas into an array
    $full_file_arr = explode(",", $file_str);
    unset($file_str);  // free memory
    if (sizeof($full_file_arr) == 0)
        throw new Exception("$this->fname: data parse failed");
    // Re-explode into a nested array
    $i = 0;
    $success = false;
    foreach ($full_file_arr as $row) {
        // Omit rows with undecipherable data
        if (eregi('[[:graph:]]', $row)) {
            $temp = explode(" ", $row);
            // Omit rows where the lat or long is zero
            if ($temp[1] != 0 && $temp[2] != 0) {
                $file_arr[$i] = $temp;
                $i++;
                $success = true;  // ensure at least one element has been added
            }
        }
    }
}
[/code]

Here is a variant of what I probably SHOULD be using (with some replacement to halt on carriage returns):

[code]
function getfile_line() {
    $first = true;   // allow for first-loop-only events
    while (!feof($this->fs)) {
        $line = "";  // define $line as empty
        // Retrieve a single line, character by character
        while (($c = fgetc($this->fs)) !== "\r" && $c !== false) {
            $line .= $c;
        }
        // Parse into an array
        $line = explode(" ", $line);
        // Check that the data line matches the accepted format
        if (sizeof($line) !== 7) {
            throw new Exception("File $this->fname contains unknown data.");
        }
        // Assign starting values to the variables
        if ($first) {
            // Match exploded array values to their appropriate variables
            $start_time  = $line[0];
            $finish_time = $line[0];
            $start_lat   = $line[1];
            $finish_lat  = $line[1];
            $start_long  = $line[2];
            $finish_long = $line[2];
            $first = false;
        }
        // Find the minimum and maximum variable values
        if ($line[0] < $start_time)  { $start_time  = $line[0]; }
        if ($line[0] > $finish_time) { $finish_time = $line[0]; }
        if ($line[1] < $start_lat)   { $start_lat   = $line[1]; }
        if ($line[1] > $finish_lat)  { $finish_lat  = $line[1]; }
        if ($line[2] < $start_long)  { $start_long  = $line[2]; }
        if ($line[2] > $finish_long) { $finish_long = $line[2]; }
    }
}
[/code]

Glad to satisfy anyone's curiosity. Whoops, I didn't even answer the questions: Apache server on Unix.
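A middle ground between slurping the whole file and fgetc()'s one-character-at-a-time loop is to read fixed-size chunks with fread() and split on "\r" manually, carrying any partial line over to the next chunk. A sketch (the function name, chunk size, and callback style are arbitrary choices):

```php
<?php
// Iterate over a carriage-return-delimited file without loading it all
// at once and without per-character fgetc() calls. Reads 8 KB chunks;
// the last piece of each chunk may be an incomplete line, so it is
// carried into the next iteration.

function process_cr_file($path, $callback) {
    $fs = fopen($path, "rb");
    if (!$fs) {
        throw new Exception("$path: could not open");
    }
    $carry = "";
    while (!feof($fs)) {
        $chunk = fread($fs, 8192);
        $parts = explode("\r", $carry . $chunk);
        $carry = array_pop($parts);        // possibly incomplete last line
        foreach ($parts as $line) {
            if ($line !== "") {
                call_user_func($callback, $line);
            }
        }
    }
    if ($carry !== "") {
        call_user_func($callback, $carry); // trailing line with no "\r"
    }
    fclose($fs);
}
```

The min/max bookkeeping from getfile_line() would go inside the callback; the per-line cost drops to one function call instead of dozens of fgetc() calls.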
  3. I've written a script that processes multiple large files one by one for insertion into a MySQL database.  After processing a number of files, I get this error: [code]Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 35 bytes)[/code] Now, I know the typical solution to this is to change the php memory limit, but as you can see, the limit is already at 32mb, and potentially this script could deal with directories whose contents reach into the gigabytes.  The key here, though, is that each individual file is quite small (under a megabyte).  I think it is simply the accumulation across all of them that pushes it over the limit, as no single file approaches 32mb. I've been looking into unset(), but it doesn't seem to be having much of an effect, though perhaps I am just using it in the wrong places. Thanks for any suggestions!
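One pattern that keeps the footprint near a single file's size: process each file completely inside the loop, keep only the scalar results, and unset() the big array before the next iteration. A sketch, assuming whitespace-delimited data files (the .dat extension and timestamp-in-first-column layout are assumptions):

```php
<?php
// Batch-process many files while holding only one file's contents in
// memory at a time. Only scalar min/max results survive each iteration,
// so total memory stays near the size of the largest single file.

function summarize_dir($dir) {
    $results = array();
    foreach (glob($dir . "/*.dat") as $path) {  // extension is assumed
        $rows = file($path);                    // this file's lines only
        $min = null;
        $max = null;
        foreach ($rows as $row) {
            $fields = explode(" ", trim($row));
            $t = (float)$fields[0];             // timestamp column assumed
            if ($min === null || $t < $min) $min = $t;
            if ($max === null || $t > $max) $max = $t;
        }
        unset($rows);           // release this file before loading the next
        $results[basename($path)] = array($min, $max);
    }
    return $results;
}
```

unset() only helps if no other variable still references the array, which is why accumulating rows into one ever-growing structure defeats it.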
  4. Wow shoz.  Wow.  Thank you so much.  That works perfectly -- ridiculously perfectly.  Essentially the only difference from what I was doing is the CONCAT statement -- how does that somehow make it all possible?  Or is that not the essential part? Thanks again.
  5. Even if you are not familiar with the spatial elements of MySQL, I think you will likely be able to help, as this problem is largely about executing functions across tables, not the spatial elements of MySQL per se.  In any case, here is the situation: I'm trying to put together a searchable spatial database.  As of now, I have a large deployments table which stores data from various marine instruments; each row includes a start coordinate and an end coordinate (GPS).  These are stored as integers, but for the purposes of a spatial search, I transform the two coordinates into a LINESTRING.  I want to check whether this LINESTRING intersects with the search query input by the user (another LINESTRING; call the first LS1 and the second LS2).  This can be done quite well with the function MBRIntersects(LS1, LS2), and indeed, this works perfectly on a one-to-one basis (if I enter the numbers for LS1 by hand, then LS2, and then run the function).  For example:

[code]
SET @LS1 = GeomFromText('LineString(118.123 36.223, 123.132 33.232)');
SET @LS2 = GeomFromText('LineString(117.233 36.329, 122.423 33.348)');
SELECT MBRIntersects(@LS1, @LS2);
[/code]

This works fine.  Now, considering I have some 2500 entries in the deployments table, how do I automate this for a search function?  I've been trying various combinations of [code]SELECT start_lat, start_long, finish_lat, finish_long FROM deployments...[/code] but I haven't been having any luck. Once a user enters starting coordinates, I want to repeat the operations above across the entire deployments table, and return every row where MBRIntersects returns a positive result.  I think this is trickier than it sounds. Thanks for any help -- I'm on deadline! :-\
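One way to automate this is to build each row's LINESTRING inside the query itself with CONCAT, so MySQL evaluates MBRIntersects() once per row. A sketch of how that query might be assembled from PHP, using the table and column names from the post (the longitude-then-latitude coordinate order is an assumption; adjust to the actual schema):

```php
<?php
// Build a single SELECT that constructs each row's LINESTRING on the
// fly with CONCAT and keeps only rows whose bounding box intersects
// the user's search LINESTRING.

function intersect_query($lat1, $long1, $lat2, $long2) {
    // %F is the locale-independent float conversion
    $search = sprintf("LineString(%F %F, %F %F)",
                      $long1, $lat1, $long2, $lat2);
    return "SELECT start_lat, start_long, finish_lat, finish_long " .
           "FROM deployments " .
           "WHERE MBRIntersects(" .
           "GeomFromText(CONCAT('LineString(', start_long, ' ', start_lat, " .
           "',', finish_long, ' ', finish_lat, ')')), " .
           "GeomFromText('" . $search . "'))";
}
```

The result string would be passed to mysql_query(); every row where the per-row box overlaps the search box comes back, with no client-side looping.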
  6. No, I am entering information into a database.  This is the problem at hand: a lab group goes out in the field with various instruments; they return a week later with literally hundreds of files from said instruments.  They then upload these from their laptops to a central server, which processes them and stores the data (time, date, location, etc.) in a database. The question is, how can I let them simply select their data folder (i.e. /marine/data/07122006/), and then let the scripts do their work?  I have already written the batch processing, and it works if they type in the path by hand.  I want them to be able to simply use a browse dialogue box, similar to the one <input type="file" .../> uses.
  7. Any other conventional solutions for this?  Or anyone who can point me in the right direction for doing this with AJAX?
  8. I have to go through all that just to get such simple functionality?  I mean, I could practically just have them select a file within the folder and then chop off the file name.  Incredibly clumsy (and unacceptable for this application), but it would work.  In any case, the AJAX implementation would depend on sending the right HTTP headers, correct?  In that case, what are they?
  9. I feel like this should be simple, but I don't know how to do it.  I am creating a web database that will have many files contained within large directories that need to be processed.  I wrote a PHP parser, processor, etc. that handles this no problem.  Now, all I want is for the user to be allowed to SELECT a directory -- like the html [browse...] button, but for selecting a directory rather than a file.  Any ideas?
  10. Thanks toplay, the ob_end_clean() solution is working for now. I originally did have it set up in a kind of link.php/downloads.php arrangement, though perhaps because I included the downloads.php file (to use a function written there), it didn't take care of my original issues. I can see that if I linked to it instead, it would. Anyhow, it's working, and working well -- so thanks!
  11. [quote=toplay @ Jun 28 2006, 10:00 PM]You have to know the HTTP protocol (and the headers you're sending) in order to understand why it's doing that.[/quote] So do you think that is the root of the problem? I'll admit I know virtually nothing of header protocols. I noticed that in many examples people use the header "Content-Length: $size", where $size is returned from filesize(). Unfortunately, that function does not seem to work for me (with URLs?). [quote=toplay @ Jun 28 2006, 10:00 PM]Just don't output anything other than the file after sending those headers.[/quote] It looks trivial to omit other output in the example I posted, but this code is going to be embedded in an HTML representation of a large MySQL database that provides links to many remote files. I tried to plug it in, and though it happily downloads the files in question, it then proceeds to write the rest of the table to the file! This bloats little 10k calibration files into 300k behemoths. I switched the implementation to readfile(), convenient because you don't have to use fopen() and fclose(). Thanks! Anyhow, could the problem simply be in the headers? Anyone know the correct ones, or how I should approach this problem?
  12. Delete DB entries the same way you add them, just with a different query: instead of querying the database with an INSERT command, query it with DELETE.  Check out the MySQL manual on queries; the documentation is very complete.  Also check out the PHP reference on mysql_query, as well as the many other built-in mysql_* functions.
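A minimal sketch of that symmetry (the table and column names here are hypothetical, and the query is split into a testable string-building step):

```php
<?php
// Deleting mirrors inserting: only the SQL verb and clause change.
// "entries" and "id" are placeholder names for illustration.

function insert_entry_sql($name) {
    $name = addslashes($name);        // crude escaping for the sketch
    return "INSERT INTO entries (name) VALUES ('$name')";
}

function delete_entry_sql($id) {
    $id = (int)$id;                   // crude sanitization for the sketch
    return "DELETE FROM entries WHERE id = $id";
}

// Usage with the old mysql extension discussed in this thread:
// mysql_query(insert_entry_sql('probe-7')) or die(mysql_error());
// mysql_query(delete_entry_sql(42)) or die(mysql_error());
```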
  13. I'm just trying to put together a simple download script to allow users to click on a hyperlink and download the corresponding file. Here's the code I have:

[code]
<?php
include('downloads.lib.php');
$fname = 'sent-mail';
$fpath = 'http://xxxxxxxx-blocked for privacy-xxxxxxxxxx/data/';
$bufsize = 4096;
if ($fs = fopen($fpath . $fname, "r")) {
    header("HTTP/1.1 200 OK");
    header("Content-Type: application/unknown");
    header("Content-Disposition: attachment; filename=$fname");
    header("Content-Transfer-Encoding: binary");
    while (!feof($fs)) {
        $buf = fread($fs, $bufsize);
        print($buf);
        flush();
    }
    fclose($fs);
}
echo "TEST";
?>
[/code]

This seems to work fine -- the strange thing is that the last echo "TEST" is written to the end of the file. In fact, in other tests, everything within the php tags seems to be written straight into the downloaded file; in this case, every file that is downloaded has a TEST appended to the end. Needless to say, this is not good. Running PHP 5.0.4 on Apache 2.0. I'm not using file_exists() or filesize(), because they don't seem to work with URLs -- or at least, they either fail or return false even if the file assuredly exists. Could there be a problem with the header() calls? I admittedly do not know a great deal about them. ARGH! Thanks, Sam
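The stray "TEST" appears because everything the script prints after the file body is still part of the same HTTP response. Ending the script right after the file, and sending a Content-Length so the client knows where the file stops, keeps the payload clean. A sketch (the header values are one reasonable choice, not the only correct set; the helper split is just to make the headers checkable):

```php
<?php
// Serve one local file as a download and stop, so no trailing page
// output leaks into the payload.

function download_headers($fname, $size) {
    return array(
        "Content-Type: application/octet-stream",
        "Content-Disposition: attachment; filename=\"$fname\"",
        "Content-Length: $size",
    );
}

function send_download($path, $fname) {
    $size = filesize($path);          // works for local paths
    foreach (download_headers($fname, $size) as $h) {
        header($h);
    }
    readfile($path);
    exit;                             // nothing after this reaches the client
}
```

For a remote source where filesize() fails, the Content-Length line can simply be omitted; the exit after readfile() is what stops the rest of the page from being appended.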
  14. Running apache 2.0.54, php 5.0.4, and mysql 5.1.4. Anyone know what's up with these errors?

[code]
PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php/extensions/mbstring.so' - /usr/lib/php/extensions/mbstring.so: undefined symbol: _zval_dtor_func in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php/extensions/mcrypt.so' - /usr/lib/php/extensions/mcrypt.so: undefined symbol: _zval_copy_ctor_func in Unknown on line 0
PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php/extensions/mysql.so' - /usr/lib/php/extensions/mysql.so: undefined symbol: _zval_copy_ctor_func in Unknown on line 0
PHP 5.0.4 (cli) (built: Sep 6 2005 15:21:04)
Copyright © 1997-2004 The PHP Group
[/code]

Someone able to point me in the right direction? Google doesn't come up with much, but my fingers are crossed.
  15. [quote=sammermpc @ Jun 12 2006, 05:50 PM]I'm working on a php parser that has to batch process many files in different directories, some mounted on different file servers. Everything goes well, and during testing, everything works perfectly for LOCAL directory access. Unfortunately, when I use opendir with a URL (opendir('http://...')), I get the error quoted in my original post. I've been browsing around the web, and most people's problems with this error seem to be that they are using the web-address syntax when they should be using the local-folder one. I am pretty sure this is not my problem, as I need to log on to a remote server to obtain the necessary files (it would be impractical to move them). So what gives? I've seen some vague talk about changes within config.php, though nothing conclusive.[/quote] UPDATE: I think the central issue is that though fopen() can work with both paths and URLs, it appears that opendir() cannot, so even for the purpose of opening a remote directory you must use fopen(). Does that sound likely to anyone?
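That matches how PHP's stream wrappers divide the work: the http:// wrapper implements fopen() but has no directory-listing support (HTTP itself has no list operation, hence the "not implemented" warning), while the ftp:// wrapper supports both reading files and listing directories. A quick capability check (the ftp URL in the comment is hypothetical):

```php
<?php
// Check which stream wrappers are registered before attempting remote
// access. stream_get_wrappers() returns the scheme names PHP can use
// in fopen()/opendir() URLs.

function wrapper_available($scheme) {
    return in_array($scheme, stream_get_wrappers());
}

// opendir() over http fails with "not implemented" even when the http
// wrapper is registered, because the wrapper only implements file
// reads. The ftp:// wrapper, by contrast, can list directories:
// $dh = @opendir('ftp://user:pass@host/path/');  // hypothetical URL
```

So for remote directory listings the practical options are an ftp:// URL, mounting the remote share locally, or a server-side script on the remote host that emits the file list for fopen() to read.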
  16. I'm working on a php parser that has to batch process many files in different directories, some mounted on different file servers. Everything goes well, and during testing, everything works perfectly for LOCAL directory access. Unfortunately, when I use opendir with a URL (opendir('http://...')), I get the error: [quote]Warning: opendir(http://somewebsite.someplace.edu/folder/folder/) [function.opendir]: failed to open dir: not implemented in /home/myname/public_html/folder/file.php on line 229[/quote] I've been browsing around the web, and most people's problems with this error seem to be that they are using the web-address syntax when they should be using the local-folder one. I am pretty sure this is not my problem, as I need to log on to a remote server to obtain the necessary files (it would be impractical to move them). So what gives? I've seen some vague talk about changes within config.php, though nothing conclusive.