
pthurmond

Members
  • Posts

    81
  • Joined

  • Last visited

    Never

About pthurmond

  • Birthday 03/23/1983

Contact Methods

  • AIM
    PEThurmond
  • Website URL
    http://www.pjerky.com
  • Yahoo
    p_thurmond

Profile Information

  • Gender
    Male
  • Location
    Peculiar, MO

pthurmond's Achievements

  • Rank
    Member (2/5)
  • Reputation
    0

  1. That is what we have built up in cache over the course of about 2 weeks. When a file hits its expiration age, it is simply overwritten on the next page visit, though it's not a bad idea to just set up a cleanup cron job for it. The reason I want to compress the cache files is that, unfortunately, this site is sitting on a clustered server system with a ton of other sites (all owned by the same company). So they are sharing resources on an internal shared server. Because of that we are more limited on space, and since the compression/decompression time is negligible in the overall load time, I think picking the middle ground is the right balance here. If I had my choice the site would be on its own dedicated server. But alas, we can't always get what we want. -Patrick
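A rough sketch of the read/write path this post describes, assuming a file-per-entry cache with a fixed maximum age; the function names, TTL constant, and compression level are illustrative, not the author's actual class:

```php
<?php
// Hypothetical compressed file cache: expired entries are regenerated and
// overwritten on the next visit, as described in the post above.
define('CACHE_TTL', 14 * 24 * 3600); // ~2 weeks

function cache_get($path) {
    if (is_file($path) && (time() - filemtime($path)) < CACHE_TTL) {
        return gzuncompress(file_get_contents($path));
    }
    return false; // missing or expired: caller rebuilds and calls cache_put()
}

function cache_put($path, $data) {
    // Level 6 is zlib's default: a middle ground between speed and size.
    file_put_contents($path, gzcompress($data, 6), LOCK_EX);
}
```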
  2. Ok, so I finally had the chance to do some benchmarking for this. I compared the Zlib functions to the Bzip2 functions:

     Name                 Run Time (seconds)
     Load File            0.000370979309082
     BZip2 Compression    0.0489809513092
     BZip2 Decompression  0.00877499580383
     ZLib Compression     0.0100789070129
     ZLib Decompression   0.000354051589966

     File size: Original 258,699 bytes; BZip2 21,503 bytes; ZLib 27,123 bytes.

     So it depends on what you want to trade off here. I would say that for my purposes I would be willing to sacrifice a little extra disk space for the speed gain. Though the difference is almost negligible right now, that could change as server load increases. Thanks, Patrick
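For reference, numbers like the ones above can be produced with a micro-benchmark along these lines; the sample file path is illustrative, and the zlib and bz2 extensions are assumed to be loaded:

```php
<?php
// Time each codec on the same payload with microtime(true).
$data = file_get_contents('cache_sample.xml'); // illustrative path

$t = microtime(true);
$bz = bzcompress($data, 9);
printf("bzip2 compress:   %.6f s, %d bytes\n", microtime(true) - $t, strlen($bz));

$t = microtime(true);
bzdecompress($bz);
printf("bzip2 decompress: %.6f s\n", microtime(true) - $t);

$t = microtime(true);
$gz = gzcompress($data, 9);
printf("zlib compress:    %.6f s, %d bytes\n", microtime(true) - $t, strlen($gz));

$t = microtime(true);
gzuncompress($gz);
printf("zlib decompress:  %.6f s\n", microtime(true) - $t);
```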
  3. The problem is that right now our cache is about 40,000 files and it is taking up about 815MB. Compressed, it would take up only 65.22MB, so this may be beneficial. On the flip side, I also realize that disk space is really cheap. That said, it would still be interesting to see some good comparisons. Thanks, Patrick
  4. Hello, I am working on a caching class, and all I have left to figure out is the string compression. I am storing files locally on the server and may eventually add the option of storing them in the database. But I want to reduce file size while sacrificing as little speed as possible. The first function I used was bzip2's bzcompress(). It did a great job with a 92% compression rate, but I noticed a half-second to full-second delay in the decompression. So what I would like to know is: does anyone know which compression functions offer the most speed, and what do their compression ratios look like? This is for a simple string (well, XML that I have reduced with some string replacements), so there is no pre-compression to it. The compression libraries available to me on my server are zlib, zip, and bz2. Thoughts? Thanks, Patrick
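Besides switching libraries, zlib's own level argument (1 = fastest, 9 = smallest) is a knob worth measuring for this trade-off. A quick illustrative sketch, with a made-up sample path:

```php
<?php
$data = file_get_contents('sample.xml'); // illustrative path

foreach (array(1, 6, 9) as $level) {
    $t   = microtime(true);
    $out = gzcompress($data, $level);
    printf("level %d: %d bytes in %.6f s\n", $level, strlen($out), microtime(true) - $t);
}
```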
  5. In this particular case I am searching for nearby cities in a zip code table. I am just taking old code that someone else wrote a long, long time ago and trying to improve its performance. I have since realized that a good portion of the speed issues were coming from a lack of proper indexing on some joined tables. Also, when I first got to this code, the origin var (that you will see in the code below) was actually full sub-queries that returned the same information each time, just varying either the longitude or latitude field. So I pulled the sub-queries out and made a single call to the DB before this call that gathered those two data points, then output them into this query string:

     (3959 * ATAN(
         SQRT(1 - SQRT(
             SIN(Latitude / 57.29577951) * SIN(({$Origin['Latitude']}) / 57.29577951)
             + COS(Latitude / 57.29577951) * COS(({$Origin['Latitude']}) / 57.29577951)
               * COS(ABS(({$Origin['Longitude']}) - Longitude) / 57.29577951)
         ))
         /
         (SIN(Latitude / 57.29577951) * SIN(({$Origin['Latitude']}) / 57.29577951)
          + COS(Latitude / 57.29577951) * COS(({$Origin['Latitude']}) / 57.29577951)
            * COS(ABS(({$Origin['Longitude']}) - Longitude) / 57.29577951))
     )) < 50

     The $Origin variable replaced sub-queries such as this:

     SELECT Longitude FROM zip_code_table WHERE City = 'LUBBOCK' AND State = 'TX' LIMIT 1

     Removing the sub-queries and replacing them with the actual values seems to have helped some. However, the biggest improvement seems to have come from better indexing. Nonetheless, my original question still stands, more out of curiosity at this point. Thank you Pikachu2000 for your comment. I am curious whether complex math such as what is used above would behave any differently (is PHP or SQL faster?) than your basic math equations. Also, if someone knows a more efficient formula for finding the distance between two points while searching a database, I would welcome the knowledge. Thanks, Patrick
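On the formula question: a common alternative is the haversine form, which is better conditioned for small distances, combined with a cheap bounding-box pre-filter so the trigonometry only runs on candidate rows the index can find. This is a hedged sketch, not the author's code: $pdo, the box half-widths, and the example origin are assumptions, and referencing the select alias in HAVING is MySQL-specific.

```php
<?php
$lat  = 33.5779; $lng = -101.8552;          // e.g. an origin near Lubbock, TX
$dLat = 50 / 69.0;                          // ~50 miles of latitude, in degrees
$dLng = 50 / (69.0 * cos(deg2rad($lat)));   // longitude degrees shrink with latitude

$sql = "SELECT City, State,
               3959 * 2 * ASIN(SQRT(
                   POW(SIN(RADIANS(Latitude - ?) / 2), 2) +
                   COS(RADIANS(?)) * COS(RADIANS(Latitude)) *
                   POW(SIN(RADIANS(Longitude - ?) / 2), 2)
               )) AS distance
        FROM zip_code_table
        WHERE Latitude  BETWEEN ? AND ?   -- cheap, indexable pre-filter
          AND Longitude BETWEEN ? AND ?
        HAVING distance < 50              -- MySQL allows the alias here
        ORDER BY distance";

$stmt = $pdo->prepare($sql);
$stmt->execute(array($lat, $lat, $lng,
                     $lat - $dLat, $lat + $dLat,
                     $lng - $dLng, $lng + $dLng));
```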
  6. I am trying to clean up a complex SQL query, and I am trying to see if doing the math in the query is faster or slower than doing the math in PHP and then putting the result in the query. Does anyone know the answer to this? Thanks for your time! -Patrick
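A general rule of thumb, sketched below with made-up names: math whose inputs don't change per row can be done once in PHP and passed in as a value, while math that depends on column values has to stay in SQL, where it runs once per row.

```php
<?php
// Illustrative only: $pdo, $originLat, and the schema are assumptions.
$radiusMiles = 50;
$degrees = $radiusMiles / 69.0; // constant for the whole query: compute once in PHP

// ABS(Latitude - ?) varies per row, so it has to stay in SQL.
$stmt = $pdo->prepare(
    "SELECT City FROM zip_code_table WHERE ABS(Latitude - ?) < ?"
);
$stmt->execute(array($originLat, $degrees));
```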
  7. I know I am a bit late to this topic, but here is my take on it. First of all, if you have a whole lot of data to pull from the database, it would be better to have C++ grab it from the database and do all the processing in a compiled application. I have never interfaced with a database from C++, but I know for a fact that it can be done. The next question is executing the script. This is very simple with PHP's exec() function (http://php.net/manual/en/function.exec.php). You can use it to execute code as if you were sitting at a command line. I am currently using it on several sites to simulate multi-threading in PHP to speed up the user experience when processing complex form submissions. The method I employ just has the server open another PHP instance from an invisible command prompt and execute more PHP code. When configured correctly you can just fire the code and let it run in the background. This is actually something that Facebook does with their systems. Their engineers have written blog posts about how they converted a lot of PHP code into C++ and stored procedures so that they could run faster compiled code and then just use PHP as the front-end interface that requests information from the compiled code (http://www.facebook.com/notes.php?id=9445547199). There are more articles than that; I actually first read about this more than 6 months ago in a completely different article. Anyway, the benefits of a system like this are definitely in the performance metrics. You can harness the multi-threading capabilities of a compiled language and improve the user experience drastically. Now, I am not sure how you get the data back to the user if you use the fire-and-run method that I use on my sites (probably an AJAX setup that has a timer checking the DB for changes every so often), but if the back-end code is fast enough you can just tell PHP to wait for the response. One last note: in executing the command-line program, you can either pass all the needed variables via the command line, or save all the variables to the database with PHP and then just pass a DB row ID for the back-end program to gather the data from. The latter is certainly a cleaner approach. Anywho, just my two cents. -Patrick
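A minimal sketch of the fire-and-run pattern described above; worker.php, its path, and the row ID are hypothetical:

```php
<?php
// The trailing "&" plus redirected output lets exec() return immediately
// instead of blocking until the worker finishes.
$rowId = '42'; // PHP has already written the job's data to this DB row
exec('php /path/to/worker.php ' . escapeshellarg($rowId) . ' > /dev/null 2>&1 &');
// worker.php loads the row, does the heavy processing, and writes its
// results back to the database for the front end (or an AJAX poller).
```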
  8. Sorting and filtering output results is really easy, and you can do it from two different areas. The best way is on the server side in PHP. You just use either a small form or links with $_GET vars in them, check those vars for certain conditions, and then use those conditions in your SQL query for filtering and sorting. If you are looking for the CMS they use, then good luck. I didn't see anything in the output that I recognized as coming from a particular CMS (not that it's hard to hide). Are there any other specific features you want to be able to do? I can give you some SQL examples if you want.
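Here's a small hedged sketch of that server-side approach; $pdo, the table, and the column names are made up. The whitelist is the important part, since raw request values should never go straight into SQL:

```php
<?php
// Whitelist the sortable columns, then build the ORDER BY clause from the
// validated values only.
$allowed = array('name', 'price', 'created_at');
$sort = (isset($_GET['sort']) && in_array($_GET['sort'], $allowed, true))
      ? $_GET['sort'] : 'name';
$dir  = (isset($_GET['dir']) && $_GET['dir'] === 'desc') ? 'DESC' : 'ASC';

$stmt = $pdo->prepare("SELECT * FROM products ORDER BY $sort $dir");
$stmt->execute();
$rows = $stmt->fetchAll(PDO::FETCH_OBJ);
```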
  9. The outer loop does not execute, so neither does the inner one. As for the output of the loop, each iteration should output something like this (sample taken from the working dev server):

     <div class="question_block">
         <div class="question">
             <input type="hidden" name="questions[]" value="50" />
             <div class="question_num"><label for="50">Question #2</label></div>
             <div class="question_text">I'd prefer a job that requires less than 2 years training.</div>
         </div>
         <div class="question_answers">
             <div class="answer">
                 <input type="radio" name="qag_50" value="28" class="required" />
                 <label>Strongly Agree</label>
             </div>
             <div class="answer">
                 <input type="radio" name="qag_50" value="27" />
                 <label>Agree</label>
             </div>
             <div class="answer">
                 <input type="radio" name="qag_50" value="26" />
                 <label>Neutral</label>
             </div>
             <div class="answer">
                 <input type="radio" name="qag_50" value="25" />
                 <label>Disagree</label>
             </div>
             <div class="answer">
                 <input type="radio" name="qag_50" value="24" />
                 <label>Strongly Disagree</label>
             </div>
         </div>
     </div>

     I have tried it with braces instead of the method you see in my first sample, but that didn't seem to change anything.
  10. Hey guys, I thought I would bounce something off you to see what you think. I have a project that I have worked on for months. I finally got it done and it works perfectly. To go live with it, the IT dept had to copy the site to a production server... and that is when I noticed a problem. I use a lot of objects in my code, including database objects with data returned as objects. Simple matter; it keeps the code a little cleaner than arrays. But I noticed that foreach loops iterating through objects (at least in the places I have looked so far) are not running. The server seems to just skip over the code. If I do a print_r() on the variable right before the loop, I see the object in all its glory, looking great. So then I moved into the loop itself and tried to echo something at the very beginning: nothing. Not good. Now I want to remind you that this is on the production server; if I check out the same code and output on the dev server, it goes into the loop and works fine. So my next thought was that maybe there is a corrupt file on the server, so I tried copying over just that file. That didn't work. Scratching my head, I decided to check out trusty phpinfo(). It told me that the dev server is running 5.2.6 and the production server is running 5.1.6. So my thought at this point is that maybe there is a PHP version problem. Unfortunately, getting IT to upgrade the production server to 5.2.6 will be difficult, since there are a lot of websites running from that server. Are there any known issues with foreach and objects in 5.1? Are there any changes you can suggest aside from switching all my code to use arrays instead of objects to hold data? The only other thing I can add is that I did all this in CodeIgniter v1.7.1. Here is a sample of the code...

      foreach ($questions_answers AS $qa): ?>
          <div class="question_block">
              <div class="question">
                  <?=form_hidden('questions[]', $qa->question_id)?>
                  <div class="question_num"><?=form_label('Question #'. $i, $qa->question_id)?></div>
                  <div class="question_text"><?=$qa->question?></div>
              </div>
              <div class="question_answers">
              <?php
              $validate_tag = FALSE;
              foreach ($qa->answers AS $a):
                  if ($a->answer_id == set_value('qag_'. $qa->question_id))
                      $checked = TRUE;
                  else
                      $checked = FALSE;

                  $input_params = array(
                      'name'    => 'qag_'. $qa->question_id,
                      'value'   => $a->answer_id,
                      'checked' => $checked
                  );

                  // Only the first radio in each group gets the 'required' class.
                  if ($validate_tag != TRUE) {
                      $input_params['class'] = 'required';
                      $validate_tag = TRUE;
                  }
              ?>
                  <div class="answer">
                      <?=form_radio($input_params)?>
                      <?=form_label($a->answer)?>
                  </div>
              <?php endforeach; ?>
              </div>
          </div>
      <?php
      $i++;
      endforeach;
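One way to narrow this down, offered as a hedged suggestion: run a trivial object-iteration script on both servers. foreach over an object's public properties has worked since PHP 5.0, so if the snippet below prints on dev but not on production, the culprit is more likely configuration (error reporting, or short_open_tag being off, which disables the <?= echoes before PHP 5.4) than foreach itself.

```php
<?php
// Minimal repro to run on both servers.
$obj = new stdClass();
$obj->a = 1;
$obj->b = 2;

foreach ($obj as $key => $value) { // iterates public properties in PHP 5.x
    echo "$key => $value\n";
}
```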
  11. Actually, I just figured out that it was how Firefox 3 was handling it. The output doesn't have that problem in IE. I wonder if this could be considered a browser bug, or if it just needs something else.
  12. I am working on creating an automatic CSV file output system for one of my admin pages, and I have it most of the way there. The problem I am running into is that when the download dialog comes up, the file extension is no longer on the name, even though it is in the filename section of the header. So I get a file name with no extension on it; I just want to add the ".csv" part to the end. As it currently is, I have to manually type the file extension in the dialog box. Below is a sample of my headers...

      //Outputs the CSV file as a page. Just call this and it will do the rest.
      function return_file($filename = 'Output')
      {
          header("Expires: 0");
          header("Cache-Control: private");
          header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
          header("Content-Description: File Transfer");
          //header("Content-Type: application/vnd.ms-excel");
          //header("Content-Type: text/plain");
          header("Content-Type: text/x-csv");
          header("Content-Disposition: attachment; filename=" . $filename . ".csv");
          echo $this->file;
          exit();
      }

      As you can see, I have tried several content types in the hope that the change would solve the problem, but it did not. I am new to creating dynamic file downloads, so I am probably missing something obvious. Thanks, Patrick
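One thing worth trying here, offered as a guess rather than a confirmed fix: quote the filename in the Content-Disposition header. Some browsers (Firefox in particular) truncate an unquoted filename at the first space, which can drop everything including the extension.

```php
// Quoted filename; everything else in return_file() stays the same.
header('Content-Disposition: attachment; filename="' . $filename . '.csv"');
```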
  13. Hi everyone, I have a rather interesting problem that I am currently at a loss for a solution to. Several months ago the company I work for put me on a project to move a site from Wordpress to Drupal (despite my objections). I was charged with the entire build-out and with copying the data from Wordpress to Drupal. I found a nice little module for Drupal that was supposed to take care of this for me, and I set out to do the import. At first I thought the import had worked: all of the categories were copied, a bunch of stories were populated, everything looked good. Then we started to notice that it had imported only a little over 200 posts, whereas the output XML file from Wordpress contained 1523 posts. So I set out to work on my own version of this tool, this time not making a module for Drupal, but simply using a few built-in functions from Drupal to make things easier for me. I wrote a script to take the parsed XML and put it into a specially organized array that attaches the comments to the posts and keeps all the attributes tracked and correct. That parsed out everything just fine, and I see 1523 posts in the resulting array. Then I took pieces of that old Drupal module and made them work with my script. I pretty much rewrote the entire thing; the only thing I kept was the array formatting. So I tried my data import and I wait... and I wait... and still waiting... finally the browser offers to let me download the script itself as if it were a downloadable file. I decided to save the file to my desktop. It has the same file name as my script and is completely empty, despite output that should have started prior to the loop that inserts the posts into the database. If I leave the parser going but disable the import, I don't have this problem; it just outputs to the browser for me. When I check the database, it does create the new category list and vocabulary, and it even imported 230 posts. But that is all it does, and I can't find any errors in the server log file. I have attached my script; let me know if you have any ideas as to what is causing any of these strange behaviors. I could really use some external insight. [attachment deleted by admin]
  14. Actually, I think I know the problem now. We are working in Expression Engine, and instead of creating an extension, the other developer decided to start the session in the layout head template, which leaves us with a session start location that is a bit unpredictable. So I think the code execution order is part of the problem. I will work on developing an extension for it.
  15. Hmm, well, when I try to start a session it tells me that it is already started, yet when I then check whether the session superglobal is available, it says no. To make things even more confusing, I can save a variable to the superglobal, then try to check it a few pages later, and the data is gone.