jnich05

New Members
  • Posts: 8
  • Joined
  • Last visited: Never

Posts posted by jnich05

  1. http://tools.pingdom.com/fpt/#!/uyNTkbXZR/http://www.phpfreaks.com

     

    When I run this same speed test on my own website, the actual loading happens quickly, as it should, but I'm noticing huge "wait" times from the server. In this example, I'm talking about the blue colored sections, described as "The web browser is connecting to the server".

     

    Any idea what could be causing this? The slow files all seem to be images between 1 and 5 KB, and the wait times range from 350 ms to 1.7 seconds.
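
    For reference, here's a rough sketch (untested, and the URL is only a placeholder) of how I can break one of these requests down from the command line with PHP's cURL timing info, to separate DNS, connect, and time-to-first-byte:

    <?php
    // Placeholder URL - substitute one of the slow 1-5 KB images.
    $url = 'http://www.example.com/images/slow-image.png';

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);

    printf("dns:        %.3fs\n", curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME));
    printf("connect:    %.3fs\n", curl_getinfo($ch, CURLINFO_CONNECT_TIME));
    printf("first byte: %.3fs\n", curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME));
    printf("total:      %.3fs\n", curl_getinfo($ch, CURLINFO_TOTAL_TIME));

    curl_close($ch);
    ?>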

  2. Googlebot is somehow crawling some extremely long urls on my server like:

     

    http://website.com/<br /><b>Notice</b>:  Undefined variable: content_type in <b>/home/webuser/helloworld/htdocs/include/page_tag.php</b> on line <b>620</b><br />http://website.com/rss/rss_tags?item=links&type=&tag=room

     

    I have no idea how these urls are being created, but each time one is viewed, it maxes out my memory and CPU for a good 10+ minutes. I'm thinking my various .htaccess redirects are getting stuck in a loop somewhere, and that's what causes it.

     

    I'd like to redirect all these bad urls to my homepage, and have tried to detect/redirect them with:

    RewriteCond %{REQUEST_URI}    ^/(,|;|:|<|>|">|"<|/|\\\.\.\\).{0,9999}.* [NC,OR]

     

    It seems to work with short urls, but when the urls get super long, it still crashes my site.

     

    Anyone have ideas of how I could redirect these without doing too much processing?
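
    For what it's worth, here's the kind of PHP-side guard I've been considering as an alternative to the rewrite rule (just a sketch, and the 512-character limit is an arbitrary assumption). It runs before any heavy work and sends anything overly long, or containing markup, straight back to the homepage:

    <?php
    // Sketch: bounce absurd request URIs before any expensive processing runs.
    $uri = $_SERVER['REQUEST_URI'];

    if (strlen($uri) > 512 || strpos($uri, '<') !== false || strpos($uri, '>') !== false) {
        header('Location: http://website.com/', true, 301);
        exit;
    }
    ?>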

  3. This code caches 2 things - an rss feed, and images inside the rss feed. It places them both into a folder "c/" on the server. I want to keep the images cached basically forever, so I've set $cachelife to something like 20 years (it only shows 7200 seconds here in this script, though).

     

    What I'm trying to figure out is how to remove the caching of the rss files. The rss feeds get updated fairly often, and I'm stuck with ancient feeds because the feed cache uses the same lifetime as the image cache.

     

    In the script where it says "// now the rss cache file", there's gotta be a way to set it to 0, or no-cache somehow.

     

    I've been working on this all weekend and I've just about lost my mind. I really need to learn php better.

     

    <?php
    
    class cacheMgr {
    
    // how long is the cache lifetime - in seconds? 
    var $cachelife = 7200;
    
    /*
    * Constructor 
    *	- verifies that cache dir exists and is writeable
    *	- sets up images and pages dirs
    *	
    */
    function __construct() {
    
    // verify that the cache dir exists and is writeable...
    if(!file_exists( 'c' )) {
    	echo 'Cache directory "c" needs to be created for cache support to work';
    	exit;
    }
    if(file_exists( 'c' ) && !is_writeable( 'c')) {
    	echo  'Cache directory "c" exists, but is not writeable' ;
    	exit;
    }
    }
    
    /*
    * Checks to see if an image id is cached 
    *
    * @param $cacheid -  the image file to test - (ex: abcdefg.jpg)
    *
    * @return true if image exists in cache, false otherwise
    *
    */
    function is_image_cached( $cacheid ) {
    
    if( file_exists( 'c/' . $cacheid )) {
    
    	$cachetime = time() - $this->cachelife;
    	if(  filemtime( 'c/' . $cacheid ) < $cachetime ) {  // expired - blast it
    		unlink( 'c/' . $cacheid );
    		return false;
    	}
    	return true;
    }
    return false;
    }
    
    /*
    * Iterate over cached resources, removing any that have expired
    *
    * @return - true if the cache cleanup completes successfully
    *
    */
    function clean_cache( ) {
    
    // images first..
    $dh = opendir( 'c' );
    $cachetime = time() - $this->cachelife;
    
    while (false !== ($fname = readdir($dh) ) ) {
    
    	if( is_dir( 'c/' . $fname )) continue;  // ignore '.' and '..' (and any subdirectories)
    
    	if( filemtime( 'c/'. $fname ) < $cachetime ) {  // expired -- blast it.
    		unlink( 'c/'. $fname );
    	}		
    }
    
    closedir( $dh );
    
    // now the rss cache file
    $dh = opendir( 'c' );
    $cachetime = time() - $this->cachelife;
    while (false !== ($fname = readdir($dh) ) ) {
    
    	if( is_dir( 'c/' . $fname )) continue;  // ignore '.' and '..' (and any subdirectories)
    
    	if(strstr($fname, "rsscache" )) {
    		unlink( 'c/'.$fname);	
    	} 	
    
    }
    
    closedir( $dh );
    
    return true;
    }
    
    /*
    * Set the lifetime of the cache (in seconds) 
    *
    * @param $lifetime - seconds to keep the cache alive (3600 = 1hr)
    *
    */ 
    function set_lifetime( $lifetime ) {
    $this->cachelife = $lifetime;
    }
    
    /*
    * Get the lifetime of the cache (in seconds)
    *
    * @return $lifetime - returns the number of seconds to keep the cache alive (3600 = 1hr)
    *
    */
    function get_lifetime() {
    return $this->cachelife;
    }
    
    
    }
    
    
    ?>
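
    One idea I've been toying with (untested, and the function name is just my own) is to give anything whose filename contains "rsscache" its own, much shorter lifetime instead of $cachelife. Shown standalone here for the sketch, but it could live in the cacheMgr class:

    <?php
    // Untested sketch: a separate freshness check for the rss cache file,
    // so the 20-year image lifetime never applies to it. With $rsslife = 0
    // the cached feed is always treated as stale and gets refetched.
    function is_rss_cached( $cacheid, $rsslife = 0 ) {
    
    	$file = 'c/' . $cacheid;
    	if( !file_exists( $file )) {
    		return false;
    	}
    	if( filemtime( $file ) < time() - $rsslife ) {  // expired -- blast it.
    		unlink( $file );
    		return false;
    	}
    	return true;
    }
    ?>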

  4. I think I've figured out a few more php tidbits this weekend trying to get this to work. It turns out the suggestions all worked, and I'm now using the code by asmith.

     

    I'm not sure how to explain why it didn't work at first, but it had something to do with the parameters I was passing in. It worked perfectly when I hard-coded plain words in the script instead of a parameter, and I eventually got it working by switching to a different parameter.

     

    Much thanks!

  5. The script is just a feed parser. It pulls data from an rss feed, but I'm trying to make it pull a different feed based on whatever words are built into $full_key_word. When I try running the new code you made, it just pulls back an empty feed, which either means it's not working, or it's pulling back a feed for a $full_key_word that simply has no items.

     

    Is it possibly pulling this:

    http://blogsearch.google.com/blogsearch_feeds?hl=en&as_drrb=q&as_qdr=d&q=$full_key_word&ie=utf-8&num=10&output=rss
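
    If it helps, here's a rough sketch (the keyword value is only a placeholder) of how I think the url should be built, with the keyword interpolated and url-encoded, so I can echo it and see exactly what's being requested:

    <?php
    // Placeholder value - in the real script $full_key_word comes from elsewhere.
    $full_key_word = 'example search words';

    $feed_url = 'http://blogsearch.google.com/blogsearch_feeds'
              . '?hl=en&as_drrb=q&as_qdr=d'
              . '&q=' . urlencode($full_key_word)
              . '&ie=utf-8&num=10&output=rss';

    echo $feed_url;  // inspect the exact url being fetched
    ?>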

     

     
