Everything posted by drifter

  1. Hey, I am not sure if anyone cares, but I made some changes to my site. Hope someone will find this information useful.

Smarty -> Template Lite (modified)
Pear DB -> ezSQL (modified)

>>SMARTY>>
First I removed Smarty and went with Template Lite (on SourceForge). It is supposed to be a Smarty drop-in replacement, but that did not work so well; I spent about 5 hours debugging and changing features. I also stripped out all the caching and debugging code. My pages have very few static parts that are not changed by user preferences, queries, etc., so I do not cache. The biggest problems I had were with the sections and the append/merge - these were both there, but some errors crept in when they were ported from Smarty.

The result is that my pages are MUCH faster. After doing this, most of my pages (ones without heavy database work) load in 0.05-0.08 seconds. When running Smarty I was getting page times of 0.09-0.12, so this is quite an improvement. I have also noticed that since Smarty was so large, my APC is now able to cache a lot more pages with the extra space cleared out - not sure if this is part of the speedup. The new component is about 1/7th the size.

>>PEAR DB>>
Next I started work on removing Pear DB. I chose ezSQL as the replacement. You can find it [url=http://drifterlrsc.users.phpclasses.org]HERE[/url] - just search for ezsql. This class is much different than Pear DB, so I had to rework my code a bit. This change took me about 3 hours. Basically, every place I had a SELECT, I replaced $db->query() with $db->get_results(). Every place I had while($row = $res->getRow()), I replaced it with foreach($res AS $row). There are a few other tweaks, and functions I added to the class, such as one to get the insert ID. I did not use any of Pear DB's prepare functions or anything like that, so I did not have to worry about those. I do like having this in a separate class rather than having the mysql functions in my code, just in case I want to change DBs. With Pear that would have been easy, but even now I only have to rewrite about 10 small functions.

The results here were also far better than expected. With my average page speeds at 0.05-0.08 after getting rid of Smarty, I did not think I would get much faster. Well... now most of my pages load in 0.02-0.04 seconds - that is a lot faster. I noticed these speed increases even on pages with only one small query, which tells me that just loading Pear DB was the problem. On one page that returns a large record set (20 rows, about 100 columns), I noticed an even bigger speed increase: the page went from 0.30 to 0.17 using this new class. I am not sure if this is due to the way Pear DB handles the result set or what. This new class is also about 1/7th the size of the old one.

So overall, in about 8 hours, I dropped my page creation times by 66% just by getting rid of these classes. These page times are not from heavy load testing but from my live server; all page times are averages of all page views on a script over the course of a day. The other nice side benefit is that these new classes are so much smaller that it is not intimidating to get in there and change something if you need to. With the old classes, making a minor tweak left me with a feeling of "where do I even begin?" Hope someone finds this information useful.
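The conversion pattern described above can be sketched roughly like this (the data and the FakeEzSql stub are made up for illustration; the real ezSQL get_results() runs the query against MySQL and returns an array of row objects):

```php
<?php
// Old Pear DB style:
//   $res = $db->query('SELECT id, name FROM users');
//   while ($row = $res->getRow()) { ... }
//
// New ezSQL style: get_results() returns an array of row objects,
// so the while loop becomes a plain foreach.
class FakeEzSql {
    // Stand-in for ezSQL, returning canned rows instead of hitting MySQL.
    public function get_results($sql) {
        return array(
            (object) array('id' => 1, 'name' => 'alice'),
            (object) array('id' => 2, 'name' => 'bob'),
        );
    }
}

$db = new FakeEzSql();
$names = array();
foreach ($db->get_results('SELECT id, name FROM users') as $row) {
    $names[] = $row->name;   // rows are objects, not assoc arrays
}
print implode(',', $names);
```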
  2. that sounds like it may be very useful in the right spots.
  3. I was just doing some more MySQL tweaking, and I was wondering about the options to return a row as an object or an assoc array. What are the differences? I use assoc 95% of the time. Is it just a matter of preference, or is there really a difference? Thanks
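For illustration, the practical difference looks like this (sample row built by hand rather than fetched live; mysql_fetch_assoc() gives an associative array, mysql_fetch_object() a stdClass object, and the data is identical either way):

```php
<?php
// Same row, two access styles.
$assoc = array('id' => 7, 'name' => 'drifter');
$obj   = (object) $assoc;   // what the object-fetch variant hands back

echo $assoc['name'];  // array syntax
echo $obj->name;      // object syntax
// The choice is mostly preference. One small edge for assoc arrays:
// keys can contain characters (spaces, dashes, computed column names)
// that are awkward as object property names.
```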
  4. OK, I was looking at my code. I have had notices turned off for a while, but when I turn them on I get a lot of undefined indexes. Usually they come from things like assigning a $_GET['something'] to a variable for use later. So how bad are notices like these? Do they slow things down? Should I just turn notices back off and forget it? Would having hundreds of if statements slow things down even more than just leaving it? Thanks
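One common way to silence these notices is an isset() test before reading; a tiny helper (get_param here is a made-up name) keeps it from cluttering the code. In my understanding, isset() is a cheap language construct, so hundreds of these checks will generally not cost more than generating hundreds of notices:

```php
<?php
// Return a request parameter, or a default if it was never sent,
// without triggering an "undefined index" notice.
function get_param($key, $default = '') {
    return isset($_GET[$key]) ? $_GET[$key] : $default;
}

$_GET['page'] = '2';              // simulate an incoming request
$page = get_param('page', '1');   // present: '2'
$sort = get_param('sort', 'asc'); // missing: default 'asc' used
echo "$page $sort";
```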
  5. After much research, this newest plan seems the best: I am running Apache as nobody again and got APC going again. You can run your cron as nobody by editing the nobody crontab; you do need to remove nobody from your cron.deny list first.
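The steps above look roughly like this (paths are typical for this era of shared hosting and may differ on your system; the purge script path is made up):

```shell
# 1. Allow the nobody user to have a crontab:
#    remove (or comment out) the "nobody" line in /etc/cron.deny

# 2. As root, edit nobody's crontab:
crontab -u nobody -e

# 3. Example entry: run the cleanup script every 10 minutes as nobody,
#    so it has the same ownership as files the web server created:
#    */10 * * * * php /home/site/cron/purge_uploads.php
```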
  6. What if your mailer was open to header injection, people were using your form to spam, and your IP is now blocked because of it? Just a thought.
  7. OK, this is from the template engine.
[code]
// matches bracket portion of vars
// [0]
// [foo]
// [$bar]
// [#bar#]
$this->_var_bracket_regexp = '\[[\$|\#]?\w+\#?\]';

// matches $ vars (not objects):
// $foo
// $foo[0]
// $foo[$bar]
// $foo[5][blah]
$this->_dvar_regexp = '\$[a-zA-Z0-9_]{1,}(?:' . $this->_var_bracket_regexp
    . ')*(?:\.\$?\w+(?:' . $this->_var_bracket_regexp . ')*)*';
[/code]
So I want to tweak the last one so that it will accept $foo + $bar, or $foo - $bar[0], or $foo + 4. Basically, I want to take the same expression and be able to add zero or more of these expressions (or integers) to the end with +/- operations. I know this should be easy since most of it is there. Thanks, Scott
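One possible extension, built from the same pieces quoted above (a standalone sketch, not tested inside the full template compiler): after the first variable, allow zero or more "+/- followed by another variable or an integer" groups.

```php
<?php
// The engine's original building blocks, verbatim:
$var_bracket_regexp = '\[[\$|\#]?\w+\#?\]';
$dvar_regexp = '\$[a-zA-Z0-9_]{1,}(?:' . $var_bracket_regexp . ')*'
             . '(?:\.\$?\w+(?:' . $var_bracket_regexp . ')*)*';

// Extended: optional trailing "+ var", "- var[0]", "+ 4" terms.
$expr_regexp = $dvar_regexp
             . '(?:\s*[+\-]\s*(?:' . $dvar_regexp . '|\d+))*';

var_dump(preg_match('/^' . $expr_regexp . '$/', '$foo + $bar'));    // matches
var_dump(preg_match('/^' . $expr_regexp . '$/', '$foo - $bar[0]')); // matches
var_dump(preg_match('/^' . $expr_regexp . '$/', '$foo + 4'));       // matches
var_dump(preg_match('/^' . $expr_regexp . '$/', '$foo +'));         // no match
```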
  8. [quote]It's coming from the database in this instance, so should be fine, but good point.[/quote] I saw $postdate and I thought $_POST['date']... oops.
  9. Do not forget to check how users enter dates - you may also find mktime() useful.
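For example, mktime(hour, min, sec, month, day, year) builds a Unix timestamp from whatever pieces you parse out of user input, and it rolls overflow for you:

```php
<?php
// Build a timestamp for Dec 25, 2006 and format it back out.
$ts = mktime(0, 0, 0, 12, 25, 2006);
echo date('Y-m-d', $ts); // 2006-12-25

// Overflow is normalized: month 13 becomes January of the next year.
echo date('Y-m-d', mktime(0, 0, 0, 13, 1, 2006)); // 2007-01-01
```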
  10. OK, last week I posted about a problem I was having. Cron was running as my account name; the web server was running as nobody. The result: when I created a file via the web server, it could not be deleted by cron (something as simple as upload and purge). My hosting company said to go with phpSuExec. I did that last week - it took almost 20 hours to change permissions on all the files - and now I find out that if you are running phpSuExec, you can't run eAccelerator or APC! What the heck! So I fear this brings me back to my original problem: how do I make files writable by both cron and the web server? I know this is a very common task - people have clients upload files all the time and then delete them with cron. Thanks, Scott
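One detail worth noting for this problem: deleting a file requires write permission on its directory, not on the file itself. So a quick-and-dirty fix is mode 0777 on just the upload directory (a shared group with g+w is the safer option when the host allows it). A self-contained demo on a temp directory:

```php
<?php
// Make one directory writable by everyone, so a file created by the
// web server (nobody) can later be unlinked by the cron user.
$dir = sys_get_temp_dir() . '/upload_demo_' . getmypid();
mkdir($dir);
chmod($dir, 0777);           // any user can now create/delete inside

$file = $dir . '/upload.tmp';
touch($file);                // the "web server" creates the file
var_dump(is_writable($dir)); // the "cron job" can remove it

unlink($file);               // ...which is exactly what cron would do
rmdir($dir);
```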
  11. Are you talking about something like WHERE age='something' AND (race='a' OR height='c' OR field='d'), where you require the first condition and are happy with any from the second group?
  12. Well, the first thing about the query cache is that it stores an exact query, including limits, and I have very few truly duplicate queries. I do, however, have a lot of people who query page one (limit 0-20), then page 2 (limit 21-40), then view an item, say 15 (which is derived from the same query). I have noted an average of 2.5 pages of results viewed per search, with about 2 items per page - that is about 4.5 queries, plus 2.5 queries for the count, so a total of 6.5 different queries in the eyes of the MySQL cache, but really one query with different limits.

So what I am suggesting and testing is taking any query, LIMIT 1500, and dumping it all into a table. Then I can search that small table quickly using just limits - no joins, no = or !=, just a SELECT *. If the query cache would cache without the limits, that would help a lot... does it? Is there a setting or something?
  13. [quote author=mjdamato link=topic=119415.msg489073#msg489073 date=1166637810] In any event you could probably improve performance by redisigning your database, your queries, or both. Not to mention hardware upgrades. With your benchmarks (if done properly) you should be able to detemrmine where you bottlenecks are and where you could get the most performance gains. [/quote] I have been doing a lot of DB tweaking and query tweaking, and it has helped a lot. I project problems when my traffic is about 5 times what it is now, which I expect within a year. Really, I want to hold out until I can afford a second server and move my DB and some data files over there. I do still want efficiency, though - even if a company is HUGE they can buy more hardware, but you would rather only pay for 50 servers than 60.
  14. Well, I put the code in for a couple of hours, splitting the traffic based on IP between the new and old scripts, and the results were a lot bigger than I thought. At the busiest time I saw up to 70 extra tables in my DB - so if I went with this and was peaking out at 200+ extra tables, that does not cause a problem for MySQL, does it? (Note: times are with APC disabled.)
[code]
RUNS  avg time          median time  script  test_period
------------------------------------------------------------
 800  0.24488356305286  0.122019     ITEM    NEW
 592  0.44784119426952  0.1331      ITEM    OLD
------------------------------------------------------------
1251  0.60897496259279  0.205683    SEARCH  NEW
 440  0.82222866654735  0.616499    SEARCH  OLD
[/code]
  15. Well, I am debating... As long as I keep a middle class and do not have PHP mysql commands directly in my code, switching should not be too bad. I also have a script that does about 100,000 inserts/updates in a row. I use the query() method, and I have noticed that my memory keeps going up with every call - by about 4,000 lines I am at 50% RAM - so I ended up putting a raw mysql_query() in my code, which did not make me very happy. I looked, and I cannot really clean up what they have because it is so complicated and all the functions are tied together. I could just add another method for insert/update that is very simple - or, if I am going to go that far, should I just start with something like ezmysql from phpclasses.org and modify that to be a drop-in replacement?
  16. OK, I use the Pear DB class - I have for a long while, so I have many, many places that call it. I do, however, only call the basic methods such as query(); I do not use any of the stuff such as prepare(). So it really comes down to me using only about 25% of the functions. How much would I gain by trimming this down or writing a drop-in class that provides only the needed calls? Would that make a big difference? The same goes for Smarty - I only use a small part of it. I would really like to strip it down a lot, but it is a lot of work if it does not make a big difference. Thanks
  17. OK, I was using Diesel Test to put some load on my server and see what happens. Anyway, I ran it with 50 users for 5 minutes and got 7296 successful, 6357 HTTP errors, and 42 timeouts. Looking at the documentation, anything but a 200 code counts as an HTTP error, and I had the timeout at 1 minute. But I cannot find out which pages were giving errors - there is no place in this software that I can find. My guess is it may be because of permanent redirects and such that are not 200. I have never had any client report a failure of any kind, let alone almost 50% (the numbers were the same ratio with only 5 virtual users). So does anyone know how I can see what is giving errors in Diesel Test, or can someone recommend a good load-testing tool?
  18. SMS Anywhere - they have an API, about $4 a month. It works easily, and they give you code to modify.
  19. OK, so I was noticing the other day that the average visitor to my site goes through about 2.5 pages of search results - same search, just a different page number (limits) - and also views several items (which are passed through the same search criteria to verify permissions). So I had this idea: take any search that comes to my site, create a table from the SELECT, and write all of the results to it. On each call I check to see if there is a table holding that query, and if there is, I query that table instead. All tables expire after 10 minutes.

So the first time through I lose some speed because I create a new table, but on the additional search pages I gain a ton, because I no longer have a complex SQL query - I have a SELECT * FROM xxx LIMIT 20,20 on a much smaller table (twice: once for the results, once for the count). I set it up so all people on even IP addresses run on the old system and odd IPs run on the new one, so I can compare at the same time and account for server load. The results are an average gain of about 10% on pages that take about 0.2-0.3 seconds to generate.

So my question is (finally): is gaining the extra 0.02-0.04 seconds really worth all the extra complexity this brings to my code? What about on very busy servers? I am obviously not worried about visitors seeing pages 0.02 seconds faster; I am more wondering how much of a difference it can make as the server gets busier - and will it help on a busy server if searches are querying 20 different tables rather than all querying the one main table?
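The scheme described above can be sketched like this (table and column names are made up, and the SQL is only built as strings here; the real version would run them through the DB layer and drop tables older than 10 minutes via cron or a timestamp check):

```php
<?php
// Key a temporary results table on the full WHERE clause; md5 gives a
// fixed-length, table-name-safe identifier for it.
$where = "category = 5 AND price < 100 ORDER BY created DESC";
$cache_table = 'search_' . md5($where);

// First hit: materialize up to 1500 rows of the expensive query.
$create_sql = "CREATE TABLE $cache_table "
            . "SELECT * FROM items WHERE $where LIMIT 1500";

// Later pages: a trivial query against the small cached table.
$page2_sql = "SELECT * FROM $cache_table LIMIT 20,20";

echo $create_sql . "\n" . $page2_sql;
```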
  20. Actually, it is part of a filename for caching a query result. I could save the long string in the DB and generate a GUID for the file name, but that is just an extra step, and I am working on speeding things up and eliminating steps.
  21. OK, I have a long string that is really a bunch of characters identifying a query. It ranges from 20 characters up to about 100. The problem is that I cannot use a string longer than 64 characters, so I want to find a way to shorten it while keeping it unique - truncating is out. I had the idea to md5() the strings - that returns a fixed 32 characters, right? Obviously an md5 has a finite number of possible values, so there will be repeats eventually, but I do not have to worry about that with 100-character inputs, right? Is there a better way to do this? Thanks
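To confirm the md5() idea: it always returns 32 lowercase hex characters regardless of input length, so even a 100-character query signature fits under a 64-character limit. Collisions exist in principle, but with 2^128 possible digests the odds for one site's query strings are negligible:

```php
<?php
// A long-ish query signature (the string itself is just an example).
$long = str_repeat('category=5&price<100&sort=created&', 3);
$key  = md5($long);

echo strlen($key);        // always 32, no matter the input
echo "\n" . $key;         // 32 hex characters, safe for a filename
```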
  22. Is there a way to find out if a MySQL table exists in my DB?
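One common approach (sketched here; only the SQL string is built, since the real version needs a live connection): SHOW TABLES LIKE returns a row only if the table exists, so a zero row count means "no such table". The helper name table_exists_sql is made up:

```php
<?php
// Build the existence-check query for a given table name.
function table_exists_sql($table) {
    return "SHOW TABLES LIKE '" . addslashes($table) . "'";
}

echo table_exists_sql('search_cache');
// Real usage would be roughly:
//   $res = mysql_query(table_exists_sql('search_cache'));
//   $exists = (mysql_num_rows($res) > 0);
```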
  23. found out I can get around it by not using the *
  24. Do you at least have a start on some code? I learned a lot about AJAX from Google Maps - I know you do not want to make a map, but there is a whole community there focused entirely on AJAX. It may at least get you a start.