Anidazen Posted January 11, 2007

Hi guys. I've got a script I've worked very hard on, and it's a commercial script - it fetches prices from various sites while a user waits. Quite obviously, when you're paying big for customers, a script randomly failing is an absolute disaster. The script performs well 90% of the time, usually loading in under 10 seconds with all prices fetched.

However, sometimes it will randomly run up to 30 seconds and then die with "Insert Failed!" - which is the message I put in for SQL failure. I can only assume this means an SQL timeout or some such shenanigans. With the exact same parameters, the results literally vary from run to run - 90% smooth, 10% this random issue.

Is it something I can correct? How can I fix this? If it is random SQL timeouts, then 1) how can I improve that performance, and 2) how can I avoid the problem? I don't suppose there's a way to set my SQL timeout to something like 5 seconds just for certain files, then in case of failure just loop (with a max tries) until the bloody thing works...

Any thoughts? I'm used to all kinds of problems with PHP, but at least they're usually consistent! It does occur to me that because the script is very complex and processes outside info, I don't have 100% control over all variables... it could still be something external perhaps?
matto Posted January 12, 2007

Are you setting a timeout in your script? PHP defaults to a 30 second timeout (max_execution_time). This can be changed with:

[code]ini_set("max_execution_time", number_of_secs);[/code]
Anidazen (Author) Posted January 12, 2007

I'm assuming it must be an SQL timeout, as I eventually get an "INSERT FAILED!" message - which is what's attached to my mysql_query() or die("INSERT FAILED!"). No idea on random causes?

What I think I'm going to have to do is program some error detection for when this happens - presumably I could attach:

[code]or die("Insert failed" . mail("my email", "Script Died...", mysql_error() . "-/-/-/-/-/-/-/Original query: " . $query));[/code]

That would at least tell me if it was an SQL error? The problem is, it's so rare that I can't simply debug it...
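Spelled out a bit more, I suppose it would look something like this (just a sketch - the address and subject are placeholders):

[code]<?php
// Sketch only: capture the MySQL error and the original query before bailing out.
// "me@example.com" is a placeholder address.
$result = mysql_query($query);
if (!$result) {
    $error = mysql_error();
    mail('me@example.com', 'Script Died...', "Error: $error\nOriginal query: $query");
    die('INSERT FAILED!');
}
?>[/code]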
btherl Posted January 12, 2007

More detailed errors will definitely help. You might even want to log at important points throughout the script to a file, and then check that logfile later.

Such huge slowdowns are usually caused by network issues: DNS lookups or requests that time out. If none of those are present, then it's possible there's a database locking issue. MySQL with MyISAM tables can have locking problems if it's under heavy load, and the query may then fail if it can't obtain a lock. More detailed error messages should pick that up if that's the case.
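For the logging side, a minimal helper could be as simple as this (a rough sketch - the log path is a placeholder, so use somewhere writable):

[code]<?php
// Sketch: append a timestamped line to a logfile at key points in the script.
// "/tmp/pricefetch.log" is a placeholder path.
function log_step($message) {
    $line = date('Y-m-d H:i:s') . ' ' . $message . "\n";
    file_put_contents('/tmp/pricefetch.log', $line, FILE_APPEND);
}

log_step('starting remote price fetches');
// ... fetch prices, run queries ...
log_step('fetches complete, inserting into MySQL');
?>[/code]

Log before and after the remote fetches and the INSERT, and the timestamps will tell you which stage is eating the 30 seconds.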
Anidazen (Author) Posted January 12, 2007

Thanks for the advice. Is appending to a logfile like that all there is to it? Also - the only thing I know how to capture is mysql_error(). Should I just record all kinds of variables and everything for reference?
HuggieBear Posted January 12, 2007

Quoting Anidazen: "I've got a script I've worked very hard on, and it's a commercial script - fetching prices from various sites while a user waits."

For starters, you're getting data from other sites, so you're always going to be at the mercy of external constraints. You haven't posted any code (which I'm assuming is because it's a commercial script) so we can only guess, but if you're using fopen() to get the remote pages then my first suggestion would be to forget that and change to the cURL library. That allows you to set a time-out, and you can then act accordingly.

Regards
Huggie
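Something along these lines (a rough sketch - the URL and the timeout values are just examples):

[code]<?php
// Sketch: fetch a remote page with cURL and give up after a few seconds
// instead of letting the whole request hang. URL and timeouts are examples.
$ch = curl_init('http://example.com/prices');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);  // seconds allowed to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 10);        // overall limit for the whole request
$html = curl_exec($ch);
if ($html === false) {
    $error = curl_error($ch);  // act on the timeout/failure here
}
curl_close($ch);
?>[/code]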
Anidazen (Author) Posted January 12, 2007

It was once fopen(), then upgraded to cURL, and then to a semi-custom third-party class with higher performance still.

I put in those logs to email me when it crashes, and - apart from the stress of knowing 8 customers were let down today - I got some information. The errors were all one of the following two:

- Error: Lost connection to MySQL server during query
- Error: MySQL server has gone away

I'm assuming this is simply beyond my power to fix without changing host? In the meantime, is it possible to set a small max SQL timeout, then loop until it actually works?
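Roughly what I have in mind (just a sketch - the connection variables are placeholders for my real details, and three tries is an arbitrary cap):

[code]<?php
// Sketch of the retry idea: re-run the INSERT a few times, reconnecting if
// the connection was dropped. $db_host etc. are placeholders.
$tries = 0;
$ok = false;
while (!$ok && $tries < 3) {
    $ok = mysql_query($query);
    if (!$ok) {
        $tries++;
        // 2006 = "MySQL server has gone away", 2013 = "Lost connection during query"
        if (in_array(mysql_errno(), array(2006, 2013))) {
            mysql_connect($db_host, $db_user, $db_pass, true); // true forces a fresh link
            mysql_select_db($db_name);
        }
        sleep(1);
    }
}
if (!$ok) {
    die('INSERT FAILED: ' . mysql_error());
}
?>[/code]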
ShogunWarrior Posted January 12, 2007

Here are a few situations that could be the problem:

* Query data too large (too much info in your query, such as a lot of text/binary data).
* A problem with the server itself, like stress from other hosts on a shared box.

I don't think looping is a good idea, really. While it doesn't relate to MySQL, I think you should consider caching the off-site data for a period of time, which will seriously improve performance if your site is being accessed frequently enough - see the sketch below.
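As an illustration of the caching idea (a rough sketch - the cache path and TTL are examples, and fetch_remote() is a hypothetical stand-in for whatever fetch routine you already use):

[code]<?php
// Sketch: reuse remote data for a short window instead of re-fetching every time.
// fetch_remote() is a stand-in for the existing fetch code; path and TTL are examples.
function cached_fetch($url, $ttl = 600) {
    $cache_file = '/tmp/cache_' . md5($url);
    if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $ttl) {
        return file_get_contents($cache_file);   // fresh enough - serve from cache
    }
    $data = fetch_remote($url);                   // hypothetical helper
    if ($data !== false) {
        file_put_contents($cache_file, $data);    // refresh the cache
    }
    return $data;
}
?>[/code]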
Anidazen (Author) Posted January 12, 2007

Caching has always been a big "what if" decision with the site, and one we've gone against. The reason is that requests are split amongst 720 distinct queries, and there's no way to consolidate that. It's like a price comparison, but with 720 different pages of products - and 20 sellers to consider for each. The data also changes frequently. So until I'm getting more than 720 lookups a day, caching is a bigger load overall, and a real pain for my partners' sites, apart from anything else.

Size of the query data might just be the problem here - would moving from TEXT to BLOB help?

Edit: The query is an INSERT, with 3 small values (< 10 characters) and then one massive one. Here's a sample of the bigger one:

[code]".19.114.130','32.99~64.65~~155.98~302.98~448.83~591.86~~###23.27~46.21~~114.72~224.63~336.95~449.27~673.90~###~68.99~101.99~169.99~~502.99~665.99~990.99~###22.98~45.88~68.68~114.23~226.40~339.28~451.95~~###20.40~~59.42~97.09~192.25~285.49~376.85~559.63~923.38###31.99~54.99~76.99~122.99~227.99~337.99~445.99~654.99~###35.77~61.77~81.77~129.77~236.77~348.77~459.77~674.77~###~~~~~~~~###31.63~61.94~~149.20~289.42~428.50~564.74~~###~~~~~~~~###28.94~56.49~73.99~135.29~261.17~366.99~~~###22.98~44.98~67.98~112.98~224.98~~~~###26.09~47.95~67.16~103.99~192.19~278.09~365.63~~###23.59~47.18~70.77~117.95~231.18~343.23~452.93~679.39~1,108.73###23.29~46.58~69.87~116.45~228.24~340.62~451.83~670.75~###29.00~~79.08~131.80~263.59~395.39~527.19~~###~40.35~60.40~100.27~199.14~~397.48~595.02~987.72###21.50~42.50~63.50~105.50~210.50~313.50~417.50~625.50~###38.20~~~124.15~233.02~343.80~454.58~676.14~###~~~~~~~~###'"[/code]
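I guess I could check whether size is actually the issue with something like this (a sketch - it just runs against whatever connection is already open):

[code]<?php
// Sketch: compare the size of the INSERT to the server's max_allowed_packet
// before blaming the column type.
echo 'Query length: ' . strlen($query) . " bytes\n";
$row = mysql_fetch_assoc(mysql_query("SHOW VARIABLES LIKE 'max_allowed_packet'"));
echo 'max_allowed_packet: ' . $row['Value'] . " bytes\n";
?>[/code]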
ShogunWarrior Posted January 12, 2007

It should still be able to hold something like 2 MB by default, I think, so unless you're looking at a string length greater than half a meg you can probably scratch that off.