skarfr Posted June 11, 2009

Hello everybody,

I am developing a PHP script that parses a lot of remote RSS feeds and saves the RSS data in a local MySQL database. Every morning, a cron job launches this kind of function:

```php
$listRss = getAllRssURLFromByDB(); // fetch all the RSS URLs from my local database with an SQL statement
foreach ($listRss as $rssURL) {    // for each remote RSS URL
    $dataToPutOnLocalDB = parseRSS($rssURL);       // run a parser (cURL and co.) to get the remote data
    sql_insertDataOnDatabase($dataToPutOnLocalDB); // and save the data in my local database
}
```

My problem is that the execution takes too long. I initialized my server with these values:

```php
ini_set('mysqli.reconnect', 'On');
ini_set('max_allowed_packet', '128M');
ini_set('memory_limit', '128M');
ini_set('max_execution_time', '200');
```

But I still get errors like "Maximum execution time exceeded" and "MySQL server has gone away".

Eventually I will have hundreds of RSS URLs in my database (so it will take even longer). To resolve these errors, I thought about multithreading (like in C#, Java...).

Do you think that launching one thread per RSS parse (I mean the content of the foreach block) would improve performance and resolve errors like "Maximum execution time exceeded" and "MySQL server has gone away"? How would you do that in PHP (if you know a good class or a good tutorial...)? Do you have other ideas for resolving this kind of problem?

Thank you (and sorry for my English...)

Peter
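For context on the "one thread per RSS parse" question: PHP has no built-in threads, but the usual way to overlap the slow part of a loop like this (waiting on remote HTTP responses) is cURL's `curl_multi` API, which fetches many URLs concurrently in one process. The following is a minimal sketch, not the poster's actual code; the function name `fetchAllParallel` and its parameters are illustrative.

```php
<?php
// Sketch: fetch many feed URLs concurrently with curl_multi instead of
// one blocking curl request per loop iteration. The HTTP waits overlap,
// so total time is roughly that of the slowest feed, not the sum of all.
function fetchAllParallel(array $urls, int $timeout = 20): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout); // don't let one dead feed stall the run
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until none are still active.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for socket activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    // Collect the response bodies, keyed by URL.
    $bodies = [];
    foreach ($handles as $url => $ch) {
        $bodies[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $bodies;
}
```

After fetching, you would still parse and insert sequentially, which is fine since that part is fast compared to the network waits. The "MySQL server has gone away" error is a separate issue: it typically means the connection idled past the server's `wait_timeout` (or a packet exceeded the server-side `max_allowed_packet`, which `ini_set()` in PHP cannot change); calling `mysqli::ping()` before each batch of inserts, with `mysqli.reconnect` on, is one common workaround.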