chrisrusso

Hi, I'm Chris, first time here. I usually post on Stack Overflow, but I think this is a bit more complex than usual, and I believe I really need a PHP expert to solve it.

We have an application that crawls the web. Two main PHP scripts do the work, spawning processes with proc_open() and exec() and running over a circular reference. Both scripts that make up the main structure save each and every error to a custom error log, defined this way:

    //error_log
    @ini_set('error_reporting', -1);
    @ini_set('log_errors', 'On');
    @ini_set('display_errors', 'On');
    @ini_set('error_log', '/var/www/vhosts/xxx/xxx/resonance/such_a_mess');

The problem is simple: after running for some hours or days, the application stops crawling, and there is no error information at all in the error logs that could explain the problem or the reason it stopped. I've been trying to get more detail using New Relic and XHProf, with no luck.

There's no HTTP server involved in the execution of the scripts; as I mentioned, they are run with exec() and proc_open():

    exec("sh -c \"$cmd | logger\" > /dev/null &");

It has been about three months in the same situation, and to be honest, the only thing I want at this point is to see a fatal error in the logs, to understand what's going on.
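For what it's worth, here's the instrumentation I'm thinking of adding to both scripts next, just to force fatals into the log. This is a minimal sketch, not our actual code, and the log format is only a placeholder. The point is that fatal errors (E_ERROR, out-of-memory, max_execution_time) never reach a handler registered with set_error_handler(), but error_get_last() can still see them from a shutdown function:

    <?php
    // Sketch: log fatals from a shutdown function, since fatal errors
    // bypass set_error_handler() but remain visible to error_get_last().
    register_shutdown_function(function () {
        $err   = error_get_last();
        $fatal = [E_ERROR, E_PARSE, E_CORE_ERROR, E_COMPILE_ERROR];
        if ($err !== null && in_array($err['type'], $fatal, true)) {
            error_log(sprintf(
                '[FATAL] %s in %s:%d',
                $err['message'], $err['file'], $err['line']
            ));
        }
    });

The other gap I can see is the exec() line itself: stdout goes to /dev/null and stderr isn't captured anywhere, so if something outside PHP kills the child (the kernel OOM killer, for example), nothing is ever logged. Something like this would at least record the exit code and stderr. Again, just a sketch, and unlike the backgrounded exec() it blocks until the child exits:

    <?php
    // Sketch: run the same $cmd via proc_open() so its stderr and exit
    // code are captured instead of discarded. Exit code 137 means the
    // child was killed with SIGKILL (e.g. by the OOM killer), which
    // PHP itself can never log.
    $descriptors = [
        1 => ['file', '/dev/null', 'w'], // stdout: discard, as before
        2 => ['pipe', 'w'],              // stderr: capture
    ];
    $proc = proc_open($cmd, $descriptors, $pipes);
    $stderr = stream_get_contents($pipes[2]);
    fclose($pipes[2]);
    $exit = proc_close($proc);
    if ($exit !== 0 || $stderr !== '') {
        error_log("[child] exit=$exit stderr=$stderr");
    }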