[SOLVED] Why does my crawler script suddenly end with no error?


daydreamer


Hi.

 

I have written a web crawler script. It visits a large number of URLs with cURL.

 

After around 2-3 minutes of running, it will just stop, with no error output or notices.

 

I have these settings:

set_time_limit(0);
ini_set('display_errors',1);
error_reporting(E_ALL|E_STRICT);

 

Any ideas why it would just stop?


If it's a long-running task, it should almost certainly be run from the command line rather than from a web browser.

However, you do need to check the php.ini settings for the command line; it's quite possible that the CLI is running against a different php.ini file from the one your web server uses. Running a phpinfo() script from the command line should provide that information.
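As a sketch of the check suggested above (assuming the `php` CLI binary is on your PATH), you can ask the CLI directly which configuration file it loads, and inspect the limits that commonly stop a long-running script:

```shell
# Which php.ini does the CLI load? It is often not the same file
# that Apache/FPM uses, so browser-tested settings may not apply.
php --ini

# Grep the CLI configuration for the limits that commonly kill a
# long-running job (memory_limit, max_execution_time):
php -i | grep -iE 'memory_limit|max_execution_time'
```

If the relevant limits differ between the CLI and web configurations, that would explain a script behaving differently when run from the command line.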

