tibberous Posted October 22, 2007

I have a script that downloads a website every half-hour. The site is very large (I'm probably pulling 100 MB a day from it), and I'm afraid of my IP getting banned. It also looks a bit like I'm running a DoS attack against it, when I'm really just spidering it. Is there a reliable way to route each of my connections through a different proxy server? I'm using a curl-based library to randomize my User-Agent header, but I'm making a single hit to every page on the site every half hour, so it's pretty easy to tell what's going on.
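For the proxy question, a minimal sketch of the rotation idea: keep a list of proxies and pick one at random per request via `CURLOPT_PROXY`, alongside the randomized User-Agent the poster already uses. The proxy addresses and User-Agent strings below are placeholders, not anything from the thread — you would substitute proxies you actually control or rent.

```php
<?php
// Hypothetical proxy list -- replace with proxies you are allowed to use.
$proxies = [
    '203.0.113.10:8080',
    '203.0.113.11:8080',
    '203.0.113.12:3128',
];

// Build the curl option set for one request, routed through a randomly
// chosen proxy with a randomly chosen User-Agent string.
function buildCurlOptions(array $proxies, string $url): array
{
    $agents = [
        'Mozilla/5.0 (Windows NT 5.1) Gecko/20070725 Firefox/2.0.0.6',
        'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)',
    ];
    return [
        CURLOPT_URL            => $url,
        CURLOPT_RETURNTRANSFER => true,              // return body as string
        CURLOPT_PROXY          => $proxies[array_rand($proxies)],
        CURLOPT_USERAGENT      => $agents[array_rand($agents)],
        CURLOPT_TIMEOUT        => 30,
    ];
}

// Usage sketch (network call commented out):
$opts = buildCurlOptions($proxies, 'http://example.com/page.html');
// $ch = curl_init();
// curl_setopt_array($ch, $opts);
// $html = curl_exec($ch);
// curl_close($ch);
```

Note that free open proxies are unreliable, so a real spider would also want per-proxy failure handling and retry logic.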
Azu Posted October 22, 2007

First of all, it would help if you could check when each page was last modified. The server may report this in a header. Then only download a page if your copy of it is stale. That could cut the bandwidth usage A LOT, unless every single page really changes every half hour. As for proxying in curl: http://www.google.com/search?q=PHP+curl+proxy&start=0&ie=utf-8&oe=utf-8
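The header Azu is thinking of is `Last-Modified`, and the standard way to use it is a conditional GET: send `If-Modified-Since` with the timestamp of the copy you already have, and the server replies `304 Not Modified` with an empty body when nothing changed. A sketch assuming pages are cached as local files (the cache-file path and URL are illustrative, not from the thread):

```php
<?php
// Build an If-Modified-Since header from a cached copy's mtime.
// HTTP dates are always expressed in GMT.
function ifModifiedSinceHeader(int $mtime): string
{
    return 'If-Modified-Since: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT';
}

// Fetch $url, but only transfer the body if it changed since we last
// saved it to $cacheFile. Returns the fresh body, or null when the
// server answered 304 (or the request failed).
function fetchIfModified(string $url, string $cacheFile): ?string
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    if (is_file($cacheFile)) {
        curl_setopt($ch, CURLOPT_HTTPHEADER,
            [ifModifiedSinceHeader(filemtime($cacheFile))]);
    }
    $body = curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($code === 304) {
        return null;                          // unchanged: keep cached copy
    }
    if ($code === 200 && $body !== false) {
        file_put_contents($cacheFile, $body); // refresh the cache
        return $body;
    }
    return null;
}
```

curl can also do this natively via `CURLOPT_TIMECONDITION` / `CURLOPT_TIMEVALUE` instead of a hand-built header. Either way, this only saves bandwidth if the site's server actually honors conditional requests, which is worth verifying first with a single test request.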