anon Posted December 20, 2007
Hi, how would I go about making a web crawler that indexes URLs listed in a database? The crawler would be initiated by me, so it could be a standalone program, written in Perl or something similar.
Jessica Posted December 20, 2007
Uhm, if you want to write a program in Perl, you might want to check what forum you're in. For PHP, you could use cURL. What do you mean by "indexes URLs"?
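For example, a minimal fetch with PHP's cURL extension might look like the sketch below (this assumes the cURL extension is enabled, and the URL is just a placeholder):

<?php
// Fetch a single page with cURL (sketch only; assumes the cURL extension is enabled).
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow HTTP redirects
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // give up on slow hosts after 10 seconds
$html = curl_exec($ch);
if ($html === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
?>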
anon (Author) Posted December 20, 2007
Ok, here's the idea: there is a database containing links that the crawler must visit, and it should then index each file. How would I do this in Perl, or can I do it in PHP?
Ninjakreborn Posted December 20, 2007
You can do it in either language. It's called a "bot" or a "data harvester". There are multiple ways to do it in any language, but you are looking at 20-45 hours of programming (and knowing what you're doing) before having anything decent. There are third-party systems out there, and if you are trying to build one to resell, it will have to be better than most of what's already available (a time-consuming feat). If you just need one to use, Google for one. If you are doing it for learning, then Google "PHP link crawling tutorials".
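Putting the pieces together, a bare-bones version of that loop might look like the sketch below. The table name (links), its columns (id, url, content), and the PDO credentials are all made-up assumptions; this just illustrates the pattern of pulling URLs from a database, fetching each page with cURL, and storing the raw HTML back as a crude index.

<?php
// Sketch: crawl URLs stored in a database and save each page's HTML.
// Assumed (hypothetical) schema: table `links` with columns `id`, `url`, `content`.
$db = new PDO('mysql:host=localhost;dbname=crawler', 'user', 'pass');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$update = $db->prepare('UPDATE links SET content = ? WHERE id = ?');

// Only fetch links that have not been indexed yet.
foreach ($db->query('SELECT id, url FROM links WHERE content IS NULL') as $row) {
    $ch = curl_init($row['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    $html = curl_exec($ch);
    curl_close($ch);

    if ($html !== false) {
        $update->execute(array($html, $row['id'])); // store the raw HTML as a crude index
    }
    sleep(1); // be polite: pause between requests
}
?>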
anon (Author) Posted December 20, 2007
Oh dang. Will post results.