willg Posted April 5, 2007

Hi, I don't know if this is possible with PHP, so apologies if I posted in the wrong place.

I would like something that can "mass query" a search engine I've made. The thousands of queries to be made would be listed in a text file for the PHP to pick out. Then, the PHP would "harvest" the URLs in the resulting list of search results and save them in the same text file next to the associated query, separated from it by a semicolon (the URLs themselves would be separated by commas). The script should then move on to the next query in the list, and so on, until all the queries have been completed.

The query structure would be:

http://www.(website).com/<directory>/search.cgi?zoom_query=QUERY&zoom_per_page=5000

To help, URLs are always preceded by this code in the search results:

<div class="result_title"><a href="URL Here

I hope I made sense.
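To show what I mean, here's a rough, untested sketch of the loop I'm imagining (example.com stands in for the real site and directory, and queries.txt is just a name I picked):

<?php
// Untested sketch only: replace example.com/search with the real
// (website) and <directory> values. Needs allow_url_fopen enabled.
$base = 'http://www.example.com/search/search.cgi?zoom_per_page=5000&zoom_query=';

$lines = file('queries.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$out   = array();

foreach ($lines as $line) {
    // Keep only the query part in case a previous run already appended URLs.
    list($query) = explode(';', $line, 2);

    $html = @file_get_contents($base . urlencode($query));

    $urls = array();
    if ($html !== false) {
        // URLs are always preceded by this marker in the results.
        preg_match_all('#<div class="result_title"><a href="([^"]+)"#', $html, $m);
        $urls = $m[1];
    }

    // query;url1,url2,url3
    $out[] = $query . ';' . implode(',', $urls);
}

file_put_contents('queries.txt', implode("\n", $out) . "\n");
?>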
monk.e.boy Posted April 5, 2007

You have three problems to solve:

1. A list of URLs that you can add and remove items from.
2. An HTML parser and URL extractor.
3. An engine to co-ordinate things.

I suggest a database for 1., and cURL and regular expressions for 2. The engine will take a URL from the list and pass it to the parser. This will download the page and return the URL(s), and the engine will insert these URLs into the DB.

monk.e.boy
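For example, a rough version of 2. could look like this (the marker pattern is taken from the post above, and fetch_result_urls is just a name I made up):

<?php
// Hypothetical helper for step 2: download one results page with cURL
// and pull out every result URL with a regular expression.
function fetch_result_urls($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the page as a string
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    $html = curl_exec($ch);
    curl_close($ch);

    if ($html === false) {
        return array(); // download failed; the engine can retry later
    }

    preg_match_all('#<div class="result_title"><a href="([^"]+)"#', $html, $m);
    return $m[1];
}
?>

The engine would then pull each search URL out of the DB, call this on it, and INSERT whatever comes back.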