magnetica Posted October 7, 2007 I wondered if it's possible to get links from a webpage using HTML? Kind of like a bot that goes through the site, or would I have to use something like VB or C?
GingerRobot Posted October 7, 2007 What exactly do you mean? Do you mean there is a website which has lots of <a href=...> tags, and you would like to find those tags and get the addresses of the websites they link to? If so, then yes, it's completely possible. You'll first need to get the contents of the website, which can, in most cases, be achieved with the file_get_contents() function. With some websites, however, you will need to use cURL. Once you have the contents of the website in a string, you'll need to use regular expressions to find the links. You'll want the preg_match_all() or ereg() functions for that. Edit: I just noticed you said 'using html'. If that's what you meant, then no, it's not possible. I misread that as 'using php'.
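The approach above can be sketched as follows. This is a minimal, illustrative example only: the regex is a simplification (it handles plain quoted href attributes, not every valid HTML form), the sample HTML string is made up, and the function name gather_links is just a placeholder. For anything serious, a real HTML parser such as DOMDocument is more robust than regular expressions.

```php
<?php
// Hedged sketch of the technique described above:
// 1. Fetch the page into a string (file_get_contents or cURL).
// 2. Use preg_match_all() to pull out the href values of <a> tags.

// Hypothetical helper name; extracts href values from anchor tags.
function gather_links($html) {
    // Matches href="..." or href='...' inside an <a ...> tag.
    // Simplified: does not cover unquoted attributes or malformed HTML.
    preg_match_all('/<a\s[^>]*href=["\']([^"\']+)["\']/i', $html, $matches);
    return $matches[1];
}

// Illustrative input; for a live site you would instead do:
//   $html = file_get_contents('http://example.com/');
$html = '<a href="http://example.com/a">A</a> and <a href=\'/relative/b\'>B</a>';

print_r(gather_links($html));
```

Note that file_get_contents() on remote URLs requires allow_url_fopen to be enabled in php.ini; where it is disabled (or where the site needs cookies, headers, or POST data), cURL is the usual fallback.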