etrader Posted September 12, 2011

I want to parse a few small pieces of data out of a webpage, and I have a few questions:

1. Which is faster and better: cURL or file_get_contents?
2. Is it better to run a separate preg_match for every value I want, OR to first extract a shorter section (e.g. everything between the <head> tags) and then run the preg_match calls on that smaller string?
3. Right now I have to load the whole page before parsing. Is there a way to stop loading it early? For example, when I only want something inside <head>, stop before the <body> is downloaded?

Thank you in advance!
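To make it concrete, here is roughly the kind of thing I mean; the URL and the <title> pattern are just placeholders, not the real page or values I'm after:

```php
<?php
// Placeholder URL and pattern, only to illustrate the two approaches I'm comparing.
$html = file_get_contents('http://example.com/');

// Approach 1: run a separate preg_match over the whole page for each value.
preg_match('~<title>(.*?)</title>~is', $html, $m1);

// Approach 2: cut out the <head> first, then match inside that smaller string.
if (preg_match('~<head[^>]*>(.*?)</head>~is', $html, $head)) {
    preg_match('~<title>(.*?)</title>~is', $head[1], $m2);
}
```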
aruns Posted September 12, 2011

file_get_contents is the easiest way to fetch a page, but cURL is faster and gives you much more control over the request (timeouts, redirects, custom headers, SSL options).
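For example, a minimal sketch of both (example.com is a placeholder, and the cURL options shown are just common defaults, not anything site-specific):

```php
<?php
// Simplest: one call, but it needs allow_url_fopen enabled in php.ini.
$html = file_get_contents('http://example.com/');

// cURL: a bit more code, but you can set timeouts, follow redirects,
// send custom headers, and reuse the handle for many requests.
$ch = curl_init('http://example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects
curl_setopt($ch, CURLOPT_TIMEOUT, 10);           // give up after 10 seconds
$html = curl_exec($ch);
curl_close($ch);
```

For your third question: you can't tell the server to send only the <head>, but you can stop reading early on your side, e.g. open the URL with fopen(), read it in chunks with fread(), and break out of the loop as soon as the buffer contains </head>.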