Jeremysr Posted May 4, 2007

A couple of weeks ago I searched on Google for how to store a webpage in a variable, and it said to use file_get_contents(). I tried it and it worked. I remember the page also said some websites block it, and it showed some code for a workaround, but I ignored that part because plain file_get_contents() worked for me.

But NOW I need to use this function again (my program will get the URLs of a bunch of webcomics I want to download), and it says "permission denied" for the site I'm accessing. I tried searching Google and couldn't find the site I was at before, or anything else that worked. :-\ So, can someone here help me with this, please?
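For reference, the basic pattern being described is just a one-liner; a minimal sketch, where the URL is only a placeholder for one of the comic pages:

```php
<?php
// Minimal sketch: fetch a page into a string with file_get_contents().
// The URL below is a placeholder, not a real comic site.
$html = file_get_contents('http://example.com/comic.html');

if ($html === false) {
    // Some hosts reject requests that don't look like a browser,
    // which can surface as exactly this kind of denied/failed fetch.
    echo "Download failed\n";
} else {
    echo strlen($html) . " bytes fetched\n";
}
```

Note that file_get_contents() returns false on failure, so checking the return value with `===` is the way to tell a blocked request apart from an empty page.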
john010117 Posted May 4, 2007

Try setting directory permissions to 777 (but note that it'll cause a major security hole).
trq Posted May 4, 2007

> Try setting directory permissions to 777 (but note that it'll cause a major security hole).

He obviously has no control over these remote sites. You'll want to take a look at the curl extension. I would say you're getting denied errors because you are blatantly ripping content. With curl, you can make it look like you're an actual browser making a normal request.
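A minimal sketch of the curl approach being suggested: fetch the page while sending a browser-like User-Agent header, which is often enough to get past a naive block. The URL and the User-Agent string below are placeholders, not anything from the thread:

```php
<?php
// Minimal cURL sketch: request a page while presenting a browser-like
// User-Agent. URL and UA string are placeholders.
$ch = curl_init('http://example.com/comic.html');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return body as a string
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects
curl_setopt($ch, CURLOPT_USERAGENT,
    'Mozilla/5.0 (Windows NT 10.0; rv:109.0) Gecko/20100101 Firefox/115.0');

$html = curl_exec($ch);
if ($html === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
}
curl_close($ch);
```

CURLOPT_RETURNTRANSFER is what makes curl_exec() return the page as a string (like file_get_contents() does) instead of printing it straight to output.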
trq Posted May 4, 2007

PS: Nice avatar.