Search the Community
Showing results for tags 'file_get_contents'.
-
Hi, I am trying to grab some data from a site, but my blocker at the moment is that the site has a button near the end of the page that opens up more data. It's the button that says "Show All". Using file_get_contents, I don't get this "hidden data". Any ideas, please? I'm a novice and am using this code:

echo strip_tags(file_get_contents("http://www.sportinglife.com/football/live/vidiprinter"));

Thank you.
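For context, content revealed by a "Show All" button is usually injected by JavaScript after the page loads, so file_get_contents() never sees it — it only returns the initial server-rendered HTML. If the browser's network tab shows the extra rows coming from a separate request, that URL can be fetched directly. A sketch; the endpoint below is purely hypothetical:

```php
<?php
// Sketch: fetch the data endpoint behind the "Show All" button instead of
// the page itself. The endpoint URL here is HYPOTHETICAL — find the real
// one in the browser's developer tools (Network tab) when clicking the button.
$endpoint = 'http://www.sportinglife.com/football/live/vidiprinter/all'; // hypothetical
$html = @file_get_contents($endpoint);
if ($html !== false) {
    echo strip_tags($html);
}
```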
-
Hi, I have a few WordPress websites that I host for my clients. Because of the number of attacks they have been receiving lately, I have implemented a .htaccess file to block any IP address that's not in the whitelist. The problem I face is that every time a client moves from location to location, or their IP address changes, I have to update the .htaccess file with their new IP. I'm trying to build a script where they can access a URL with a key and submit their new IP address; the PHP script would then read the .htaccess and add the new IP. However, the echos in the file don't seem to be echoing any information to the screen and I'm faced with a blank white page. Can anyone give me any ideas on how to do this, or have a look at the script below that I have written?

<?php
if (isset($_GET['key']) && $_GET['key'] === '78J89ke93k93HJ883j003') {
    $htaccess = '.htaccess';
    // read the entire file
    $str = file_get_contents($htaccess);
    // remove "deny from all" from the file
    $str = str_replace('deny from all', '', $str);
    $ip = 'allow from ' . $_GET['ip']; // e.g. 'allow from 92.27.111.112'
    $str .= $ip;
    // re-add "deny from all" at the end of the file
    $str .= "\n" . 'deny from all';
    if (file_put_contents($htaccess, $str)) {
        echo 'IP ' . $ip . ' added to .htaccess file';
    }
} else {
    echo 'Invalid Key';
}
?>
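A hardened sketch of the same idea — the helper name and the filter_var() validation are additions for illustration, not part of the original script. Validating the submitted IP before writing keeps arbitrary text out of the .htaccess file:

```php
<?php
// Sketch only: validate the submitted IP before it ever reaches .htaccess,
// and keep the rewrite logic in a testable helper. The function name and
// the FILTER_VALIDATE_IP check are assumptions, not from the original post.
function add_allowed_ip(string $htaccess_body, string $ip): ?string
{
    if (!filter_var($ip, FILTER_VALIDATE_IP)) {
        return null; // reject anything that is not a real IP address
    }
    // Drop the trailing "deny from all", append the new allow rule,
    // then re-add the deny so it always stays last.
    $body = str_replace('deny from all', '', $htaccess_body);
    return rtrim($body) . "\nallow from " . $ip . "\ndeny from all";
}
```

file_put_contents('.htaccess', $new) would then run only when add_allowed_ip() returns a non-null string.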
-
I have two domains: http://domaina.com and http://domainb.com. I'm calling domainb.com's URL from domaina.com:

file_get_contents('http://domainb.com/a.php');

In http://domainb.com/a.php, I'm trying to create a session or cookie for domainb.com:

<?php
session_start();
setcookie("key", "value", time() + 3600);
$_SESSION["key1"] = "value1";
?>

But this code is not working. Please help in solving this issue. Thank you.
- 2 replies
Tagged with: curl, file_get_contents (and 2 more)
-
Hi to everyone, I'm new to the forum and I'm posting here because I ran into a logical problem in my next script's development. I need to get some data from external websites (with vBulletin boards), perfectly legal. Using file_get_contents I can print the page content on my server and then use jQuery's powerful selectors to get my data. The problem is that these data are shown only to logged-in users, so I would need this script (maybe using cURL?) to either log in to the external website and then persist the connection, or maybe, if the user executing my script is already logged in to that website, use his login? (most likely impossible, I think...) This is my code so far (found on some sites and merged into this):

<?php
$data = array('vb_login_username' => 'Scanu', 'vb_login_password' => 'grgfgrgrfbtgbt');

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.vbulletin.org/forum/login.php?do=login");
curl_setopt($ch, CURLOPT_AUTOREFERER, true);
curl_setopt($ch, CURLOPT_COOKIESESSION, true);
curl_setopt($ch, CURLOPT_FAILONERROR, false);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false);
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
$result = curl_exec($ch);
curl_close($ch);

$pattern = "#Set-Cookie: (.*?; path=.*?;.*?)\n#";
preg_match_all($pattern, $result, $matches);
array_shift($matches);
$cookie = implode("\n", $matches[0]);

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.vbulletin.org/forum/");
curl_setopt($ch, CURLOPT_COOKIE, $cookie);
curl_setopt($ch, CURLOPT_AUTOREFERER, true);
curl_setopt($ch, CURLOPT_COOKIESESSION, true);
curl_setopt($ch, CURLOPT_FAILONERROR, false);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false);
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_POST, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
$result = curl_exec($ch);
curl_close($ch);
echo $result;
?>

It just shows the same page as for unregistered users. Any help or advice is appreciated; I'm very new to this type of script.
Tagged with: curl, file_get_contents (and 3 more)
-
Hi, I am writing this code for a WordPress plugin that gets content from a JSON file and then decodes it:

<table border="1">
    <tr>
        <th>Thumbnail</th>
        <th>Title</th>
        <th>Excerpt</th>
    </tr>
    <?php
    $url = 'http://localhost/wordpress/json'; // the URL must be a quoted string
    $json = file_get_contents($url);
    $safe_json = str_replace("\n", "\\n", $json);
    $query = json_decode($safe_json, true);
    foreach ($query as $item) {
        echo '<tr>';
        echo '<td>' . $item['image'] . '</td>';
        echo '<td>' . $item['title'] . '</td>';
        echo '<td>' . $item['excerpt'] . '</td>';
        echo '</tr>';
    }
    ?>
</table>

Now I need help in paginating the data. Any help will be appreciated.
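On the pagination question, a minimal sketch that pages the decoded array server-side with array_slice(). The paginate() helper, the page size of 5, and the 'paged' query parameter are assumptions for illustration:

```php
<?php
// Sketch: page through the decoded JSON items. The helper name, the page
// size, and the 'paged' GET parameter are assumptions, not WordPress APIs.
function paginate(array $items, int $page, int $per_page): array
{
    $page = max(1, $page); // clamp so page 0 or negatives show page 1
    return array_slice($items, ($page - 1) * $per_page, $per_page);
}

$page = isset($_GET['paged']) ? (int) $_GET['paged'] : 1;
// $query is the json_decode() result from the snippet above:
// foreach (paginate($query, $page, 5) as $item) { ... render the row ... }
```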
- 2 replies
Tagged with: json_decode, file_get_contents (and 2 more)
-
OK, there is an issue with the Rotten Tomatoes API. It gives developers the ability to set a limit on the number of movies returned from a specific category (the movies in the box office, for example, are pulled dynamically), but when I set the limit to 50, or even 10 movies, it only returns an array containing 3 movies instead of 10 or 50 or whatever the limit in the API call is... What could be causing that? Is there maybe something wrong with the way I am using the file_get_contents function?

$box_office_movies = file_get_contents('http://api.rottentomatoes.com/api/public/v1.0/lists/movies/box_office.json?limit=50&country=us&apikey=[key]', FILE_USE_INCLUDE_PATH);
$box_office_movies = json_decode($box_office_movies, true);
var_dump($box_office_movies); // returns only 3 movies
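One way to narrow this down is to compare the count actually returned with the total the payload reports — if the API key's tier caps results, the two will diverge. A diagnostic sketch, using a stand-in JSON literal in place of the live response; the 'movies' and 'total' field names assume the documented response shape:

```php
<?php
// Diagnostic sketch: check how many movies came back versus what the API
// says exists. The JSON literal stands in for the live box_office response;
// in real use, $raw would be the file_get_contents() result from above.
$raw  = '{"total":3,"movies":[{"title":"A"},{"title":"B"},{"title":"C"}]}';
$data = json_decode($raw, true);
if (!is_array($data)) {
    die('decode failed: ' . json_last_error_msg());
}
$returned = isset($data['movies']) ? count($data['movies']) : 0;
$total    = isset($data['total']) ? $data['total'] : 'n/a';
printf("returned %d movie(s), API reports %s total\n", $returned, $total);
```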
-
My website involves loading several other pages at a time to produce results from all of them. The problem I had was loading times, with some websites taking up to 10 seconds whereas others would take less than 1 second. I ended up setting the timeout in php.ini to 1 second, which is fine for excluding results from slow-loading pages. But I would like to be able to load all 10 web pages with file_get_contents simultaneously, since the connection on my server is a good 100Mb. Is this possible? Thanks
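file_get_contents() itself is strictly sequential, but the curl extension can run the transfers concurrently via curl_multi, so one slow site only delays its own slot rather than the whole batch. A sketch; the helper name and the default timeout are assumptions:

```php
<?php
// Sketch: fetch many URLs in parallel with curl_multi. Each handle gets its
// own timeout, so one slow page cannot hold up the others.
function fetch_all(array $urls, int $timeout = 10): array
{
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }
    // Drive every transfer until all of them have finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running && curl_multi_select($mh) === -1) {
            usleep(10000); // nothing selectable; avoid a busy spin
        }
    } while ($running > 0 && $status === CURLM_OK);

    $results = array();
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch); // response body for this handle
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

Calling fetch_all($ten_urls, 10) returns an array keyed like the input, each entry holding that page's body (empty where a fetch failed).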
- 4 replies
Tagged with: file_get_contents, multiple (and 1 more)
-
Issue: retrieving data from remote HTTP locations fails with "failed to open stream: HTTP request failed!"

Doesn't work: fopen(), file_get_contents(), SoapClient('http://remote_wsdl')
Works: curl and wget via CLI & exec()
Tried: set all inis to allow_url_fopen = On

root@OW-WS01:~$ php -v
PHP 5.3.3-7+squeeze15 with Suhosin-Patch (cli) (built: Mar 4 2013 14:05:25)
Copyright (c) 1997-2009 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2010 Zend Technologies
    with Suhosin v0.9.32.1, Copyright (c) 2007-2010, by SektionEins GmbH

root@OW-WS01:~# php -i --php-ini /etc/php5/apache2/php.ini | grep 'fopen'
allow_url_fopen => On => On

root@OW-WS01:~# grep 'fopen' -R /etc/php5/
/etc/php5/cli/php.ini:; http://php.net/allow-url-fopen
/etc/php5/cli/php.ini:allow_url_fopen = On
/etc/php5/apache2/php.ini:; http://php.net/allow-url-fopen
/etc/php5/apache2/php.ini:allow_url_fopen = On

Code debug:

ini_set('default_socket_timeout', 10);
ini_set('display_errors', 1);
ini_set('error_reporting', E_ALL);
echo 'allow_url_fopen: ' . (ini_get('allow_url_fopen') ? 'TRUE' : 'FALSE') . '<br/>';
echo 'default_socket_timeout: ' . ini_get('default_socket_timeout') . '<br/>';

$Remote_XML = "valid_path.xml";
$todays_XML = __DIR__ . '/imports/' . date('Ymd', time()) . '.xml';

echo '<hr/><br/> Testing fopen()<br/>';
$from = fopen($Remote_XML, 'r');
$to = fopen($todays_XML, 'w+');
stream_copy_to_stream($from, $to);

echo '<hr/><br/> Testing file_get_contents()<br/>';
$fgc = file_get_contents($Remote_XML);
file_put_contents($todays_XML, $fgc);

echo '<hr/><br/> Testing SoapClient()<br/>';
try {
    $soap = new SoapClient('/?wsdl', array('trace' => TRUE));
    $soap->login('user', 'pass');
    echo 'soap passed';
} catch (SoapFault $e) {
    var_dump($e);
}

phpinfo(INFO_CONFIGURATION);

Results in image format: http://i.imm.io/10Hle.png

So, how do I overcome this? I have sniffed packets and these requests aren't even getting generated. This code works just dandy locally and on another server, so there's got to be something I haven't figured out yet. Any idea what could be causing this or what to try next? Thanks
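One thing that usually helps before digging deeper: make the wrapper report why it failed, rather than relying on the bare stream warning. A debug sketch — the target URL is a placeholder, and ignore_errors keeps the body even on a non-2xx status:

```php
<?php
// Debug sketch: capture the HTTP wrapper's real error instead of the bare
// "failed to open stream" warning. The URL below is a PLACEHOLDER.
$context = stream_context_create(array('http' => array(
    'timeout'       => 10,
    'ignore_errors' => true, // return the body even on 4xx/5xx responses
)));
$body = @file_get_contents('http://remote_host/valid_path.xml', false, $context);
if ($body === false) {
    // error_get_last() reports wrapper-level failures: DNS, proxy, firewall...
    $err = error_get_last();
    echo 'wrapper error: ' . ($err !== null ? $err['message'] : 'unknown') . "\n";
} else {
    // $http_response_header is populated by the HTTP wrapper on success,
    // e.g. "HTTP/1.1 200 OK".
    echo $http_response_header[0] . "\n";
}
```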
- 1 reply
Tagged with: soapclient, fopen (and 1 more)
-
I want to manipulate a Joomla component for my site. The component has no ability to do what I want. I create a DB row for the component for every new user, but I couldn't work out how to insert the correct value into the parent id column. I have to log in to the panel, open that category id, and just click the "Save and Close" button for every new user. It is a manual solution. But could I do it by running file_get_contents, like this?

$url = "/index.php?option=com_joomgallery&controller=categories&task=edit&cid=59"; // note: file_get_contents needs an absolute URL
$postdata = http_build_query(array(
    "usr" => "xxxx",
    "pass" => "xxx",
    "cid" => "59",
    "javascript" => "Joomla.submitbutton('save')"
));
$opts = array('http' => array(
    'method' => 'POST',
    'header' => 'Content-type: application/x-www-form-urlencoded',
    'content' => $postdata
));
$context = stream_context_create($opts);
$result = file_get_contents($url, false, $context);

Source code part:

<li class="button" id="toolbar-save">
    <a href="#" onclick="Joomla.submitbutton('save')" class="toolbar">
        <span class="icon-32-save"> </span>
        Save & Close
    </a>
</li>

I hope I explained the problem clearly.
categories.php
- 1 reply
Tagged with: php, file_get_contents (and 1 more)
-
Hi, I am new to PHP. Usually I use file_get_contents to get the content of a URL; however, somehow I am unable to get the content of the website below.

http://fullmovie-hd.com/jurassic-attack-hindi-dubbed-nowvideo/

I used http://onlinecurl.com/ to look at the result of curl, and the response header is:

HTTP/1.1 200 OK
Server: nginx
Date: Thu, 03 Sep 2015 02:37:29 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: keep-alive
X-Powered-By: PHP/5.4.35
Link: <http://fullmovie-hd.com/?p=42820>; rel=shortlink

I have tried various different things, like using curl, changing the init header, etc., but nothing worked. Can anyone please help? I am writing a scraper for Kodi and it will help everyone, as it will be a free scraper to use. Thanks in advance.
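Since the same URL answers 200 OK to curl, the server is probably keying on request headers. A sketch that makes file_get_contents() send a browser-like User-Agent via a stream context; the UA string and the timeout are assumptions:

```php
<?php
// Sketch: some hosts block or return empty responses to requests without a
// browser-like User-Agent. A stream context lets file_get_contents() send
// one. The UA string below is illustrative, not special.
$context = stream_context_create(array('http' => array(
    'method'  => 'GET',
    'header'  => "User-Agent: Mozilla/5.0 (X11; Linux x86_64)\r\n"
               . "Accept: text/html\r\n",
    'timeout' => 15,
)));
$html = @file_get_contents('http://fullmovie-hd.com/jurassic-attack-hindi-dubbed-nowvideo/', false, $context);
var_dump($html !== false ? strlen($html) : 'request failed');
```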