Posts: 3,584
Joined
Last visited
Days Won: 3
Everything posted by JonnoTheDev
-
Automatic script execution through proxy server
JonnoTheDev replied to JonasLAX's topic in PHP Coding Help
Does your script use CURL? -
Unfortunately you will probably have to scrap what you have done already and download a complete uploader script. Adding a progress bar isn't straightforward; you will see the complexity after downloading one.
-
It won't. You're missing about 75% of the required code.
-
[SOLVED] Curl not retrieving image properly
JonnoTheDev replied to scottybwoy's topic in PHP Coding Help
The other thing your code isn't doing is checking for cURL errors:

$errors = curl_error($ch);

Your function should only try to save a local file if there are no errors; your script just continues processing regardless. For instance, is your server able to resolve the URLs that you are feeding in? You may have a DNS issue on the server, or the remote host may have blacklisted your server's IP address. It could be anything. As I mentioned above, I would not even use cURL for this task.
-
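A minimal sketch of that check, assuming a hypothetical fetchImage() helper (the function name, URL, and save path are illustrations, not from the original post):

```php
<?php
// Hypothetical helper: only write the local file when curl_error() is empty.
function fetchImage($url, $savePath)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of echoing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    $data  = curl_exec($ch);
    $error = curl_error($ch);                       // empty string on success
    curl_close($ch);

    if ($error !== '' || $data === false) {
        return false;                               // DNS failure, blocked IP, timeout, etc.
    }
    return file_put_contents($savePath, $data) !== false;
}
```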
[SOLVED] Curl not retrieving image properly
JonnoTheDev replied to scottybwoy's topic in PHP Coding Help
Just run a test with a URL you have. Simple:

$image = file_get_contents("http://www.google.co.uk/intl/en_uk/images/logo.gif");
print $image;
// more code down here to save the image as a file on my server
-
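The "more code down here" step could look like this sketch; saveRemoteFile() is a made-up name, and the write is just file_put_contents():

```php
<?php
// Hypothetical sketch: fetch the bytes and write them to a local file.
// file_get_contents() accepts http:// URLs (with allow_url_fopen enabled)
// and local paths alike.
function saveRemoteFile($url, $dest)
{
    $data = file_get_contents($url);
    if ($data === false) {
        return false;               // fetch failed, nothing to save
    }
    return file_put_contents($dest, $data) !== false;
}
```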
You cannot do this with purely PHP; it requires an AJAX method. With 100MB you may also reach the web server timeout period or the PHP max upload limit. Check this out: http://uber-uploader.sourceforge.net/
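Before reaching for an uploader, you can read the limits mentioned above straight from the PHP configuration; a quick sketch:

```php
<?php
// The three settings that commonly break a 100MB upload. These are real
// php.ini directive names; the values on any given server will differ.
print "upload_max_filesize: " . ini_get('upload_max_filesize') . "\n";
print "post_max_size:       " . ini_get('post_max_size') . "\n";
print "max_execution_time:  " . ini_get('max_execution_time') . "\n";
```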
-
https://bugs.launchpad.net/ubuntu/+source/php5/+bug/343870 The only option is to comment out internals from the regParser() class and see if the script runs. You may need to post a bug: http://bugs.php.net/ I'm unfamiliar with this distro release. It could also be Apache.
-
Probably your script is causing this. Check the CPU usage on your server when it's running. I would guess you're draining RAM. I've only seen this once before, with a script that tried to process too much data. You need to post the code.
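One way to confirm a RAM drain from inside the script itself (a sketch, not from the original post):

```php
<?php
// Sample PHP's own memory usage; print these inside the suspect loop to see
// whether allocation climbs with each iteration.
print "current: " . memory_get_usage(true) . " bytes\n";
print "peak:    " . memory_get_peak_usage(true) . " bytes\n";
```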
-
[SOLVED] Curl not retrieving image properly
JonnoTheDev replied to scottybwoy's topic in PHP Coding Help
CURL will just browse to the page, not save the image. You should be using fopen() or file_get_contents() for this operation. -
Make sure that the script has the file included where this function resides.

include('path/to/function/file.php');
// rest of script
-
[SOLVED] Curl not retrieving image properly
JonnoTheDev replied to scottybwoy's topic in PHP Coding Help
Why use cURL for this job? wget is far easier:

exec("wget http://www.xyz.com/images/image.jpg");
-
You have no name for the select list value:

<select id="select1">

Should be:

<select id="select1" name="con_id">

Then the selected value will be in $_POST['con_id']
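On the receiving side, a hedged sketch of reading that value (readConId() is a hypothetical helper; the int cast assumes the value is a numeric id):

```php
<?php
// Hypothetical helper: pull con_id out of the submitted form data,
// falling back to 0 when the field is missing.
function readConId(array $post)
{
    return isset($post['con_id']) ? (int) $post['con_id'] : 0;
}
```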
-
[SOLVED] Properly Evaluating mysql_num_rows()
JonnoTheDev replied to hellonoko's topic in PHP Coding Help
$rows = mysql_num_rows($query);
$rows = ($rows) ? 1 : 0;
-
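The ternary simply collapses any positive row count to 1. As a standalone sketch (rowsFlag() is illustrative; note that mysql_num_rows() must be given the result resource returned by mysql_query(), not the SQL string):

```php
<?php
// Illustrative: any non-zero count becomes 1, zero stays 0.
function rowsFlag($numRows)
{
    return ($numRows) ? 1 : 0;
}
```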
All looks OK. I would write the SQL within your session class to a log file for test purposes. Run your login page and then check the log file to see what was written.
-
Use a URL parameter, i.e. post.php?id=1. On post.php the value is stored in $_GET['id'];
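A minimal sketch of that read, with a guard for when the parameter is absent (the default of 0 is an assumption for illustration):

```php
<?php
// post.php - hypothetical sketch: read the id from the query string,
// falling back to 0 when it is missing (e.g. when run from the CLI).
$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
```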
-
You require callback functions for all session actions, i.e.

<?php
// override php default session handler
session_set_save_handler('sessionOpen', 'sessionClose', 'sessionRead', 'sessionWrite', 'sessionDestroy', 'sessionClean');

// open session
function sessionOpen($savePath, $sessionName) { }

// close session
function sessionClose() { }

// read session data from database using the sessionId
function sessionRead($key) { }

// write a session value to database
function sessionWrite($key, $val) { }

// destroy a session - remove from database
function sessionDestroy($key) { }

// clean up old sessions - remove all sessions past a certain time
function sessionClean($maxlifetime) { }
?>
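To see the callback shapes in action without a database, here is a sketch backed by a plain array standing in for the sessions table (the array store is an assumption for illustration only; a real handler would run SQL in each callback):

```php
<?php
// Array standing in for the database table the real handlers would query.
$GLOBALS['sessionStore'] = array();

function sessionOpen($savePath, $sessionName) { return true; }
function sessionClose() { return true; }

// read: return the serialised data for this session id, '' if unknown
function sessionRead($key)
{
    return isset($GLOBALS['sessionStore'][$key]) ? $GLOBALS['sessionStore'][$key] : '';
}

// write: upsert the row keyed by session id
function sessionWrite($key, $val)
{
    $GLOBALS['sessionStore'][$key] = $val;
    return true;
}

// destroy: delete the row for this session id
function sessionDestroy($key)
{
    unset($GLOBALS['sessionStore'][$key]);
    return true;
}

// gc: a real implementation would delete rows older than $maxlifetime
function sessionClean($maxlifetime) { return true; }

session_set_save_handler('sessionOpen', 'sessionClose', 'sessionRead',
    'sessionWrite', 'sessionDestroy', 'sessionClean');
```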
-
Your code to delete the file should not be contained within the loop. It should be at the very top of the page, before you start printing anything to the screen. Once you have deleted the file, reload the page. Check my post above. It would also be a lot easier to use a standard hyperlink to action the delete rather than a form, which is not needed as the other posts suggest; however, it is up to you.
-
This is not the issue at all. You cannot call a PHP function within a JavaScript event handler. All you are doing is looping through the files, and the function unlink() will remove each file. You should replace it with a hyperlink that calls an action when clicked. Example code, i.e.

<?php
// user has clicked a delete hyperlink
if(isset($_GET['action']) && $_GET['action'] == 'delete') {
	// basename() stops the parameter escaping the current directory
	unlink(basename($_GET['filename']));
	header("Location:files.php");
	exit();
}
?>
<a href="files.php?action=delete&filename=xyz.jpg">delete file</a>
-
[SOLVED] header sent to wrong page... wierd one i dont understand
JonnoTheDev replied to jesushax's topic in PHP Coding Help
header() should be followed by exit(). If you want the script to stop processing and redirect the user you should use the following:

header("Location:index.php");
exit();
// this code will not be executed
$x = "Hello World";
print $x;
-
How to get a toolbar for an email form ?
JonnoTheDev replied to tmyonline's topic in Javascript Help
http://www.google.com/search?q=javascript+html+editor -
Change this line:

$headers = "From: $email";

To:

$headers = "From: ".$_REQUEST['email'];

And that will give you the return address in your email client.
-
You can't just add extra parameters into a function. You aren't including the fields in the message. Build up the message first:

$emailBody = "Name: ".$_REQUEST['name']."\n";
$emailBody .= "Phone: ".$_REQUEST['phonenumber']."\n";
$emailBody .= "Email: ".$_REQUEST['email']."\n\n";
$emailBody .= $_REQUEST['message']."\n";
mail("myemail@mydomain.com", "Contact Enquiry", $emailBody, $headers);
-
How to get a toolbar for an email form ?
JonnoTheDev replied to tmyonline's topic in Javascript Help
http://www.fckeditor.net/ -
how to build up a object relational database
JonnoTheDev replied to friedemann_bach's topic in MySQL Help
It's possible that this could work; however, it is an unorthodox approach. I would change the type, table1, and table2 fields to integers and separate them into tables of their own, as you are creating duplicate rows of information and this is not normalised. So:

relations
=======
id
type
table1
table2
id1
id2

tables
=======
tableId
name

types
=======
typeId
name

Example data:

tables
=======
tableId: 1
name: persons
tableId: 2
name: locations

types
=======
typeId: 1
name: lives in

relations
=======
id: 1
type: 1
table1: 1
table2: 2
id1: 1
id2: 2

With your approach, in your queries you must know the table names that you will be joining the id1 and id2 fields to beforehand, or you will end up doubling the number of queries you make to fetch the desired results.
-
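A sketch of what a query against that normalised layout might look like; the table and column names are assumptions carried over from the example, and the join targets (persons, locations) still have to be chosen in code before querying:

```php
<?php
// Hypothetical join against the normalised layout: relations links a row in
// persons (id1) to a row in locations (id2) via a named relation type.
$sql = "SELECT p.name, t.name AS relation, l.name AS location
        FROM relations r
        INNER JOIN types t ON t.typeId = r.type
        INNER JOIN persons p ON p.id = r.id1
        INNER JOIN locations l ON l.id = r.id2";
```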
I was not talking about adding a fulltext index to your tables' text fields, but about using a full-text search engine instead of using your database at all. This builds an index from your database tables, and that index gets searched rather than the database, making it much faster and taking the load away from the database.

Think of Google. When you search Google you are searching their index. You are not searching through a database containing millions of URLs (just think how long that would take to return results).

If you want to stick with your hosting then it should be possible for you to upload the Zend Framework to your webspace. Then I would make use of Lucene. http://framework.zend.com/manual/en/zend.search.lucene.html

Up to you, but creating tables with thousands of records and using the LIKE operator will be slow. If this is a search field on your website, your users may be sat there waiting a long time for results.