u2xs Posted October 11, 2013

I use PHP to bridge between an obscure programming language (AutoHotKey) and a MySQL backend. This leads me to two questions:

1) Should I be concerned about latency? There are times when I have to make a few hundred inserts in a row, and then there's only a short pause before a few hundred more inserts.

2) I'm using a system that seems like it's not the best way to use PHP to bridge the gap: I pass the queries as URL-encoded HTTP requests to get and set information. So this query:

INSERT INTO testTable (id, channel, status, datetime) VALUES ('1', '23', '3', NOW());

translates into this URL:

http://Mywebsite.com/php/ahk2sql.php?key=01010AuthenticationKey0101010&query=INSERT%2520INTO%2520testTable%2520%28id%2C%2520channel%2C%2520status%2C%2520datetime%29%2520VALUES%2520%28%271%27%2C%2520%2723%27%2C%2520%273%27%2C%2520NOW%28%29%29%3B

Is there a better way with PHP?
kicken Posted October 11, 2013

It'd probably be better if you explained more about what you are trying to accomplish and why. An insert can take a little time, so depending on how fast you need them to process, yes, you may have issues with latency. The second part of your question is pretty much impossible to answer without knowing more about your overall goals.

This sounds more like a PHP/general programming question than a MySQL question, so it probably belongs in another forum, but I'm not sure which one yet; we need more info.
requinix Posted October 11, 2013

Don't put SQL in your URLs. Doesn't matter if it's only executed when authenticated, doesn't matter if you include a hash to validate the string: don't do it. If you need to insert data, pass the data by itself:

ahk2sql.php?key=01010AuthenticationKey0101010&action=test&id=1&channel=23&status=3
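A minimal sketch of what that could look like on the PHP side, assuming PDO is available (the ahk2sql.php name, key value, action=test and the testTable columns come from this thread; the connection details are placeholders, and a real script would also validate the incoming values):

<?php
// ahk2sql.php -- rough sketch of the "pass the data, not the SQL" approach.
// The DSN, credentials and the shared key value here are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Reject requests that don't carry the expected key.
if (!isset($_GET['key']) || $_GET['key'] !== '01010AuthenticationKey0101010') {
    header('HTTP/1.1 403 Forbidden');
    exit('forbidden');
}

// Whitelist the action; 'test' maps to the testTable insert from this thread.
if (isset($_GET['action']) && $_GET['action'] === 'test') {
    $stmt = $pdo->prepare(
        'INSERT INTO testTable (id, channel, status, datetime) VALUES (?, ?, ?, NOW())'
    );
    // The values arrive as ordinary query-string parameters, never as SQL text.
    $stmt->execute(array($_GET['id'], $_GET['channel'], $_GET['status']));
    echo 'ok';
}

This way the only SQL that ever runs is the statement hard-coded in the script, and the client just fills in values.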
vinny42 Posted October 12, 2013

If you are planning to insert a few hundred records using separate URLs, then you'll force PHP to open and close its connection to MySQL for every record. That adds a delay of 20-30ms per record, which quickly adds up.

If you have a dedicated server you can try using pconnect (do not *ever* use that on a shared server), which will keep the connection open when the PHP script ends, saving the time to open the connection again later. Again, do not use pconnect on a server that you do not control.

Inserting records quickly works best if you can do it in batches. If you can send a POST request that holds the data for 1000 inserts at a time, you can use transactions to speed the inserts up, as in the sketch below. If all the records go into the same table, you can make it even faster by writing the data to a CSV file and using MySQL's LOAD DATA INFILE, which will copy the data from the CSV directly into the table.
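As a rough illustration of the transaction idea (the table and columns are reused from the example earlier in the thread; the $rows array and connection details are made up for the sketch):

<?php
// Sketch: many inserts over one connection, wrapped in a single transaction.
// $rows stands in for whatever batch of data the client sent.
$rows = array(
    array('1', '23', '3'),
    array('2', '24', '1'),
    // ... hundreds more
);

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->beginTransaction();
$stmt = $pdo->prepare(
    'INSERT INTO testTable (id, channel, status, datetime) VALUES (?, ?, ?, NOW())'
);
foreach ($rows as $row) {
    $stmt->execute($row);   // the prepared statement is reused for every row
}
$pdo->commit();             // one commit for the whole batch

For the single-table case, dumping the same rows to a CSV with fputcsv() and running LOAD DATA INFILE is usually faster still, at the cost of a temporary file.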
u2xs Posted October 14, 2013 (Author)

Replying to kicken's request for more detail about my goals: I'm running data-mining software that can return anywhere between 1 and 100 results. With this HTTP method, I'm noticing that my server does not process the request if the URI is longer than 2000 characters. For whatever reason, at around 1000 characters it stopped showing an error message, but the multiple INSERT commands were not actually inserting into the db. When I broke it down into individual requests, it did work. My only concern was that the volume of requests might affect the server/db adversely.

requinix: I appreciate the concern about putting SQL in the URL. Thanks.

vinny42: Ah, the explanation of connection overhead and batching makes sense to me, thanks.
vinny42 Posted October 14, 2013

"I'm noticing that my server does not process the request if the URI is longer than 2000 characters."

That's because a URL usually cannot be longer than about 2000 characters. If you need to send more, you should use the POST method, so you can create an HTTP body that holds as much data as you need, hundreds of megabytes if you want.
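For example, the PHP end could read a JSON batch out of the POST body roughly like this (the payload shape and field names are just assumptions for the sketch, not anything from the thread):

<?php
// Sketch: accept a JSON array of records via POST instead of a long URL.
// Assumed client payload: [{"id":"1","channel":"23","status":"3"}, ...]
$records = json_decode(file_get_contents('php://input'), true);

if (!is_array($records)) {
    header('HTTP/1.1 400 Bad Request');
    exit('expected a JSON array of records');
}

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->beginTransaction();
$stmt = $pdo->prepare(
    'INSERT INTO testTable (id, channel, status, datetime) VALUES (?, ?, ?, NOW())'
);
foreach ($records as $r) {
    $stmt->execute(array($r['id'], $r['channel'], $r['status']));
}
$pdo->commit();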