cytech

Members
  • Posts: 87
  • Joined
  • Last visited: Never

About cytech
  • Birthday: 10/29/1985

Contact Methods
  • Website URL: http://www.cytech-services.com

Profile Information
  • Gender: Male

cytech's Achievements
  • Member (2/5)
  • Reputation: 0

  1. Hey All, I'm working with Amazon's payment system and I have a 'success url' set for Amazon that sends the user to: mydomain.com/transactions/receivetrans That works just fine; when Amazon sends the user back they pass along tons of GET variables, one of them being the confirmation location for the security file: https%3A%2F%2Ffps.sandbox.amazonaws.com%2Fcerts%2F090909%2FPKICert.pem This throws my controller for a loop and I get: Error: 403.shtmlController could not be found. If I remove that particular value from the return manually, the controller works great and shows the user a thank-you message like it should. Controller for that particular view: function receivetrans() { } Just a simple show-a-page-and-say-thanks... Nothing crazy in there haha. I'm not sure why the above variable would throw off Cake - I'm assuming the multiple / / - but it's not like I can change how Amazon sends their info. Any insight from a guru??
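For what it's worth, the failure can be reproduced outside CakePHP: the percent-encoded value decodes to a URL full of slashes, and if the server or router decodes it before dispatch, each `/` is treated as a path separator. A minimal sketch in plain PHP (no framework needed) of what Amazon sends versus what the dispatcher ends up seeing:

```php
<?php
// The certificate location exactly as Amazon appends it to the return URL:
$encoded = 'https%3A%2F%2Ffps.sandbox.amazonaws.com%2Fcerts%2F090909%2FPKICert.pem';

// Once decoded, the slashes come back - a router that splits the request
// path on '/' will then see bogus controller/action segments.
$decoded = rawurldecode($encoded);
// $decoded is 'https://fps.sandbox.amazonaws.com/certs/090909/PKICert.pem'

// One defensive option is to re-encode the value before it is passed
// along internally, so a single round of decoding leaves it intact:
$safe = rawurlencode($encoded);
```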
  2. That's not a bad idea.. I was reading though, and people were saying the overhead of using ON DUPLICATE was a dangerous matter? Is this true? They were saying that if the entry is found, ON DUPLICATE removes it and then inserts a new one, which in turn is 2 queries. Or am I just overthinking this and I should not worry about it..
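For reference, the delete-then-insert behaviour described above is actually what REPLACE INTO does; INSERT ... ON DUPLICATE KEY UPDATE modifies the existing row in place and stays a single statement. A sketch of the two forms (the stats table and its columns are invented for illustration):

```php
<?php
// REPLACE INTO: on a unique-key collision, MySQL deletes the old row
// and inserts a new one - two row operations under the hood.
$replace = "REPLACE INTO stats (user_id, hits) VALUES (7, 1)";

// ON DUPLICATE KEY UPDATE: the matching row is updated in place,
// still one statement and one round trip to the server.
$upsert = "INSERT INTO stats (user_id, hits) VALUES (7, 1)
           ON DUPLICATE KEY UPDATE hits = hits + 1";
```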
  3. Hello, The thing is you have one form wrapping all of the "edit" submit buttons, so it will always submit the last Num_Pages entry instead of the one you clicked. What I suggest is removing the form from the script and adding a textual link, like so: echo "<a href=\"count.php?pages=".$data->Num_Pages."\">Edit ".$data->Num_Pages."</a>"; The above would replace: <input type='submit' value='Edit' name='editbutton'></form>
  4. Hey nafetski, Thanks for the ideas. ON DUPLICATE seems like the best option in this situation, however I have two unique keys and not just one. REPLACE INTO, even though it would be a single statement, has its downfall: if the record already exists (which most do), it takes 2 row operations to "update" it, so after all is said and done it would be 3 operations for one entry. Even though that's faster than doing 3 queries yourself, if I can get it down to a single query on both sides it's great. What I have now does one insert query into a temporary table (this takes 1-2 seconds, if that). I then do an update from the temporary table to the live table in one update command (this still takes a lot of time - just so many records). Then those that were not updated from the temporary table I insert into the live table. Then truncate the temp table and move onto a new file. This works great, except for when you have 30k updates to do haha. Oh well.
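The staging-table flow described above can be sketched as the four statements involved (table and column names are invented; the real schema will differ):

```php
<?php
// Hypothetical schema: live(sku PRIMARY KEY, qty) and staging(sku, qty).
$queries = array(
    // 1. Bulk-load the batch into the staging table (fast).
    "INSERT INTO staging (sku, qty) VALUES ('A1', 5), ('B2', 9)",

    // 2. Update rows that already exist in the live table, in one go.
    "UPDATE live JOIN staging ON staging.sku = live.sku
     SET live.qty = staging.qty",

    // 3. Insert staging rows that matched nothing in the live table.
    "INSERT INTO live (sku, qty)
     SELECT s.sku, s.qty FROM staging s
     LEFT JOIN live l ON l.sku = s.sku
     WHERE l.sku IS NULL",

    // 4. Clear the staging table before the next file.
    "TRUNCATE TABLE staging",
);
```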
  5. Well, the suggestions put out have helped overall with performance, but when updating 5-10k records it's still VERY slow. This is just due to the database. The performance hit isn't with inserts but with the update commands, so I re-did it so I could do the update command in one go instead of making multiple calls. This seems to have improved performance slightly. However, when doing thousands of records it's still very slow. Not sure there's a way around this one. Once again, thank you all for your help.
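One common way to fold many single-row updates into one statement (which may be what "in one go" means here - the original code isn't shown) is a CASE expression keyed on the id. A hypothetical builder, with made-up table and column names:

```php
<?php
// Hypothetical batch of updates: item id => new qty.
$batch = array(3 => 10, 7 => 25, 9 => 4);

// Build one CASE arm per row; casting to int stands in for escaping.
$cases = '';
foreach ($batch as $id => $qty) {
    $cases .= " WHEN " . (int)$id . " THEN " . (int)$qty;
}
$ids = implode(',', array_keys($batch));
$sql = "UPDATE items SET qty = CASE id" . $cases . " END WHERE id IN (" . $ids . ")";
// One statement updates all three rows instead of three round trips.
```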
  6. Very interesting... I really appreciate the input mjdamato - I will give this a go.
  7. mysql_query("SELECT * FROM nybygninger WHERE id='".$_GET['itemid']."'"); Be sure to add "." - the period - before and after the variable. It tells PHP that you want to append the variable to the string. Also be sure to use mysql_real_escape_string on your variables when collecting them from $_GET; if you don't, it will open you up to some nasty SQL injection attacks. So: $itemid = mysql_real_escape_string($_GET['itemid']); mysql_query("SELECT * FROM nybygninger WHERE id='".$itemid."'"); Hope that helps
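The concatenation itself can be seen in isolation (no database needed; the request parameter is simulated here, and the escaping step is omitted because mysql_real_escape_string requires an open connection):

```php
<?php
// Simulate the incoming request parameter:
$_GET['itemid'] = '42';

// The '.' operator splices the variable into the SQL string:
$itemid = $_GET['itemid'];
$sql = "SELECT * FROM nybygninger WHERE id='" . $itemid . "'";
// $sql is: SELECT * FROM nybygninger WHERE id='42'
```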
  8. Not sure yet, will attempt it this weekend. Of course, I shall post the results once it is completed.
  9. BOX_INFORMATION_4PRESCRIPTION - looks to be a globally defined constant. Check in a config file or "customization" file to see where they are setting the others, such as "BOX_INFORMATION_FAQS".
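If this is an osCommerce-style cart, these box labels are usually PHP constants set with define() in a language or config file. A sketch of what the missing definition might look like (the value here is a guess; only the constant name comes from the post above):

```php
<?php
// Hypothetical definition, matching the pattern of the existing
// BOX_INFORMATION_* constants in the language file:
define('BOX_INFORMATION_4PRESCRIPTION', 'Prescription Information');

// Anywhere the template references the constant, the label appears:
$label = BOX_INFORMATION_4PRESCRIPTION;
```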
  10. Check this out: http://www.softwareprojects.com/resources/programming/t-how-to-use-mysql-fast-load-data-for-updates-1753.html
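The technique in that article centres on MySQL's LOAD DATA INFILE, which bulk-loads a CSV far faster than row-by-row inserts. A sketch of the statement (file path, table, and columns are illustrative):

```php
<?php
// Hypothetical: bulk-load an uploaded CSV into a staging table.
// LOCAL makes the server read the file from the client side of
// the connection rather than from its own filesystem.
$sql = "LOAD DATA LOCAL INFILE '/tmp/upload.csv'
        INTO TABLE staging
        FIELDS TERMINATED BY ','
        ENCLOSED BY '\\\"'
        LINES TERMINATED BY '\\n'
        (sku, qty)";
```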
  11. $mailbody .= "Name: ".$_POST['name']."\n"; $mailbody .= "Message: ".$_POST['message']."\n"; $mailbody .= "Phone: ".$_POST['phone']."\n"; $mailbody .= "Email: ".$_POST['email'];
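Assembled in isolation (with placeholder form values, since the real $_POST data isn't shown), the snippet above produces one newline-separated body string:

```php
<?php
// Simulated form submission:
$_POST = array('name' => 'Jo', 'message' => 'Hi', 'phone' => '555', 'email' => 'jo@example.com');

$mailbody  = "Name: " . $_POST['name'] . "\n";
$mailbody .= "Message: " . $_POST['message'] . "\n";
$mailbody .= "Phone: " . $_POST['phone'] . "\n";
$mailbody .= "Email: " . $_POST['email'];
// $mailbody is:
// Name: Jo
// Message: Hi
// Phone: 555
// Email: jo@example.com
```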
  12. LOAD DATA INFILE... brilliant..
  13. Hey Cags, Agreed... I just figured that out. I switched out the foreach loops and it did run a "hair" faster - not enough to worry about, so that didn't help. However, when running the import function "without" the database queries, it ran through the csv perfectly and decently fast. I put the queries back in and it ground to a halt, slow as a snail. So it has to do with the database and queries at this point. With that being said, I need to look at the database and figure out a more appropriate fix. Thank you!
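A common first fix when per-row queries dominate an import is batching many rows into one multi-VALUES INSERT. A hypothetical builder (table and columns invented; real code would escape the values properly before splicing them in):

```php
<?php
// Rows parsed from the CSV:
$rows = array(array('A1', 5), array('B2', 9), array('C3', 2));

$values = array();
foreach ($rows as $r) {
    // The int cast stands in for proper escaping of the second column.
    $values[] = "('" . $r[0] . "', " . (int)$r[1] . ")";
}
$sql = "INSERT INTO staging (sku, qty) VALUES " . implode(', ', $values);
// One round trip to the database instead of three.
```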
  14. Thank you for your post and for the benchmark code, I will plop that in and see how it looks. It is somewhat essential, users upload their csv files and the system needs to import it into the database. I wonder if there is a solution that upon upload it turns the csv into something that will make importing a lot easier... I will give that a few google searches.
  15. Doing some research, I found some benchmarks that state foreach loops used with large amounts of data are way too inefficient and you should use for loops. Anyone care to back up this claim?
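A quick way to test the claim yourself is to time both constructs over the same array; in practice the difference is usually small compared to the real work done inside the loop. A minimal benchmark sketch:

```php
<?php
$data = range(1, 100000);

// foreach version
$t0 = microtime(true);
$sum1 = 0;
foreach ($data as $v) { $sum1 += $v; }
$foreachTime = microtime(true) - $t0;

// for version (indexed access)
$t0 = microtime(true);
$sum2 = 0;
$n = count($data);
for ($i = 0; $i < $n; $i++) { $sum2 += $data[$i]; }
$forTime = microtime(true) - $t0;

// Both produce the same result; compare $foreachTime vs $forTime.
```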
