Rommeo Posted December 17, 2018

Hi, I have two XML files: one has 750 products and the other has 900 products (including product information, stock information, sizes and photos). In short, one product in the XML file looks like this:

product_name: x t-shirt
product_code: x-t-shirt-101 (alphanumerical)
Stocks (loop):
+ Small: 50
+ Large: 200
Photo1: http://....../main.jpg
Photo2: http://....../2.jpg

And my tables are like this:

Product (product_id (numerical), product_code (alphanumerical), name)
Photos (product_id, id, ...)
Stocks (product_id, id, ...)

So I need to insert one product into my db, take its id, and then insert the photo data from the XML into the Photos table with the correct product_id; the same goes for the Stocks table.

I'm thinking about the easiest and most robust way to import this amount of data, and I need to update it daily or weekly (planning to do it with cron jobs for now). I have found some solutions, but since I have not done this before I could not decide which is the most reliable way. I should add that I may get other XML files from different companies to import later on, so I think I should code for the long term. Thank you in advance for your help.
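A minimal sketch of reading a feed with that shape using PHP's SimpleXML; the element names (product, code, name, stock, photo) are assumptions based on the structure described above, not the real feed:

<?php
// Minimal sketch, assuming hypothetical element names based on the structure above.
$xml = simplexml_load_file('products.xml');   // assumed file name
if ($xml === false) {
    die('Could not parse the XML feed');
}

foreach ($xml->product as $product) {
    $code = (string) $product->code;          // e.g. "x-t-shirt-101"
    $name = (string) $product->name;          // e.g. "x t-shirt"

    foreach ($product->stock as $stock) {
        $size     = (string) $stock['size'];  // assumed "size" attribute
        $quantity = (int) $stock;             // e.g. 50
    }

    foreach ($product->photo as $photo) {
        $url = (string) $photo;               // e.g. "http://....../main.jpg"
    }
    // database inserts would go here (see the loop discussed further down)
}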
gw1500se Posted December 17, 2018

I believe this is what you are looking for.
Rommeo Posted December 17, 2018 Author

33 minutes ago, gw1500se said: I believe this is what you are looking for.

Thank you for the suggestion. What I actually wonder is: how do I insert that amount of data? Should I just put it in a loop and insert? That may take time and maybe cause a 500 error?
gw1500se Posted December 17, 2018

Is this a one-time load? If so, does it matter? You could write a SQL file from the data and then have MySQL read that in and do it all at once. You could then schedule it to be loaded into the database in the background so your web page does not time out. Depending on your server, I'm not sure inserting 900 records would really take that long, but if it did you can increase the timeout.
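A rough sketch of that approach, generating a .sql file from the parsed feed so it can be loaded outside the web request; the file path and column names are assumptions:

<?php
// Sketch: write the INSERT statements to a file instead of running them inline.
// File path and column names are assumptions for illustration.
$xml = simplexml_load_file('products.xml');
$fh  = fopen('/tmp/product_import.sql', 'w');

foreach ($xml->product as $product) {
    $code = addslashes((string) $product->code);
    $name = addslashes((string) $product->name);
    fwrite($fh, "INSERT INTO Product (product_code, name) VALUES ('$code', '$name');\n");
    // stock and photo INSERTs would be written here as well
}
fclose($fh);
// The file can then be fed to MySQL later, e.g. mysql dbname < /tmp/product_import.sql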
Rommeo Posted December 17, 2018 Author

34 minutes ago, gw1500se said: Is this a one-time load? If so, does it matter? You could write a SQL file from the data and then have MySQL read that in and do it all at once. You could then schedule it to be loaded into the database in the background so your web page does not time out. Depending on your server, I'm not sure inserting 900 records would really take that long, but if it did you can increase the timeout.

Well, it's not actually just 900 inserts. The loop is going to be something like this:

for (each of the ~900 products) {
    $query1 = insert the product
    $query2 = select the product_id where product_code = ...  (or take the last inserted row)
    for (each stock entry, let's say 5) {
        $query = insert product_id, stock
    }
    for (each photo, let's say 6) {
        $query = insert product_id, photo_name
    }
}

Well, I thought about it, but I want to see if there are any better alternatives.
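For comparison, a sketch of that loop using PDO prepared statements and lastInsertId(), which avoids the separate SELECT for the product_id, wrapped in a transaction so the whole batch commits at once; the Stocks/Photos column names and connection details are assumptions:

<?php
// Sketch only: table names follow the schema in the first post; the Stocks/Photos
// column names (size, quantity, url) and the DSN are assumed for illustration.
$pdo = new PDO('mysql:host=localhost;dbname=shop;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);
$xml = simplexml_load_file('products.xml');

$insProduct = $pdo->prepare('INSERT INTO Product (product_code, name) VALUES (?, ?)');
$insStock   = $pdo->prepare('INSERT INTO Stocks (product_id, size, quantity) VALUES (?, ?, ?)');
$insPhoto   = $pdo->prepare('INSERT INTO Photos (product_id, url) VALUES (?, ?)');

$pdo->beginTransaction();
foreach ($xml->product as $product) {
    $insProduct->execute([(string) $product->code, (string) $product->name]);
    $productId = $pdo->lastInsertId();   // no separate SELECT needed

    foreach ($product->stock as $stock) {
        $insStock->execute([$productId, (string) $stock['size'], (int) $stock]);
    }
    foreach ($product->photo as $photo) {
        $insPhoto->execute([$productId, (string) $photo]);
    }
}
$pdo->commit();

Running this from a cron job rather than a web request also sidesteps the 500/timeout concern mentioned above.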
gw1500se Posted December 17, 2018

As I said, you should consider writing the inserts to a file. Then you can load it into the database in the background. The web page would return after writing the file rather than after the load.
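One way to trigger that background load from PHP, as a sketch; the paths are assumptions, and the MySQL credentials are expected to come from a .my.cnf file rather than the command line:

<?php
// Sketch: start the load in the background so the request returns immediately.
// Paths are assumptions; credentials are read from ~/.my.cnf here.
exec('mysql dbname < /tmp/product_import.sql > /tmp/product_import.log 2>&1 &');
echo 'Import file written; the load is running in the background.';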
Barand Posted December 17, 2018

The fastest way is with MySQL's LOAD XML statement.
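A minimal sketch of that statement run through PDO; the file path and connection details are assumptions, as is the assumption that the feed's element names match the Product column names, and LOCAL INFILE has to be enabled on both client and server:

<?php
// Sketch: bulk-load <product> elements straight into the Product table.
// LOAD XML maps element/attribute names to matching column names, so this
// assumes the feed uses element names identical to the columns.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass', [
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,   // required for the LOCAL variant
]);

$pdo->exec("
    LOAD XML LOCAL INFILE '/path/to/products.xml'
    INTO TABLE Product
    ROWS IDENTIFIED BY '<product>'
");

Note that this only fills the Product table; the Photos and Stocks rows still need the generated product_id, so they would need a separate pass (for example the prepared-statement loop above, or a follow-up insert that looks the id up by product_code).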
gw1500se Posted December 17, 2018

Cool. I didn't know that existed. This is what makes these forums so great. Even an old timer like me can learn something new.
Rommeo Posted December 18, 2018 Author

Thank you Barand. I'm working on it.