Search the Community
Showing results for tags 'website'.
Simply put: every night I have to pull a huge XML file (5 MB, 4500+ records) for a small sales site, and I build a webpage from that data. Pretty simple. At first I parsed the XML into arrays, but that was problematic for functions and manipulation, like picking out discounts and specific items.

So now I convert the XML file to SQL, which as you know is very time-consuming (the longest run is about 10-12 minutes using INSERT ... ON DUPLICATE KEY UPDATE), so I have to run it from a cron job. I would rather have the page load dynamically when a user visits. SQL works nicely because all the manipulation features are fast, but it still feels sluggish, and it feels sloppy.

I do not want to learn XSLT (lazy?). I am comfortable with PHP, and I can already parse the XML file fast. I just need a way to manipulate the data: mainly sorting, and picking specific items out of the XML, then sorting those results. One method I did use was to cache whatever I manipulated, then delete the cache overnight when I pull the new feed.

Hopefully this is not answered already; I'm going to feel like an idiot. What would you pros recommend? Leave it at the SQL, or keep pursuing the XML?
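For what it's worth, the "parse once, then filter and sort in memory" approach the poster describes maps directly onto PHP's SimpleXML plus `array_filter`/`usort`. A minimal sketch of the same idea, shown here in Python with `xml.etree.ElementTree` (the tag names, fields, and sample data are invented for illustration, not taken from the actual feed):

```python
# Sketch: parse the feed once, keep records as plain dicts, then
# filter/sort in memory instead of round-tripping through SQL.
# Tag names (<item>, <name>, <price>, <discount>) are assumptions.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<catalog>
  <item><name>Widget</name><price>9.99</price><discount>1</discount></item>
  <item><name>Gadget</name><price>4.50</price><discount>0</discount></item>
  <item><name>Doodad</name><price>2.25</price><discount>1</discount></item>
</catalog>"""

def load_items(xml_text):
    """Parse the feed into a list of dicts; do this once and cache it."""
    root = ET.fromstring(xml_text)
    return [
        {
            "name": item.findtext("name"),
            "price": float(item.findtext("price")),
            "discount": item.findtext("discount") == "1",
        }
        for item in root.iter("item")
    ]

items = load_items(SAMPLE_FEED)

# Pick out the discounted items, then sort that subset by price.
discounted = sorted(
    (i for i in items if i["discount"]),
    key=lambda i: i["price"],
)
print([i["name"] for i in discounted])  # → ['Doodad', 'Widget']
```

At 4500 records this whole list fits comfortably in memory, so the nightly job could just parse the feed and serialize the resulting array to a cache file, and page requests would filter and sort the cached structure, which is essentially the caching scheme the poster already tried.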
Regarding the messages that pop up when one clicks the Mark Forum Read link: why do we have to endure, every time, a prompt we must click OK on, followed soon after by a second message confirming it is done? We click. We expect it to happen. So why the enforced conversation? Perhaps if something DIDN'T happen, a message might be helpful.