mrjiggles Posted October 5, 2014

Hi, I need to pick some people's brains! I have a script that lets a GSM/remote receipt printer poll for data in a specific format. The issue is that I'll be deploying a few hundred of these things and need to know which ones are on or off. I can get the time each printer polls the file and I'm writing that to a DB table. The only problem is that even with just two printers running, the table fills up fast: over 600 rows in 30 minutes for only two printers! I need to find a better way to do this while keeping the same flexibility, so I know exactly when each one was on and off, and can then put that data into a nice viewable format. It also has to hold up with a few hundred printers polling at once. Hope someone can help me with this, as I am totally stumped! =/
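For reference, a minimal sketch of the kind of per-poll logging described above; the table name, columns, and connection details are assumptions, not the actual script:

```php
<?php
// Hypothetical polling endpoint: each printer hits this script, and every poll
// is logged as its own row - which is what makes the table grow so quickly.
$pdo = new PDO('mysql:host=localhost;dbname=printers', 'user', 'pass', array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));

$printerId = isset($_GET['printer_id']) ? $_GET['printer_id'] : null;

if ($printerId !== null) {
    // One INSERT per poll: roughly 10 polls per minute per printer, going by
    // the 600 rows / 30 minutes / 2 printers figure above.
    $stmt = $pdo->prepare('INSERT INTO printer_polls (printer_id, polled_at) VALUES (?, NOW())');
    $stmt->execute(array($printerId));
}

// ... the script would then return the receipt data in the printer's expected format ...
```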
trq Posted October 5, 2014

Do you have a question?
mrjiggles Posted October 5, 2014

Just a way of using the dates and times of when each device last polled to create uptime and downtime charts, and a way of stopping this from making the database table huge. I worked out that if I have 100 of these devices, in one day I would have over 1.4 million rows! That could get ugly quickly, never mind putting the data out in a nice format. I'm normally good at working out issues and problems, but this one has me totally confused.
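One possible approach (a sketch only, assuming MySQL and a hypothetical printer_status table keyed on printer_id): keep a single "last seen" row per printer, and only record something when a printer's status actually changes. Table growth then tracks outages rather than polls, and the change history is what would feed an uptime/downtime chart:

```php
<?php
// Sketch only: one row per printer instead of one row per poll.
// Table/column names (printer_status, printer_id, last_seen) are assumptions,
// and printer_id is assumed to be the primary key.
$pdo = new PDO('mysql:host=localhost;dbname=printers', 'user', 'pass', array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));

$printerId = isset($_GET['printer_id']) ? $_GET['printer_id'] : null;

if ($printerId !== null) {
    // On each poll, upsert the printer's last-seen time: the table stays at one
    // row per device no matter how often the printers poll.
    $stmt = $pdo->prepare(
        'INSERT INTO printer_status (printer_id, last_seen)
         VALUES (?, NOW())
         ON DUPLICATE KEY UPDATE last_seen = NOW()'
    );
    $stmt->execute(array($printerId));
}

// A cron job run every few minutes could then flag printers that have gone quiet
// and write a row only when a printer changes state. The 5-minute threshold is
// an assumed value; it should be a bit longer than the real poll interval.
$down = $pdo->query(
    'SELECT printer_id, last_seen
     FROM printer_status
     WHERE last_seen < NOW() - INTERVAL 5 MINUTE'
)->fetchAll(PDO::FETCH_ASSOC);
```

With a layout like that, 100 printers means roughly 100 status rows plus one log row per actual outage, instead of ~1.4 million poll rows a day.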
jcbones Posted October 5, 2014

AFAIK, row count alone is not the problem; while rows are definitely a consideration, the bigger concern is the total amount of data being stored. You could go with ScaleDB, an extension to MySQL that handles large transaction volumes as well as large data sets. Then there's NoSQL, but I don't think that would be a good fit unless you have a large number of concurrent users. You could very well stay with plain MySQL and be perfectly fine; a lot depends on the architecture here, such as whether you are writing to a hard drive, an SSD, or a dedicated DB server. If it were me, I would get it up and running, do extensive testing to see if the database holds up, and perhaps pick a short interval (1 week?) as an archive interval. Only when/if it starts to bottleneck would I look at scaling up or scaling out.
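If the raw poll log does stay in MySQL, the weekly archive interval mentioned above might look something like this (a sketch; the printer_polls and printer_polls_archive tables are hypothetical):

```php
<?php
// Rough sketch of a weekly archive job, run from cron. It assumes an archive
// table with the same structure as the hypothetical printer_polls table.
$pdo = new PDO('mysql:host=localhost;dbname=printers', 'user', 'pass', array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));

$pdo->beginTransaction();
try {
    // Copy rows older than a week into the archive table...
    $pdo->exec(
        'INSERT INTO printer_polls_archive
         SELECT * FROM printer_polls
         WHERE polled_at < NOW() - INTERVAL 7 DAY'
    );
    // ...then delete them from the live table so it stays small enough to query quickly.
    $pdo->exec(
        'DELETE FROM printer_polls
         WHERE polled_at < NOW() - INTERVAL 7 DAY'
    );
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack();
    throw $e;
}
```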