
Hi there - I'm busy putting together a basic POS system for a client. The thing is, all my previous work has been online, so now I have to face several challenges getting this to work in all circumstances -

 

I'd like your opinion on whether this seems to be a fairly legit approach - bear in mind I'm not familiar with any frameworks - I code ground-up procedural style with functions (have been for years - and now is not the time to lecture me about it or digress - please bear with me)..

 

Let's briefly cover the main operations of the POS system:

 

  • Read current products / prices / stock
  • Create supplier orders / receive orders / process new stock (to reflect on all other systems / locations)
  • Capture customer orders and counter sales / invoices etc

In a perfect world this would all run directly from/to the main online server - from any terminal, anywhere, with complete data synchronicity across all terminals.

 

 

 

Now, in one location there might be, say, 3 stations and a local server..

 

So this is my plan of action:

 

  1. The main live server will be the primary data source (ultimately)
  2. Each physical location will have a local server (network shared) <- more on this below
  3. Each POS terminal will / must be able to operate on the data without any network connection at all

 

 

Problem 1 / Solution:

 

Now let's say the internet connection drops - we now have a situation where the main server is unreachable.. The terminals and local server have no connectivity to the products etc. - and are operating "blind".

 

My solution is to store a copy of all products, pulled from the main server, in a JSON/XML/CSV file - so when the system goes offline, the stations switch immediately to the locally stored copies. This will be a fixed filename on the local server, read-only to all POS stations - so they can continue pulling products. (Bear in mind this will apply to suppliers, orders, and whatever else is needed to perform the base functions.)
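A minimal sketch of that fallback read, assuming a PDO connection to the main database and a fixed snapshot filename (the file name, table, and column names here are illustrative, not from the actual system):

```php
<?php
// Try the main DB first; fall back to the local JSON snapshot
// when the server is unreachable. Pass $db = null to simulate
// (or detect) a dead connection.
function load_products(string $snapshotFile, ?PDO $db): array
{
    if ($db !== null) {
        try {
            $rows = $db->query('SELECT id, name, price, stock FROM products')
                       ->fetchAll(PDO::FETCH_ASSOC);
            // While we are online, refresh the offline snapshot.
            file_put_contents($snapshotFile, json_encode($rows), LOCK_EX);
            return $rows;
        } catch (PDOException $e) {
            // DB unreachable - fall through to the offline copy.
        }
    }
    // Offline: read the last known snapshot.
    $json = @file_get_contents($snapshotFile);
    return $json === false ? [] : (json_decode($json, true) ?: []);
}
```

The stations only ever read this file, so a plain `file_get_contents` is enough on their side; only the local server ever rewrites it.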

 

 
Problem 2 / Solution:
 
Similar to the above - except now the POS station is running completely as a single unit. Once again, there will be a copy of the JSON/XML files stored locally to read from.
 
 
 
For each of the above cases there will also be a unique JSON/XML/CSV file for new captures - i.e.:
 
  • a new supplier order from a single station would be stored in the local file
  • a new product created would also be stored locally
  • as soon as the local server is accessible, a sync can be done to it
  • as soon as the main server becomes available, a sync can be done to that
  • unique transaction and operation codes will be created to avoid duplicate entries.
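The capture-and-code steps above could be sketched like this - the queue filename, station id format, and row shape are my assumptions:

```php
<?php
// Build a unique operation code from the station id, a timestamp,
// and random bytes, then append the capture to the local queue file.
function make_op_code(string $stationId): string
{
    return sprintf('%s-%d-%s', $stationId, time(), bin2hex(random_bytes(4)));
}

function queue_capture(string $queueFile, string $stationId, array $row): string
{
    $row['op_code'] = make_op_code($stationId);

    $queue = [];
    if (is_file($queueFile)) {
        $queue = json_decode(file_get_contents($queueFile), true) ?: [];
    }
    $queue[] = $row;
    file_put_contents($queueFile, json_encode($queue), LOCK_EX);

    return $row['op_code'];
}
```

Because the op code travels with the row, the same code can be checked at the local server and again at the live server, so a row relayed through both tiers is still only inserted once.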

 

So in scenario 1, all POS stations will be saving to the main local server; in scenario 2 they will be storing everything in local files.

 

When reading / writing the files, I will load each one into a PHP array and loop over it, look for a duplicate in the table, then insert and clear the file - so each synced insert will be removed from the local file, whether it is station to local server, station to main server, or local server to live server.
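A sketch of that loop-insert-clear pass, leaning on a unique `op_code` column to swallow duplicates - this uses SQLite's `INSERT OR IGNORE` so it is self-contained (MySQL would be `INSERT IGNORE` instead), and the `sales` table and its columns are assumptions:

```php
<?php
// Read the local queue file, insert each row unless its op_code is
// already in the table, then clear the file. Returns how many rows
// were actually new.
function sync_queue(string $queueFile, PDO $db): int
{
    $rows = json_decode(@file_get_contents($queueFile), true) ?: [];

    // A PRIMARY KEY / UNIQUE index on op_code makes the duplicate
    // check the database's job, not a SELECT-then-INSERT race.
    $stmt = $db->prepare(
        'INSERT OR IGNORE INTO sales (op_code, amount) VALUES (:op_code, :amount)'
    );

    $synced = 0;
    foreach ($rows as $row) {
        $stmt->execute([':op_code' => $row['op_code'], ':amount' => $row['amount']]);
        $synced += $stmt->rowCount(); // 0 when the row was a duplicate
    }

    // Every row has now been pushed (or was already there) - clear the file.
    file_put_contents($queueFile, json_encode([]), LOCK_EX);

    return $synced;
}
```

The same function works for all three hops (station to local server, station to main server, local server to live server) as long as each hop's target table carries the unique op-code column.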

 

 

 

I'm not sure if there's a magic button to press that could sync a MySQL database - the reason I'm using files is because:

 

  • using files might bypass certain security issues when sharing data between multiple sources
  • performance should be faster using files than running a DB server on the station, the local server AND the live server
  • the product list is only about 3-4 thousand items, and maybe I can cache the files - I don't think they would be too big?

 

In a nutshell - if connectivity is lost, the system will fall back to flat files..

 

One argument against this: it might be easier to have the same database everywhere, so the code wouldn't change - just the location / DB server.. that could be a very small piece of code. I'd just have to magically sync the DBs though... this is one issue I'm having to weigh up - flat files vs. a DB -

 

 

 

So that's the short of it - I'd like some suggestions, queries and comments - if you think I should consider a different approach, please let me know what you think -

 

many thanks!

 

 aaarr
Edited by CamaroMan

Keep the live online server as the primary, not a local one.

Designate a main local server with a local address by changing the hosts file and server name.

Any additional stations always connect to the main local server. These wouldn't require local servers installed.

When there is internet, the main local server can communicate with the live server in the background and make any updates. Since all other stations access just the one main local server, it should work out well.

Since it's just one server, the syncing should be much simpler.

I suppose the data storage is up to you. I would use MySQL on the live server and pass JSON files using curl with a cron job from the main local server. The main live server can check for any new files with a cron job and update accordingly.
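A hedged sketch of that cron-run push - the endpoint URL, content type, and the expectation of a plain HTTP 200 response are all placeholders, not an API either poster described:

```php
<?php
// POST the pending JSON queue from the local server to the live
// server. Returns true when the push succeeded (or there was
// nothing to push), so the caller knows whether to clear the file.
function push_to_live(string $queueFile, string $liveUrl): bool
{
    $payload = file_get_contents($queueFile);
    if ($payload === false || $payload === '') {
        return true; // nothing to push
    }

    $ch = curl_init($liveUrl);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $payload,
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 10,
    ]);
    $ok = curl_exec($ch) !== false
       && curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200;
    curl_close($ch);

    return $ok;
}
```

Run from cron (e.g. every minute), this is the "communicate to the live server in the background" piece: it simply no-ops while the line is down and drains the queue once connectivity returns.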

 

I could mention SQLite, but I don't like the file locking on writes while you're also trying to read.

 

The alternative would be to install local servers on every machine and do updates in the background to the live server, using cached JSON.

 

I don't really know if this is a single- or multiple-client POS, how many stations there are, or the amount of work in setting it up. Client computers puke a lot, so it's probably best to do fewer installs on each one. In that same respect, it's hard to rely on just a single computer as well. But you could always set up a backup server that syncs using copy commands.

 

There are remote MySQL options with which you don't need multiple copies of the same database.

 

I would never copy the MySQL data files directly - always run it through a process that picks up the newest data, with timestamps and checks.

Edited by QuickOldCar

Yeah, the issue I have is: if, say, this shop is offline to the main internet server and station 1 adds a product, the other stations in the shop should be able to pick it up.. hence the need for a second-level server in-house, connected via the LAN..

 

Hence:

 

Live server (internet)

 

Local Server (connecting local stations)

 

Local POS (operating as a sole entity)

 

I was thinking about a manual PHP script that could add a timestamp and a random string / code to each locally created line / row, which can then be used to search the DB for duplicate inserts - and the row removed from the local file if found on the local server..
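That reconciliation pass - look each row's code up on the server, and drop it from the local file only once it's confirmed there - might look like this (the `sales` table and `op_code` column are assumptions, and the table name is interpolated on the premise that it's a hard-coded, trusted string, never user input):

```php
<?php
// Remove rows from the local queue file whose op_code already
// exists on the server; keep the rest for a later sync attempt.
// Returns the number of rows pruned.
function prune_synced(string $queueFile, PDO $db, string $table): int
{
    $rows = json_decode(@file_get_contents($queueFile), true) ?: [];

    $check = $db->prepare("SELECT 1 FROM {$table} WHERE op_code = :code");

    $remaining = [];
    foreach ($rows as $row) {
        $check->execute([':code' => $row['op_code']]);
        if ($check->fetchColumn() === false) {
            $remaining[] = $row; // not on the server yet - keep it locally
        }
    }
    file_put_contents($queueFile, json_encode($remaining), LOCK_EX);

    return count($rows) - count($remaining);
}
```

This is the safer direction for the cleanup: a row only leaves the local file after the server has positively confirmed it, so a failed sync can never lose a capture.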

 

Then of course the local machines "could" also talk directly to the main server - it's a multi-database setup I have to sync under various real-world scenarios - luckily the data is not complex at all, just sales, product IDs, numbers, etc..

 

I'm looking at DB replace code and replication - just so many ways.. still digging -
