Calling on the collective wisdom for some technical recommendations.
There are only a few hard problems in computer science and offline master-master server replication is one of them...
The charity I'm trying to help currently has a single-user FoxPro for Windows (FPW) app. When a second user wants to add data, the Master user has to export lists of available serial numbers and the existing inventory; the non-master user imports those files, does the data entry, and exports the new data for the Master to re-import. Export/import uses an ancient PKZIP to pass ZIP files back and forth, via email or floppy-transport. Ah, the memories.
In the best of all possible worlds, the data would go up on the net to be immediately accessed by all.
Unfortunately, these folks are working in a developing country with unreliable internet, during brief, high-throughput work periods. They need both the ability to function while disconnected and the ability to reconnect and resync changes, with all the edge-case issues that brings up.
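For anyone wondering what "resync changes" entails at this scale, here is a minimal, framework-neutral sketch of a three-way merge between two offline copies of a table. The `Record`, `merge`, and per-record `version` names are illustrative assumptions, not part of any existing product; any real solution would also need tombstones for deletes and a UI for reviewing flagged conflicts.

```python
from dataclasses import dataclass

@dataclass
class Record:
    id: str
    value: str
    version: int  # incremented on every local edit

def merge(base, local, remote):
    """Three-way merge of record dicts keyed by id.

    base   = last synced snapshot both sides agree on
    local  = Master's current copy
    remote = the disconnected user's current copy
    Returns (merged, conflicts).
    """
    merged = {}
    conflicts = []
    for rid in set(base) | set(local) | set(remote):
        b, l, r = base.get(rid), local.get(rid), remote.get(rid)
        if l == b:                      # unchanged locally -> take remote
            if r is not None:
                merged[rid] = r
        elif r == b:                    # unchanged remotely -> take local
            if l is not None:
                merged[rid] = l
        elif l == r:                    # both made the identical change
            if l is not None:
                merged[rid] = l
        else:                           # both changed differently: conflict
            conflicts.append((rid, l, r))
            # naive policy: higher version wins, but flag it for review
            winner = max((x for x in (l, r) if x is not None),
                         key=lambda rec: rec.version)
            merged[rid] = winner
    return merged, conflicts
```

The point of the sketch is that the happy paths (only one side touched a record) are trivial; all the real work in any off-the-shelf framework is in the last branch, which is why I'd rather not build this from scratch.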
Years ago, Fox shipped with an "off-line view" method that had hooks to solve a lot of the typical conflict problems, though I really thought it was more of a curiosity than a robust framework.
In the decades (and decades) since then, surely someone has come out with a replication scheme that can be managed without a post-graduate degree in database set theory.
Has anyone come across a web framework with offline component support like this? The core data set the client is using is pretty small (~10,000 records in the main table).
Something like: https://developers.google.com/web/fundamentals/instant-and-offline/web-stora... (Just the first hit in a web search, not a recommendation).
I'm hoping to find some appropriate technology and then line up a dev in that technology (or a college-level project, if appropriate).