I am starting to develop a new laboratory test data management system with roughly 30 test stations.
The system should be able to collect data offline in the event of a network failure.
Each station will keep an updated, read-only copy of the test structures (specifications, entity types, business rules / workflows, etc.), but not test data. Actual test data will be stored locally if the server cannot be found.
To prepare for a network failure, some kind of synchronization is required: one synchronization will pull updated test structures down to the stations, and another will push any test data saved locally while offline up to the server.
How would you recommend achieving this? Any caveats?
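The "pull test structures" side of the sync could be as simple as versioned snapshots cached on each station. Below is a minimal sketch (not a definitive implementation): `fetch_remote_version` and `fetch_remote_structures` are hypothetical callables standing in for whatever server API you expose, and the cache is a local JSON file that the station falls back to when the network is down.

```python
import json
import os

def refresh_structures(fetch_remote_version, fetch_remote_structures,
                       cache_path="structures.json"):
    """Pull a read-only snapshot of the test structures when the server
    has a newer version; fall back to the cached copy when offline.

    fetch_remote_version / fetch_remote_structures are placeholders for
    your real server calls; both are assumed to raise OSError when the
    network is unavailable.
    """
    try:
        remote_version = fetch_remote_version()
    except OSError:
        # Network failure: keep using whatever local snapshot we have.
        remote_version = None

    local = None
    if os.path.exists(cache_path):
        with open(cache_path) as f:
            local = json.load(f)

    if remote_version is not None and (local is None
                                       or local["version"] < remote_version):
        snapshot = {"version": remote_version,
                    "structures": fetch_remote_structures()}
        with open(cache_path, "w") as f:
            json.dump(snapshot, f)
        return snapshot
    return local
```

Because the structures are read-only on the stations, this direction of the sync has no conflicts to resolve; a monotonically increasing version number on the server is enough.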
Ideas / Thoughts:
- Install an SQL server on each computer and write scripts to synchronize the server and clients (seems expensive and redundant).
- Have the installed application save a local copy of the data to data files, and synchronize them with scripts.
- Is there anything built into SQL Server so clients can collect data offline?
- Save all data locally, then have the operator click a "Transfer" button to push the data to the server.
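The last idea is essentially a store-and-forward "outbox" pattern, which could be sketched like this. This is only an illustration of the idea, not a recommendation: it uses SQLite as the local store (any embedded store or flat file would do), and `send_to_server` is a hypothetical callable standing in for the real upload to your SQL Server instance.

```python
import sqlite3

def open_outbox(path=":memory:"):
    """Open (or create) the station's local outbox of unsent results."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS outbox (
                      id INTEGER PRIMARY KEY,
                      payload TEXT NOT NULL,
                      sent INTEGER NOT NULL DEFAULT 0)""")
    return db

def record_result(db, payload):
    """Always write test data locally first, whether online or not."""
    db.execute("INSERT INTO outbox (payload) VALUES (?)", (payload,))
    db.commit()

def transfer(db, send_to_server):
    """Push unsent rows to the server (the 'Transfer' button).

    A row is only marked sent after send_to_server accepts it, so a
    network failure mid-transfer leaves the remaining data queued
    locally and the next transfer resumes where this one stopped.
    """
    rows = db.execute(
        "SELECT id, payload FROM outbox WHERE sent = 0 ORDER BY id"
    ).fetchall()
    moved = 0
    for row_id, payload in rows:
        send_to_server(payload)  # assumed to raise on network failure
        db.execute("UPDATE outbox SET sent = 1 WHERE id = ?", (row_id,))
        db.commit()
        moved += 1
    return moved
```

The same `transfer` routine could also run automatically on a timer, so the manual button becomes a fallback rather than the normal workflow.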
Environment: MS SQL Server running on Windows Server 2008