I'm building a relatively complex and data-heavy web application in AngularJS. I'm planning to use PHP as a RESTful backend (with Symfony2 and FOSRestBundle). I have spent weeks looking at different on/offline synchronization solutions, and there seem to be many half solutions (see the list below for some examples), but none of them seem to fit my situation perfectly. How do I go about deciding which strategy will suit me?
Which issues determine "best practices" for building an on/offline synchronization system in AngularJS and Symfony2 needs some research, but off the top of my head I want to consider things like speed, ease of implementation, future-proofing (a lasting solution), extensibility, resource usage/requirements on the client side, multiple offline users editing the same data, and how much and what type of data to store.
Some of my requirements that I'm presently aware of are:
- The users will often be offline and then need to synchronize (locally created) data with the database.
- Multiple users share some of the editable data (potential merging issues need to be considered).
- Users might be logged in from multiple devices at the same time.
- Allowing large amounts of data to be stored offline (up to a gigabyte).
- I probably want the user to be able to decide what he wants to store locally.
- Even if the user is online I probably want the user to be able to choose whether he uses all (backend) data or only what's available locally.
Some potential example solutions:
- PouchDB - Interesting strategies for synchronizing changes from multiple sources
- Racer - Node lib for realtime sync, built on ShareJS
- Meteor - DDP and strategies for sync
- ShareJS - Node.js operational transformation, inspired by Google Wave
- Restangular - Alternative to $resource
- EmberData - EmberJS’s ORM-like data persistence library
- ServiceWorker
- IndexedDB Polyfill - Polyfills IndexedDB in browsers that only support WebSQL (Safari)
- BreezeJS
- JayData
- Loopback’s ORM
- ActiveRecord
- BackBone Models
- lawnchair - Lightweight client-side DB lib from Brian Leroux
- TogetherJS - Mozilla Labs’ multi-client state sync/collaboration lib.
- localForage - Mozilla’s DOMStorage improvement library.
- Orbit.js - Content synchronization library
(https://docs.google.com/document/d/1DMacL7iwjSMPP0ytZfugpU4v0PWUK0BT6lhyaVEmlBQ/edit#heading=h.864mpiz510wz)
Any help would be much appreciated :)
5 Answers
You seem to want a lot of stuff, and the sync part is hard... I have a solution to some of this in an OSS library I am developing. The idea is that it does versioning of local data, so you can figure out what has changed and therefore do a meaningful sync, which also includes conflict resolution etc. This is sort of an offline Meteor, as it is really tuned for offline use (for the London Underground, where we have no mobile data signal).
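The core of that versioning idea can be sketched roughly like this. This is an illustration of the general approach, not the SyncIt API: every local change records the version it was based on, so when the client reconnects, changes whose base version no longer matches can be flagged as conflicts instead of silently overwriting other users' edits. All names here are made up.

```javascript
// Sketch: a versioned queue of offline edits with conflict detection.
function createChangeQueue() {
  const queue = [];
  return {
    // Record a local edit along with the version it was based on.
    record(key, basedOnVersion, patch) {
      queue.push({ key, basedOnVersion, patch });
    },
    // Replay queued changes against the server's current versions.
    // Returns { applied, conflicts } so the caller can run resolution.
    flush(serverVersions) {
      const applied = [];
      const conflicts = [];
      for (const change of queue) {
        const current = serverVersions[change.key] || 0;
        if (change.basedOnVersion === current) {
          serverVersions[change.key] = current + 1; // accept and bump version
          applied.push(change);
        } else {
          conflicts.push(change); // someone else changed it meanwhile
        }
      }
      queue.length = 0;
      return { applied, conflicts };
    }
  };
}
```

How conflicts are then resolved (merge, last-write-wins, ask the user) is a separate policy decision.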
I have also developed an eco system around it which includes a connection manager and server. The main project is at https://github.com/forbesmyester/SyncIt and is very well documented and tested. The test app for the ecosystem will be at https://github.com/forbesmyester/SyncItTodoMvc but I have yet to write virtually any docs for it.
It currently uses LocalStorage but will be easy to move to localForage, as it is actually using a wrapper around localStorage to make it an async API... Another one for the list, maybe?
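That wrapper idea, exposing a synchronous store like localStorage behind a promise-based API so the backing store can later be swapped for a genuinely async one (localForage, IndexedDB), looks roughly like this. This is a sketch of the pattern, not SyncIt's actual wrapper:

```javascript
// Sketch: wrap any synchronous getItem/setItem store in an async API,
// so callers don't change when the backing store becomes truly async.
function asyncStore(syncStore) {
  return {
    getItem: (key) => Promise.resolve(syncStore.getItem(key)),
    setItem: (key, value) => Promise.resolve(syncStore.setItem(key, value)),
  };
}
```

In the browser you would pass `window.localStorage` as the backing store.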
To work offline with your requirements, I suggest dividing the problem into two scenarios: content (HTML, JS, CSS) and data (the REST API).
The content
This will be stored offline with AppCache for small apps, or for advanced cases with the awesome Service Workers (Chrome 40+).
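For the AppCache route, a minimal cache manifest might look like this (paths are illustrative; URLs under NETWORK always require a connection and are never served from the cache):

```
CACHE MANIFEST
# v1 2014-04-09

CACHE:
/index.html
/app.js
/app.css

NETWORK:
/api/
```

The HTML page opts in with `<html manifest="app.appcache">`.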
The data
This requires solving storage and synchronization, and it becomes a more difficult problem. I suggest a deep reading of the Differential Synchronization algorithm, and take the next tips into consideration:
Frontend: Store the resource and its shadow (using, for example, the URL as key) in localStorage for small apps, or in more advanced alternatives (PouchDB, IndexedDB, ...). With the resource you can work offline, and when you need to synchronize with the server, compute a diff between the resource and its shadow and send it to the server in a PATCH request.
Backend: Consider storing the shadow copies in Redis.
Both sides (frontend/backend) need to identify the client node; to do so you could use an x-syn-token HTTP header (sent in every request from the client, via Angular interceptors).
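A bare-bones sketch of the client side of that shadow/diff idea, for flat objects only (real Differential Synchronization also patches the shadow on acknowledgement and handles nested data, e.g. via a JSON diff/patch library):

```javascript
// Sketch: keep a "shadow" (last state acknowledged by the server)
// next to the live resource, and diff them to build a minimal PATCH body.
function diffAgainstShadow(resource, shadow) {
  const patch = {};
  for (const key of Object.keys(resource)) {
    if (resource[key] !== shadow[key]) patch[key] = resource[key];
  }
  return patch;
}

// Apply a patch (server-side, or locally when the server pushes changes).
function applyPatch(target, patch) {
  return Object.assign({}, target, patch);
}
```

In an AngularJS app the patch would go out via something like `$http.patch(url, patch)`, and on a successful response the shadow is replaced with the new resource state.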
https://www.firebase.com/
It's reliable and proven, and can be used as a backend and sync library for what you're after. But it costs money and requires some integration coding.
https://goinstant.com/ is also a good hosted option.
In some of my apps, I prefer to have both: a syncing DB source AND another main database (Mongo/Express, PHP/MySQL, etc.). Then each DB handles what it's best at, with its own features (real-time vs. security, etc.). This is true regardless of the sync-DB provider (be it Racer, Firebase, GoInstant, ...).
The app I am developing has many of the same requirements and is being built in AngularJS. In terms of future-proofing, there are two main concerns I have found: one is hacking attempts, requiring encryption and possibly one-time keys and a backend key manager; the other is support for WebSQL being dropped by the standards consortium in preference to IndexedDB, so finding an abstraction layer that can support both is important.
The solution set I have come up with is fairly straightforward: offline data is loaded into the UI first, and a request goes out to the REST server when in an online state. As for resolving data conflicts in a multi-user environment, that becomes a business-rule decision. My decision was to simplify the matter and not delve into data merges, but to use a microtime-stamp comparison to determine which version should be kept and pushed out to clients. When in offline mode, store data as a dirty write and then push it to the server when returning to an online state.
Or use ydn-db, which I am evaluating now, as it has built-in support for AWS and Google cloud storage.
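The last-write-wins rule described above could be sketched like this (field names are illustrative; a real implementation would also push the winning versions back out to all clients):

```javascript
// Sketch: resolve a conflict by microtime stamp - the newer copy wins.
function resolveByTimestamp(localRecord, serverRecord) {
  return localRecord.modifiedAt > serverRecord.modifiedAt
    ? localRecord
    : serverRecord;
}

// On returning online, replay the dirty writes made while offline
// against the server's current records, keeping the newer copy of each.
function syncDirtyWrites(dirtyWrites, serverRecords) {
  const winners = {};
  for (const record of dirtyWrites) {
    const server = serverRecords[record.id];
    winners[record.id] = server ? resolveByTimestamp(record, server) : record;
  }
  return winners;
}
```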
Another suggestion: Yjs leverages an OT-like algorithm to share a wide range of supported data types, and you have the option to store the shared data in IndexedDB (so it is available for offline editing).