I have an application in which I want to load some amount of initial data (accomplished with Firebase.once('value')) and then at some point later receive events for child nodes that have been added to that Firebase reference.
For this, I'd like to use Firebase.on('child_added'), but according to the definition (and as seen in practice) it loads ALL of the data at that Firebase reference first.
Is there a way to get around this behavior and only listen for child_added events? Also, workarounds like dumping the initial set of data are not solutions (imagine a dataset with over a million data points - I don't want to have to download every data point just to listen for when one gets added!).
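For clarity, here's roughly what I'm doing today (simplified; the URL and path are placeholders):

var ref = new Firebase('https://example.firebaseio.com/items'); // placeholder URL

// Initial load: read everything that currently exists, once.
ref.once('value', function(snap) {
  snap.forEach(function(childSnap) {
    console.log('initial child', childSnap.name(), childSnap.val());
  });
});

// Later: listen for additions - but this also fires once for every existing child.
ref.on('child_added', function(snap) {
  console.log('child_added fired for', snap.name());
});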
EDIT: Firebase.on('child_added') can be combined with limit() to limit the amount of data that comes from the initial request. For my application, most likely only one data point will be added to the Firebase at a time, so I've used Firebase.limit(1).on('child_added'), which limits the amount of initial data loaded to a single data point (see the sketch after the list below). But I don't like this workaround for two reasons:
- It's still a workaround. I'm still having to ignore data that I shouldn't have to worry about ignoring.
- If for some reason the nature of my application were to change and I were to add multiple children to the Firebase reference at one time, would imposing limit(1) prevent the application from receiving all of the added children? I'm not sure whether that is the case or whether child_added would be called for each individual child regardless of whether they were added as a batch.
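Here's roughly what that workaround looks like (simplified; it assumes the reference already contains at least one child, so the single preloaded event can be skipped):

var ref = new Firebase('https://example.firebaseio.com/items'); // placeholder URL
var skippedInitial = false;

// limit(1) keeps the initial payload down to a single child, which is ignored.
ref.limit(1).on('child_added', function(snap) {
  if (!skippedInitial) {
    skippedInitial = true; // this is the one preloaded record - skip it
    return;                // note: this breaks if the reference starts out empty
  }
  console.log('newly added child', snap.name(), snap.val());
});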
It seems like providing a true/false flag as an argument to the call would be a nice solution, but I'll wait to see if there's an answer I've been glossing over...
- You could add the creation timestamp to each dataset and use the current timestamp as startAt in a query: firebase.com/docs/ordered-data.html, firebase.com/docs/javascript/query/startat.html – Prinzhorn, Jun 19, 2014 at 17:50
- This would work well with the way things are structured currently -- the "key" of the data (in a key:val format) is the timestamp of the data. Thanks for the info! – MandM, Jun 19, 2014 at 18:13
1 Answer
Delete old messages
If you aren't interested in historical data, then you are essentially using this as a message queue. If that is the case, then old records should be deleted (or archived to an alternate path) after reading them. In this way, the problem is self-resolving.
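A rough sketch of that pattern (the 'queue' path and the processMessage handler are just placeholders):

var fb = new Firebase(URL);

// Treat the path as a queue: handle each child, then remove it.
fb.child('queue').on('child_added', function(snap) {
   processMessage(snap.val()); // hypothetical handler for your app's data
   snap.ref().remove();        // delete (or move to an archive path) once handled
});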
If you want to maintain this model, a couple of good options are available:
Store the last key read and use that as the starting point of your next load
You could even store the key in Firebase:
var fb = new Firebase(URL);

// Look up the key (and priority) of the last record we processed.
fb.child('last_read_key').once('value', function(lastReadSnap) {
   var lastKey = lastReadSnap.val();
   var pri = lastReadSnap.getPriority();

   // Start reading at the last-read record; everything after it is new.
   fb.child('data_to_read').startAt(pri, lastKey).on('child_added', function(snap) {
      if( snap.name() !== lastKey ) { // skip the boundary record itself
         console.log('new record', snap.name());
         // Remember this record as the new starting point for next time.
         fb.child('last_read_key').setWithPriority(snap.name(), snap.getPriority());
      }
   });
});
Use priorities to mark the timestamp of old records
When writing records, use setWithPriority:
fb.child('records/$recordid').setWithPriority(data, Firebase.ServerValue.TIMESTAMP);
Now you can read records starting at "now":
fb.child('records').startAt(Date.now()).on('child_added', function(snap) {
   console.log('new child', snap.name());
});
Note one caveat of using timestamps here: I used Date.now() in the startAt() command because it doesn't currently support Firebase.ServerValue.TIMESTAMP (a bug has been filed for this).