This article describes the refCount operator and explains that, in order to prevent the unsubscription of observable A, we have to add delay(0) to the source observable so that the source notification is delayed until the subscribers have subscribed:

import { Observable } from "rxjs/Observable";
const source = Observable.defer(() => Observable.of(
Math.floor(Math.random() * 100)
)).delay(0);
Is 0 always enough? In other words, does passing zero guarantee that the notification will be delayed until all m.subscribe() statements have run, assuming they are all run immediately following the multicast statement, like this:
const m = source.multicast(() => new Subject<number>()).refCount();
m.subscribe(observer("a"));
m.subscribe(observer("b"));
In the above case we are only subscribing observers a and b. If we subscribed a million observers after the multicast statement, would delay(0) still guarantee that they all will be subscribed before the first source notification happens?
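For reference, observer("a") and observer("b") are not defined in the post; a minimal, hypothetical sketch of such a helper (assuming it simply builds a named observer that logs each notification) could look like this:

// Hypothetical helper, not part of the original post: returns a partial
// observer whose callbacks log every notification under the given label.
function observer(name: string) {
  return {
    next: (value: number) => console.log(`${name}: next ${value}`),
    error: (error: any) => console.error(`${name}: error`, error),
    complete: () => console.log(`${name}: complete`)
  };
}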
I've had problems with delay(0): it does not work the same as setTimeout(..., 0), and the reason (I think) is that the default scheduler uses setInterval(..., 0), which has a different behavior, because the specs say that the shortest allowed interval in JS is 10ms. Whereas setTimeout(..., 0) means to run immediately in the next digest. So they aren't exactly the same thing, but I am not sure if this has any effect on your question. It's just an FYI. – Reactgular, commented Jun 14, 2019 at 16:29
1 Answer
To understand the issue you must know that:
- JavaScript is single threaded;
- asynchronous events run through the event loop (as micro tasks and macro tasks);
- when an async event happens, it is added to the event loop;
- after the async event is added to the event loop, JavaScript continues with the synchronous code;
- once no synchronous code is left, the event code from the event loop runs (see the setTimeout sketch below).
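As a plain JavaScript illustration of those points (no RxJS involved), the callback passed to setTimeout below is queued as a task and only runs once every line of synchronous code has finished:

// Synchronous code runs to completion first.
console.log("sync 1");

setTimeout(() => {
  // Queued task: runs only after all the synchronous code below has executed.
  console.log("async via setTimeout(..., 0)");
}, 0);

console.log("sync 2");
console.log("sync 3");

// Logged order: "sync 1", "sync 2", "sync 3", "async via setTimeout(..., 0)"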
This Observable would be synchronous if you didn't add delay(0):
const source = Observable.defer(() => Observable.of(
Math.floor(Math.random() * 100)
)).delay(0);
When the first subscription happens (subscribing is synchronous code), the Observable emits immediately, because it is also synchronous. But if you add delay(0) (similar to setTimeout), JavaScript will wait until all the synchronous code (all the m.subscribe() calls in this case) has executed. After that it will run the asynchronous delay(0).
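A minimal sketch of that behavior, assuming RxJS 5 and its patch-style operator imports (the exact import paths are an assumption based on the question's import line):

import { Observable } from "rxjs/Observable";
import "rxjs/add/observable/defer";
import "rxjs/add/observable/of";
import "rxjs/add/operator/delay";

const source = Observable.defer(() => Observable.of(
  Math.floor(Math.random() * 100)
)).delay(0);

// Without delay(0) the value would be logged before "subscribe returned".
source.subscribe(value => console.log("got", value));
console.log("subscribe returned");

// With delay(0) the logged order is: "subscribe returned", then "got <n>".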
And here:
const m = source.multicast(() => new Subject<number>()).refCount();
m.subscribe(observer("a"));
m.subscribe(observer("b"));
You have the source Observable, which becomes asynchronous once its emission is passed through delay(0). At that point the synchronous code will continue (all your other m.subscribe() calls) and, after it is done, the asynchronous delay(0) will emit.
So it is safe even for millions of m.subscribe() calls to get executed in this case.
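To make that concrete, here is a sketch that attaches a large synchronous batch of observers (same assumed RxJS 5 setup and import style as above; the count of 10000 is arbitrary and only for illustration):

import { Observable } from "rxjs/Observable";
import { Subject } from "rxjs/Subject";
import "rxjs/add/observable/defer";
import "rxjs/add/observable/of";
import "rxjs/add/operator/delay";
import "rxjs/add/operator/multicast";

const source = Observable.defer(() => Observable.of(
  Math.floor(Math.random() * 100)
)).delay(0);
const m = source.multicast(() => new Subject<number>()).refCount();

let received = 0;

// All of these subscriptions run synchronously, so they are attached
// before the delayed source value is pushed through the Subject.
for (let i = 0; i < 10000; i++) {
  m.subscribe(() => { received++; });
}

// This timeout is queued after the delay(0) task, so by the time it runs
// every subscriber should have received the single shared value.
setTimeout(() => console.log(`received by ${received} subscribers`), 0);
// expected log: "received by 10000 subscribers"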
P.S. multicast(() => new Subject<number>()).refCount() is exactly the same as share(): it combines multicast with a Subject factory and counts active subscriptions with refCount.
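For what it's worth, the shorthand form would look like this (a sketch, assuming RxJS 5 and its patched share operator from "rxjs/add/operator/share"):

import { Observable } from "rxjs/Observable";
import "rxjs/add/observable/defer";
import "rxjs/add/observable/of";
import "rxjs/add/operator/delay";
import "rxjs/add/operator/share";

// share() multicasts through a Subject created by a factory and manages the
// underlying connection by counting subscribers, like multicast + refCount.
const shared = Observable.defer(() => Observable.of(
  Math.floor(Math.random() * 100)
)).delay(0).share();

shared.subscribe(value => console.log("a got", value));
shared.subscribe(value => console.log("b got", value));
// Both observers receive the same random value, exactly as with
// multicast(() => new Subject<number>()).refCount().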