Think about a situation where we have a dashboard with 15 widgets, and each widget makes its own API call to the back-end to fetch data. If 1000 users are on our dashboard, that comes to a maximum of 1000 × 15 = 15,000 API calls. What if the number of users increases?

Depending on the requirements, the specific situation, and the infrastructure, there may be different ways to handle this.

  • Make sure our DB can handle this many API calls, plus 50% (I picked this number arbitrarily) more headroom

  • Cache these API call responses, if possible, at the back-end.

  • Add more nodes to handle this load

In this post, I want to try another way to handle this in an Angular single-page application. How about throttling API calls to 2–3 per user? In that case, the number of simultaneous API calls drops to roughly 20% for the same number of users.

Wait!! Browsers already throttle requests. Right?

Yes. But it depends on which protocol we are using.

HTTP 1.1

When we use HTTP 1.1, browsers do throttle requests, though the limit depends on the browser. Google Chrome allows only six simultaneous requests per domain over HTTP 1.1. Check the following image.

Requests with HTTP 1.1 protocol


HTTP 2

When we use HTTP 2, browsers can send many more simultaneous requests, since HTTP 2 multiplexes them over a single connection. Check the following image.

Requests with HTTP 2 protocol

As everybody is moving towards HTTP 2, or has already moved, we definitely need to throttle these requests ourselves.


Instead of making 15 API calls simultaneously, how about keeping this number always ≤ 3? Once we get a response (or an error) from one API call, we send the next.
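
Independent of Angular, this is a small concurrency limiter: run at most N requests at a time and start the next one as soon as any settles. Here is a minimal promise-based sketch of the idea (the `limitConcurrency` helper is my own illustration, not a library API):

```typescript
// Run async tasks with at most `limit` in flight at any moment.
// As soon as one settles (success or error), the next queued task starts.
async function limitConcurrency<T>(
    tasks: Array<() => Promise<T>>,
    limit: number,
): Promise<T[]> {
    const results: T[] = new Array(tasks.length);
    let nextIndex = 0;

    // Each worker pulls the next task until the queue is empty.
    async function worker(): Promise<void> {
        while (nextIndex < tasks.length) {
            const i = nextIndex++;
            try {
                results[i] = await tasks[i]();
            } catch {
                // An error also frees up a slot; record a placeholder.
                results[i] = undefined as unknown as T;
            }
        }
    }

    const workers = Array.from(
        { length: Math.min(limit, tasks.length) },
        () => worker(),
    );
    await Promise.all(workers);
    return results;
}
```

With `limit = 3`, our 15 widget calls would flow through three at a time, each finished call immediately freeing a slot for the next.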

Loading State

Make sure that we have a loading state per widget instead of blocking the entire page until all 15 API calls complete.
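
For example, each widget can flip its own flag while its call is queued or in flight. A tiny framework-agnostic sketch (the `LoadingTracker` class is my own illustration, not an Angular API):

```typescript
// Tracks a loading flag per widget id, so each widget can show its own
// spinner while its API call is queued or in flight.
class LoadingTracker {
    private loading = new Set<string>();

    start(widgetId: string): void {
        this.loading.add(widgetId);
    }

    stop(widgetId: string): void {
        this.loading.delete(widgetId);
    }

    isLoading(widgetId: string): boolean {
        return this.loading.has(widgetId);
    }
}
```

Each widget would call `start()` before firing its request and `stop()` when the response or error arrives, so only that widget shows a spinner.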


Debounce?

We can debounce. But we can't exactly guess how long an API call will take. What if we change the duration of the data we are fetching from one month to five years? We can't guarantee that we always keep the number of simultaneous API calls ≤ 3. And if the API calls complete a little early, we keep the app idle while still showing a loading state.

Throttle in the components?

We can control the throttling in the components. But we would need a lot of communication between components to track which call has completed and when the next one can fire.


How can we solve this problem in such a way that we can reuse the solution in other parts of the application? The component-level throttling mentioned above is not reusable.

HTTP Interceptor?

Can we solve this at the HTTP Interceptor level? If so, we don't need to bother about communication between components. All widgets trigger their API calls at once, and the HTTP Interceptor takes care of throttling.


As mentioned earlier, there can be many ways to solve this problem. Think of this as a POC.

As HTTP_INTERCEPTORS is a multi-provider token, the interceptor itself can't keep track of what we sent, what is pending, etc. Let's create a new API Throttle Service to keep track of this.

Here is its code.

import { Injectable } from '@angular/core';
import { HttpEvent, HttpHandler, HttpRequest } from '@angular/common/http';
import { Observable } from 'rxjs';

// Create a REGEX of API calls which we want to throttle
const URL_REGEX = /\/api\/widget\/[a-zA-Z0-9\-]+/;

// What is the Throttle Limit
const THROTTLE_LIMIT = 3;

@Injectable({ providedIn: 'root' })
export class APIThrottleService {
    public intercept(
        req: HttpRequest<any>,
        next: HttpHandler,
    ): Observable<HttpEvent<any>> {

        if (URL_REGEX.test(req.url)) {
            // Handle throttling here
        } else {
            return next.handle(req);
        }
    }
}
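
To sanity-check the pattern, here is a quick standalone test of which URLs would be throttled (the regex is redeclared here so the snippet runs on its own; the sample URLs are made up for illustration):

```typescript
// Same shape of pattern as in the service: throttle per-widget data calls.
const WIDGET_URL = /\/api\/widget\/[a-zA-Z0-9\-]+/;

const urls = [
    '/api/widget/sales-chart',   // matches  -> throttled
    '/api/widget/top-products',  // matches  -> throttled
    '/api/user/profile',         // no match -> passes straight through
];

const throttled = urls.filter(u => WIDGET_URL.test(u));
```

Only the widget data calls go through the throttle; everything else is handled by the regular pipeline.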

Let's create a new HTTP Interceptor for throttling API calls, which uses the above API Throttle Service.

@Injectable()
export class APIThrottleInterceptor implements HttpInterceptor {
    constructor(
        private apiThrottleService: APIThrottleService
    ) {}

    public intercept(
        req: HttpRequest<any>,
        next: HttpHandler,
    ): Observable<HttpEvent<any>> {
        return this.apiThrottleService.intercept(req, next);
    }
}

Add APIThrottleInterceptor to HTTP_INTERCEPTORS in the module's providers:

providers: [
    {
        provide: HTTP_INTERCEPTORS,
        useClass: APIThrottleInterceptor,
        multi: true,
    },
],

Let's go back to the API Throttle Service.

  • Let's keep a counter to track active API calls and a list of their URLs. We also need an object to track the observables that we send back to the Interceptor when the throttle limit is reached.

    private activeCount = 0;
    private reqURLs: string[] = [];
    private reqObs: { [key: string]: Subscriber<any> } = {};

  • When we find a matching URL, first make sure to remove that URL from reqURLs and reqObs. Also make sure to reduce the activeCount if we already had that URL in the list.

    const indx = this.reqURLs.indexOf(url);
    if (indx > -1) {
        this.reqURLs.splice(indx, 1);
        const observer = this.reqObs[url];
        observer.error();
        delete this.reqObs[url];
        if (this.activeCount > 0) {
            this.activeCount--;
        }
    }

  • If THROTTLE_LIMIT is not reached, just send the regular request. Make sure to handle the response (both success and failure) and continue pending API calls. processResponse() is explained in the next step.

    if (this.activeCount < THROTTLE_LIMIT) {
        this.activeCount++;
        return next.handle(req).pipe(
            tap(evt => {
                if (evt instanceof HttpResponse) {
                    this.processResponse();
                }
                return evt;
            }),
            catchError(err => {
                this.processResponse();
                return of(err);
            }),
        );
    }

  • In either case (success/failure), make sure to reduce the activeCount. If there are any pending API calls in reqURLs, send the oldest one by emitting on and completing its observable. Once we send it, remove it from our lists so that we don't process it again. Don't worry too much about syntax and types :).

    private processResponse() {
        if (this.activeCount > 0) {
            this.activeCount--;
        }
        if (this.reqURLs.length > 0) {
            const url = this.reqURLs[0];
            const observer = this.reqObs[url];
            observer.next('done!');
            observer.complete();
            this.reqURLs = this.reqURLs.slice(1);
            delete this.reqObs[url];
        }
    }

  • If THROTTLE_LIMIT is reached, just create an observable, store it for later use, and send it back.

    this.reqURLs.push(url);
    this.reqObs[url] = null;
    const obs = Observable.create(ob => {
        this.reqObs[url] = ob;
    });
    return obs.pipe(
        concatMap(_ => {
            return next.handle(req).pipe(
                // ... same tap/catchError as two steps above
            );
        }),
    );
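
Putting the steps above together: the service boils down to a counter plus a FIFO queue of parked requests. Here is a minimal self-contained sketch of that same logic using plain promises instead of Angular and RxJS (the `ThrottleQueue` class and all names in it are my own illustration):

```typescript
type Task<T> = () => Promise<T>;

// Counter + FIFO queue: at most `limit` tasks run at once. A finished
// task (success or failure) frees a slot and starts the oldest parked task,
// mirroring activeCount / reqURLs / processResponse() in the service.
class ThrottleQueue {
    private activeCount = 0;
    private pending: Array<() => void> = [];

    constructor(private limit: number) {}

    run<T>(task: Task<T>): Promise<T> {
        if (this.activeCount < this.limit) {
            return this.execute(task);
        }
        // Limit reached: park the task until a slot frees up,
        // like the stored observables in reqObs.
        return new Promise<T>((resolve, reject) => {
            this.pending.push(() => {
                this.execute(task).then(resolve, reject);
            });
        });
    }

    private async execute<T>(task: Task<T>): Promise<T> {
        this.activeCount++;
        try {
            return await task();
        } finally {
            // The processResponse() step: free the slot, start the next.
            this.activeCount--;
            const next = this.pending.shift();
            if (next) {
                next();
            }
        }
    }
}
```

Every widget can call `run()` at the same time; the queue guarantees that no more than `limit` requests are ever in flight.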

With this, we don’t need to touch any existing components and can add throttling of API calls using HTTP Interceptors.

What’s Next?

As mentioned earlier, there are many ways to handle this, and this solution may not suit all cases. This is a simple POC: it needs proper testing, and we also need to check for memory leaks and performance issues.