Running Promises in Series with a delay between each

#rate limit

Thu Mar 31 2022

I wanted to call an endpoint to update an entity, and I wanted to do it for 500 entities. The API server had a rate limit, so I couldn't just use Promise.all, since that would call the endpoint in parallel and the rate limit would block my calls.

So I decided to write a script to call the endpoint for each entity in series, with a 2-second wait before each call, to make sure the API server wouldn't block them.

It worked great, so I thought it'd be worth sharing. Maybe it'll help someone, or maybe there's a better way to do this.

The way I'm doing this is to start from 0 and count up based on the array of data I want to update. In the chain of promises, I add a delay before each API call and increment the index I pass to the update function by 1, until it equals my array length.

```js
import fetch from "node-fetch";
import { data } from "./data.js";

const bearerToken = "";

// The endpoint URL was omitted here; replace the placeholder with your own.
const updateData = (id) => {
  return fetch(`https://api.example.com/entities/${id}`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${bearerToken}`,
    },
  });
};

const delay = () => {
  console.log(`Waiting: 2 seconds.`);
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve(2);
    }, 2000);
  });
};

const startTime =;

const doNextPromise = (id) => {
  delay()
    .then((x) => {
      console.log(
        `Waited: ${x} seconds, now calling the endpoint for updating data ${data[id]}`
      );
      return updateData(data[id])
        .then((res) => {
          if (res.status !== 200) {
            throw `Error updating data ${data[id]}: ${res.status}`;
          }
          return res.json();
        })
        .then((res) =>
          console.log(`Response: ${JSON.stringify(res)} for data ${data[id]}`)
        )
        .catch((e) => console.log(`Error: ${e}`));
    })
    .then(() => {
      id++;
      if (id < data.length) doNextPromise(id);
      else console.log(`Total: ${( - startTime) / 1000} seconds.`);
    });
};

// Kick off the chain, starting from index 0.
doNextPromise(0);
```
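If you'd rather avoid the recursion, the same series-with-a-delay pattern can be written with a plain loop and async/await. This is just a sketch of the idea, not code from the post above; `wait`, `runInSeries`, and the dummy task are names I made up for illustration.

```javascript
// Resolve after ms milliseconds — same job as the delay() helper above.
const wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Run an async task for each item, one at a time, pausing before each call.
const runInSeries = async (items, task, delayMs) => {
  const results = [];
  for (const item of items) {
    await wait(delayMs); // throttle before each call
    results.push(await task(item)); // finish this call before starting the next
  }
  return results;
};

// Example: "update" three entities with a 100 ms gap between calls.
runInSeries([1, 2, 3], async (id) => id * 2, 100).then((r) =>
  console.log(r) // [ 2, 4, 6 ]
);
```

The loop body only moves on once `await task(item)` settles, so the calls never overlap, which is the property the rate limit cares about.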

© Copyright 2022 Farmin Farzin