Never Block the Event Loop

Articles in series:
  1. Don't block the Event Loop - Part 1
  2. Never Block the Event Loop

Issues

All requests from receiving to responding go through the Event Loop. This means that if the Event Loop spends too much time at any point, all current requests and new requests will be blocked.

We should ensure that we never block the Event Loop. In other words, each callback function should complete as quickly as possible. This also applies to await, Promise.then, etc.

A good way to ensure this is to consider the "algorithmic complexity" of your callbacks. If a callback takes roughly constant time regardless of its input, every pending request gets a fair turn. If a callback's work depends on its arguments, you should think about how long it could take in the worst case.

Here's an example of a request that doesn't care about the input:

app.get('/constant-time', (req, res) => {
  res.sendStatus(200);
});

And here's an example of a request where the processing time depends on the input:

app.get('/countToN', (req, res) => {
  // n comes from the query string, so its size is controlled by the client
  const n = Number(req.query.n);
  for (let i = 0; i < n; i++) {
    // do something on each iteration
  }
  res.sendStatus(200);
});

Node.js uses the V8 Engine, which is fast for most common operations. However, there are two notable exceptions: regular expressions and JSON operations.

REDOS: Denial of Service attacks with regexp patterns

A common way to block the Event Loop is to use a "vulnerable" regexp pattern. That's why we should avoid using vulnerable regexps.

Understandably, sometimes we need a regexp to validate or search a string. Unfortunately, in some cases matching a regexp pattern can take time that grows exponentially with the length of the input string.

A vulnerable regexp pattern is a regexp pattern that can take an exponential amount of time, leading to REDOS. Determining whether regexp patterns are truly vulnerable or not is a difficult question, and it depends on whether you are using Perl, Python, Ruby, Java, JavaScript, etc., but here are some rules that apply to all of these languages:

  • Avoid nested quantifiers, such as `(a+)*`. The V8 regexp engine can handle some of them quickly, but others are vulnerable.
  • Avoid OR with overlapping alternatives, such as `(a|a)*`. Again, these can be fast some of the time.
  • Avoid using backreferences, such as `(a.*) \1`. No regexp engine can guarantee evaluating those in linear time.
  • If you are performing a simple string match, use `indexOf` or local equivalents (see the sketch after this list). It will be cheaper and will never take more than O(n).
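
To illustrate the last point, here is a minimal sketch comparing a substring check with indexOf against an equivalent regexp (the sample input string is hypothetical):

// Simple substring search: indexOf never backtracks and will never
// take more than O(n) on the input, unlike a poorly chosen regexp.
const input = 'user@example.com';

// Instead of /example/.test(input):
const found = input.indexOf('example') !== -1; // or input.includes('example')
console.log(found); // true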

If you are unsure whether your regexp pattern is vulnerable, keep in mind that Node.js typically has no trouble reporting a match, even for a vulnerable regexp and a long input string. The exponential behavior is triggered when there is a mismatch: Node.js cannot be certain of the failure until it has tried many paths through the input string.
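
As a concrete illustration, here is a minimal sketch of a vulnerable pattern: on a matching input it returns almost instantly, while a near-miss forces the engine to backtrack through an exponential number of paths:

// Vulnerable pattern: a nested quantifier followed by a required 'b'.
const vulnerable = /^(a+)+b$/;

console.log(vulnerable.test('aaaab')); // match: returns almost instantly

// Mismatch: every way of splitting the 'a's between the inner and outer
// quantifier is tried before failing -- roughly 2^n paths for n 'a's.
// With n = 30 this can already take seconds and blocks the Event Loop,
// which is why the line below is left commented out.
// vulnerable.test('a'.repeat(30) + 'c');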

There are some tools for checking the safety of regexp patterns, but they won't necessarily catch all vulnerable regexps.

Another approach is to use a different regexp engine, such as the node-re2 module, which uses Google's fast RE2 regexp engine. Be warned, though: RE2 is not 100% compatible with V8 regexps, so test your regexps when swapping in node-re2, and note that complex regexps are not supported by it.
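
Here is a minimal sketch of swapping in node-re2, assuming the commonly documented API of the re2 npm package (RE2 objects mirror familiar RegExp methods such as test and exec):

// npm install re2
const RE2 = require('re2');

// RE2 guarantees linear-time matching, so even a pattern that is
// vulnerable under V8's backtracking engine cannot blow up here.
const re = new RE2('^(a+)+b$');

console.log(re.test('aaaab'));              // true
console.log(re.test('a'.repeat(30) + 'c')); // false, and still fast

// Note: RE2 rejects features it cannot evaluate in linear time,
// such as backreferences, so some V8 regexps will not be accepted.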

Time-consuming Core Modules

Certain core modules of Node.js have "expensive" synchronous APIs, including:

  • Encryption
  • Compression
  • File System
  • Child Process

These APIs are expensive because they involve significant computation (encryption, compression), require I/O (file I/O), or both (child processes). They are designed for scripting convenience, not for use in a server context. If you execute them on the Event Loop, they will take far longer to complete than an ordinary JavaScript instruction and will block the Event Loop. Asynchronous counterparts exist for all of them; a short sketch follows the lists below.

In a server, you should not use the following synchronous APIs from these modules:

Encryption:

  • crypto.randomBytes (synchronous version)
  • crypto.randomFillSync
  • crypto.pbkdf2Sync

You should also be cautious when providing large input to encryption and decryption routines.

Compression:

  • zlib.inflateSync
  • zlib.deflateSync

File System:

Avoid using synchronous file system APIs. For example, if the files you access are in a distributed file system like NFS, the access time may vary significantly.

Child Process:

  • child_process.spawnSync
  • child_process.execSync
  • child_process.execFileSync
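
Each of the synchronous calls above has an asynchronous counterpart that performs the heavy work off the Event Loop (on libuv's Worker Pool or in a separate process). A minimal sketch, with hypothetical file names and inputs:

const crypto = require('crypto');
const zlib = require('zlib');
const fs = require('fs');
const { execFile } = require('child_process');

// Encryption: key derivation runs on the Worker Pool, the callback fires when done.
crypto.pbkdf2('secret', 'salt', 100000, 64, 'sha512', (err, key) => {
  if (err) throw err;
  console.log('derived key starts with:', key.toString('hex').slice(0, 16));
});

// Compression: same idea for zlib.
zlib.deflate(Buffer.from('some payload'), (err, compressed) => {
  if (err) throw err;
  console.log('compressed bytes:', compressed.length);
});

// File system: fs.readFile instead of fs.readFileSync ('data.txt' is hypothetical).
fs.readFile('data.txt', 'utf8', (err, contents) => {
  if (err) return console.error(err);
  console.log('file length:', contents.length);
});

// Child process: execFile instead of execFileSync.
execFile('node', ['--version'], (err, stdout) => {
  if (err) return console.error(err);
  console.log('node version:', stdout.trim());
});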

JSON DOS

JSON.parse and JSON.stringify are also "expensive" operations. Both are O(n) in the length of the input, and for large n they can take a surprisingly long time.

If your server deals with JSON objects, especially processing data received from a client, you should be mindful of their size.

For example, let's build an object of size 2^21 by repeated doubling, JSON.stringify it, run indexOf on the resulting string, and then JSON.parse it back. The stringified result is about 50MB. It takes 0.7 seconds to stringify the object, 0.03 seconds to run indexOf on the 50MB string, and 1.3 seconds to parse it.

var obj = { a: 1 };
var niter = 20;

var before, str, pos, res, took;

// Double the object on each iteration: after 20 doublings its JSON
// representation is roughly 50MB.
for (var i = 0; i < niter; i++) {
  obj = { obj1: obj, obj2: obj };
}

before = process.hrtime();
str = JSON.stringify(obj);
took = process.hrtime(before);
console.log('JSON.stringify took ' + took);

before = process.hrtime();
pos = str.indexOf('nomatch');
took = process.hrtime(before);
console.log('Pure indexof took ' + took);

before = process.hrtime();
res = JSON.parse(str);
took = process.hrtime(before);
console.log('JSON.parse took ' + took);

To mitigate this, there are npm modules that provide asynchronous JSON APIs, such as the following (a streaming sketch follows the list):

  • JSONStream, which has stream APIs.
  • Big-Friendly JSON, which has stream APIs as well as async versions of the standard JSON APIs using the partitioning-on-the-Event-Loop model.
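
For instance, here is a minimal sketch of streaming a large JSON file with JSONStream, assuming its commonly documented API (the file name big.json and the 'rows.*' selector are hypothetical):

// npm install JSONStream
const fs = require('fs');
const JSONStream = require('JSONStream');

// Instead of fs.readFileSync + JSON.parse on a 50MB file, parse it as a stream:
// each element matching 'rows.*' is emitted as soon as it is parsed,
// so the Event Loop is never blocked for the whole document at once.
fs.createReadStream('big.json')
  .pipe(JSONStream.parse('rows.*'))
  .on('data', (row) => {
    // handle one row at a time
  })
  .on('end', () => console.log('done'));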

Conclusion

This article has covered some seemingly simple operations that can have a significant impact on the Event Loop. In the next article, we will explore solutions for handling Event Loop "blocking".
