JavaScript made its first appearance nearly 30 years ago as a new programming language of its time, and it needed a while to stabilize and gain widespread use. Today its success is undeniable: it appears everywhere in the web world and has expanded beyond browsers to accomplish even more. Like any technology, it never stops evolving. Over the years, JavaScript has continuously gained useful features and shed inherent limitations. TC39 is the technical committee within Ecma International charged with standardizing JavaScript while actively developing and expanding it.
JavaScript is better known than ECMAScript, even though ECMAScript is the specification and JavaScript is a successful implementation of it. The first stable version worth mentioning is ECMAScript 5 (ES5), released in 2009, which introduced many features still popular today. The most significant change, however, came in 2015, when ES6 (ES2015) arrived with substantial improvements. Since then, TC39 has aimed to release a new ECMAScript version approximately every year. As of this writing, the latest version is ECMAScript 2022 (ES13).
Returning to JavaScript, one of its most prominent traits is its handling of asynchronous operations: working with JavaScript means dealing with asynchrony. In its early days, however, handling asynchrony was rather inconvenient. Recognizing this, TC39 went through several iterations to make asynchronous handling as ergonomic as possible.
Callbacks, Promises, and Async/Await are the mechanisms introduced to handle asynchrony. In this article, let's explore how this evolution unfolded and whether it was worth it!
Node.js has only one JavaScript execution thread, meaning it needs a mechanism to perform I/O tasks (which don't heavily consume CPU and depend on disk or network speed) without blocking the main thread. It achieves this by applying non-blocking I/O processing. This model allows us to perform asynchronous tasks without waiting for the results before moving on to other tasks.
The non-blocking I/O model of Node.js brings clear benefits: it improves the performance and responsiveness of applications, especially network applications like REST APIs, because instead of waiting for I/O tasks to complete, the program can continue executing other work.
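The idea fits in a few lines. In this sketch, setTimeout stands in for any I/O operation (a disk read, a network call), since its callback likewise runs later without blocking the main thread:

```javascript
const order = [];

// setTimeout stands in for an I/O operation: its callback runs later,
// once the "work" completes, without blocking the main thread.
setTimeout(() => {
  order.push('async result arrives');
}, 10);

// Meanwhile, the main thread keeps going without waiting.
order.push('other work continues');

console.log(order); // the async result has not arrived yet
```

The synchronous push always runs first, even though the timer was scheduled earlier in the code; that ordering is the essence of non-blocking execution.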
Because an asynchronous function returns its result at some later point, there must be a way to handle that result when it arrives.
A callback is a function passed as an argument to another function, to be executed later after an asynchronous operation completes. Callbacks are the most primitive way of handling asynchrony in JavaScript. For example, when reading a file in Node.js, we pass a callback to handle the read result.
const fs = require('node:fs');

// The callback receives an error first, then the file contents
// (a Buffer, since no encoding is specified).
fs.readFile('readme.md', (err, data) => {
  if (err) throw err;
  console.log(data);
});
The second parameter passed to readFile is a function with two parameters: err and data. Once readFile has a result, it calls this function with either the retrieved data or an error if one occurred. This is why the approach is called a callback: the function is called back after the I/O operation returns a result.
While callbacks are a common approach, they have some drawbacks. The most significant issue is callback hell, where code becomes deeply nested and hard to read. This occurs when we need to perform multiple sequential asynchronous tasks and use callbacks to handle their results.
asyncFunc1((err1, data1) => {
  asyncFunc2((err2, data2) => {
    asyncFunc3((err3, data3) => {
      asyncFunc4((err4, data4) => {
        // ...
      });
    });
  });
});
ES6 (also known as ES2015) brought a significant improvement in handling asynchrony by introducing Promises. Promises provide a powerful mechanism for handling asynchronous tasks in a more readable and understandable way.
Instead of using callbacks, we can use Promises to create a chain of asynchronous tasks and handle their results. Promises provide methods like then and catch to handle the success and failure outcomes of a task.
Here's an example of reading a file asynchronously using Promises instead of callbacks:
const fs = require('node:fs/promises');
fs.readFile("readme.md")
  .then((data) => {
    console.log(data);
  })
  .catch((err) => {
    console.error(err);
  });
Additionally, the beauty of Promises lies in "chaining": we can capture the result of the previous Promise through then and process it sequentially. This chaining helps alleviate "callback hell."
fs.readFile("readme.md")
  .then(handleFile1)
  .then(handleFile2)
  .then(handleFile3)
  ...
  .catch()
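To make the idea concrete, here is a runnable sketch of such a chain. The handler functions are invented for illustration: each then receives the previous step's return value, and a single catch at the end covers the whole chain:

```javascript
// Hypothetical handlers: each receives the previous step's result.
const parse = (text) => text.trim();
const countWords = (text) => text.split(/\s+/).length;
const label = (count) => `word count: ${count}`;

Promise.resolve('  hello promise world  ') // stands in for a file read
  .then(parse)
  .then(countWords)
  .then(label)
  .then((message) => console.log(message)) // prints "word count: 3"
  .catch((err) => console.error(err)); // one catch for the whole chain
```

Compared with the nested-callback version, the steps read top to bottom and error handling lives in one place instead of being repeated at every level.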
At this point, the community embraced Promises and found a way to convert callback-based functions into Promise-returning ones using the new Promise constructor. The following example wraps the readFile function from the beginning of the article in a Promise, without relying on Node's built-in promise module:
const fs = require('node:fs');

const readFilePromise = (file) => {
  return new Promise((resolve, reject) => {
    fs.readFile(file, (err, data) => {
      // Settle the Promise exactly once: reject on error, resolve otherwise.
      if (err) {
        reject(err);
        return;
      }
      resolve(data);
    });
  });
};
readFilePromise('readme.md')
  .then((data) => {
    console.log(data);
  })
  .catch((err) => {
    console.error(err);
  });
The readFilePromise function wraps the callback-based readFile function. Instead of returning the result directly within the callback, it calls resolve or reject to settle the Promise's outcome. Now we can use readFilePromise with then and catch, because it returns a Promise.
Additionally, Node.js provides the util.promisify utility to convert callback-based asynchronous functions into Promise-based ones, reducing boilerplate and improving compatibility when using Promises.
const fs = require('node:fs')
const util = require('util')
const readFile = util.promisify(fs.readFile)
readFile("readme.md")
  .then((data) => {
    console.log(data);
  })
  .catch((err) => {
    console.error(err);
  });
However, Promises are not without shortcomings. Managing many interdependent Promises can become challenging and may drift back toward something resembling "callback hell," and handling errors in long chains also takes care. Recognizing these challenges, Async/Await was introduced.
Async/Await is an effort to write asynchronous code as if it were synchronous. With Async/Await, we can write asynchronous code in a simpler and more understandable way.
Async/Await is built on Promises for handling asynchronous tasks. We mark a function with the async keyword and then use the await keyword inside that function to wait for the result of a Promise.
For example, we can write an asynchronous function to read a file like this:
const fs = require('node:fs/promises');

async function readFileAsync(filename) {
  try {
    const data = await fs.readFile(filename);
    console.log(data);
  } catch (err) {
    console.error(err);
  }
}

readFileAsync('readme.md');
Async/Await significantly improves code readability and understanding, abstracting away the complexity of Promises while making debugging easier.
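For comparison, the chained then style from earlier can be written step by step with await. The two steps below are placeholders invented for illustration:

```javascript
// Hypothetical async steps, standing in for handleFile1, handleFile2, ...
const doubleIt = async (n) => n * 2;
const addOne = async (n) => n + 1;

async function run() {
  // Each await pauses this function (not the whole thread) until the
  // Promise settles, so the steps read top to bottom like sync code.
  const doubled = await doubleIt(10);
  const result = await addOne(doubled);
  console.log(result); // prints 21
  return result;
}

run().catch((err) => console.error(err));
```

A single try/catch around the awaits would play the role of the chain's final catch, which is a large part of why error handling feels more natural in this style.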
I've worked closely with JavaScript since 2017 and learned to handle asynchrony mostly through Promises and Async/Await. Callbacks felt foreign and inconvenient, which made me wonder why people and libraries still offered callback-based asynchronous APIs. Only by delving into the past did those doubts become clearer: callbacks are gradually being replaced by Promises. Even the "father" of Node.js has admitted that the lack of Promise support in early Node.js contributed to the messiness of its asynchronous APIs today.
Callbacks, Promises, and finally Async/Await—JavaScript has undergone significant evolution in asynchronous handling. Node.js's non-blocking I/O model has brought many benefits to application development. Callbacks are the primitive approach, while Promises and Async/Await offer simpler syntax and more readable code. Each method has its advantages and disadvantages, so it's not necessary to exclusively use one of them.
Hello, my name is Hoai - a developer who tells stories through writing ✍️ and creating products 🚀. With many years of programming experience, I have contributed to various products that bring value to users at my workplace as well as to myself. My hobbies include reading, writing, and researching... I created this blog with the mission of delivering quality articles to the readers of 2coffee.dev. Follow me through these channels: LinkedIn, Facebook, Instagram, Telegram.