History of Node.js

Introduction to Node.js:

Node.js is a runtime environment that lets you run JavaScript code outside of a web browser.

Ryan Dahl created it in 2009, and the Node.js community has maintained it ever since.

The Birth of Node.js:

Ryan Dahl was tired of dealing with the limitations of web browsers and the inefficiencies of server-side scripting languages, so he decided to create Node.js. He wanted a way to build scalable network applications that could handle a large number of connections with high throughput.

- The First Release of Node.js:

Node.js was released in 2009. It was based on the V8 JavaScript engine developed by Google for its Chrome web browser. The first version of Node.js shipped with a limited set of core modules, including an HTTP server, a file system module, and a module for working with streams.

- The Rise of Node.js:

Node.js became popular quickly because it could handle a large number of concurrent connections with high throughput. It was also popular because it allowed developers to use JavaScript on the server-side, which was already familiar to many front-end developers. As Node.js gained popularity, the Node.js community grew, and many new modules and frameworks were developed to extend its capabilities.

Node.js Today:

Node.js is now a mature and stable platform used by many companies to build high-performance network applications. It has a large and active community of developers who continue to develop new modules and frameworks to extend its capabilities. Node.js is often used in combination with other technologies, such as databases, front-end frameworks, and cloud platforms, to build full-stack web applications.

Non-Blocking I/O

Node.js is known for its efficient and scalable I/O model, which is based on a non-blocking, event-driven architecture.

This means that I/O operations are handled asynchronously and do not block the execution of the main program. In contrast to traditional blocking I/O models, where the program waits for each I/O operation to complete before continuing to the next one, Node.js can execute multiple I/O operations simultaneously.

How does it achieve that?

Here are some key points that explain how Node.js achieves non-blocking I/O:

Event Loop

  • 💡 Node.js uses an event loop to manage I/O operations. The event loop is a loop that listens for events and triggers the corresponding callback functions.
  • 💡 When an I/O operation is initiated, Node.js registers the corresponding callback function with the event loop. Once the operation completes, the event loop triggers the callback function, which can then process the result.
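
The ordering the bullets describe is easy to see with `setTimeout`, which hands a callback to the event loop:

```javascript
console.log("start")

// setTimeout registers the callback with the event loop; it runs only
// after the current synchronous code finishes, even with a 0ms delay.
setTimeout(() => console.log("callback from the event loop"), 0)

console.log("end")
// Output order: start, end, callback from the event loop
```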

Callbacks

  • 💡 Callbacks are functions that are passed as arguments to other functions. In Node.js, callbacks are used to handle the results of I/O operations.
  • 💡 When an I/O operation is initiated, a callback function is provided that will be called when the operation completes.
    • By using callbacks, Node.js can execute other code while waiting for I/O operations to complete.

Non-Blocking APIs

  • 💡 Node.js provides a set of non-blocking APIs that allow developers to perform I/O operations asynchronously.
    • ➡️ For example, the fs module provides non-blocking methods for reading and writing files. When these methods are called, Node.js immediately returns control to the program, allowing it to continue executing other code.
    • ➡️ Once the I/O operation completes, the callback function is called with the result.

Single-Threaded

  • 🔥 Node.js runs your JavaScript on a single thread, with I/O work delegated to the operating system (and, for some operations, a small internal thread pool). This allows Node.js to handle large numbers of simultaneous connections without spawning a thread per connection. In contrast, traditional thread-per-connection servers, common in classic Python and Ruby deployments, tend to consume more memory and pay context-switching overhead under heavy concurrency.

Example

Blocking code example:

const getUserSync = (userId) => {
  const users = {
    1: { name: "John", age: 35 },
    2: { name: "Jane", age: 28 },
  }
  return users[userId]
}
 
const user = getUserSync(1)
console.log(user)

In this example, the getUserSync function returns a user object from a hardcoded list of users. This function is blocking, because it executes synchronously and returns the result immediately.

Non-blocking code example:

const getUserAsync = (userId, callback) => {
  const users = {
    1: { name: "John", age: 35 },
    2: { name: "Jane", age: 28 },
  }
  setTimeout(() => {
    callback(users[userId])
  }, 1000)
}
 
getUserAsync(1, (user) => {
  console.log(user)
})
console.log("This will run first")

In this example, the getUserAsync function returns a user object from a hardcoded list of users, but it executes asynchronously, using the setTimeout function to delay the execution of the callback function by 1 second. The getUserAsync function takes a callback function as its second argument, which is called with the user object once it has been retrieved.

Installing node

There are many ways to install Node.js, but by far the best is nvm, which stands for Node Version Manager.

Head to the nvm docs, install it, and you're good to go.

Hello World

Let’s get started with writing our very first Node.js program. But, what is a Node.js program? Simple, some JavaScript!

console.log("hello world")

Ok, this is cool, but now what? How do we go about executing this code? Save it in a file called index.js, then use the Node CLI. In your terminal, run:

node index.js

Using the Node CLI, we can execute a JavaScript file by passing the path to that file. That's it! You've created your first Node.js program. If you did it right, you'll see hello world in the terminal, which in Node.js acts as the console.

Browser vs. Node.js

JavaScript is a popular programming language used in both the browser and server-side applications. However, there are significant differences between how it works in the browser and in Node.js.

Global Object

In a browser, the global object is window, while in Node.js, it is global. For example, to log the global object in a browser, we can use:

console.log(window)

To do the same in Node.js, we use:

console.log(global)
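
One detail worth knowing: globalThis, standardized in ES2020, is an alias that works in both environments, so cross-environment code doesn't have to check for window vs. global:

```javascript
// globalThis points at window in the browser and global in Node.js.
console.log(globalThis === global) // prints: true (in Node.js)
```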

Modules

We can import modules in the browser using script tags with the type attribute set to module and the src attribute set to the path of the module file. For example:

<script type="module" src="./module.js"></script>

We can then use the exported functions in our JavaScript code. For instance, we can import a sayHello function from a module called module.js and use it in our main JavaScript file as follows:

import { sayHello } from "./module.js"
sayHello()

In Node.js, on the other hand, we use the require function or the import statement directly, with no script tag needed:

import { sayHello } from "./module.js"

DOM

The browser has a Document Object Model (DOM) that allows us to interact with HTML elements. For example, to change the text of an HTML element in the browser, we can use:

document.getElementById("elementId").innerHTML = "New text"

However, in Node.js, there is no DOM, so we cannot access or manipulate HTML elements.

Server vs. Website

Node.js is mainly used for server-side applications, while the browser is used for websites. For example, we can create a simple server in Node.js using:

const http = require("http")
 
const server = http.createServer((req, res) => {
  res.write("Hello World!")
  res.end()
})
 
server.listen(3000)

On the other hand, in the browser, we can create a website using HTML, CSS, and JavaScript.

Console

The console object works the same way in both the browser and Node.js. For example, to log a message in the browser, we can use:

console.log("Hello World!")

Similarly, in Node.js, we can use:

console.log("Hello World!")

JavaScript is used in both the browser and Node.js, but there are significant differences in how it works in each environment.

However, there are also many similarities, and if you already know JavaScript, you should be able to quickly pick up Node.js. Understanding the differences and similarities between the two environments is crucial when developing applications in either of them.

Node REPL

Node.js is a popular open-source server environment built on Chrome’s V8 JavaScript engine.

One of the most useful tools provided by Node.js is the REPL (Read-Eval-Print-Loop), which allows you to execute JavaScript code interactively in a terminal.

The Node REPL is similar to a command-line interface for JavaScript. It is an interactive environment where you can enter JavaScript code and see the results immediately. It is great for testing out small pieces of code or experimenting with new features.

To start the Node REPL, simply open your terminal and type node. This will give you access to the Node REPL prompt where you can start typing your JavaScript code.

Here are some examples of how to use the Node REPL:

// Basic arithmetic
> 2 + 2
4
 
// String manipulation
> 'hello, world'.toUpperCase()
'HELLO, WORLD'
 
// Defining a variable
> var x = 10
undefined
> x
10
 
// Using a function
> function add(a, b) { return a + b }
undefined
> add(3, 5)
8

The Node REPL is a great tool for quickly testing out ideas, debugging code, or experimenting with new features. It can also be useful for prototyping code before integrating it into a larger project.

However, the Node REPL is not a substitute for a proper development environment. It is not designed for writing large or complex code, and it lacks many of the features and tools that are available in a full-fledged IDE or text editor.

Overall, the Node REPL is a useful tool for any JavaScript developer, but it should be used in tandem with a proper development environment to ensure the best results.

Process and Environment

Process

In Node.js, the process object is a global object that provides information about the current Node.js process and allows developers to interact with it. Some of the most commonly used properties and methods of the process object are:

  • ➡️ process.argv
    • an array that contains the command line arguments passed to the current process
  • ➡️ process.pid
    • the ID of the current process
  • ➡️ process.env
    • an object that contains the environment variables of the current process
  • ➡️ process.exit()
    • terminates the current process with an optional exit code
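
A quick way to see these properties in action (the exact values vary per machine and run):

```javascript
// Inspect the current process. pid and argv always exist; which
// environment variables are set depends on your shell.
console.log("pid:", process.pid)
console.log("script:", process.argv[1])
console.log("PATH set?", "PATH" in process.env)
```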

Here’s an example of how to use the process.argv property to get the command line arguments passed to a Node.js script:

// script.js
console.log(process.argv)

If we run this script with the command node script.js arg1 arg2, the output will be:

["/path/to/node", "/path/to/script.js", "arg1", "arg2"]

This shows that the process.argv array contains the path to the Node.js executable, the path to the script being run, and the two arguments passed to the script.

Environment

The environment in Node.js refers to the set of variables that are available to a program at runtime. These variables are stored in the process.env object, which is an object containing key-value pairs of environment variable names and values.

Here’s an example of how to use the process.env object to access environment variables:

// script.js
console.log(process.env.NODE_ENV)

If we run this script with the command NODE_ENV=production node script.js, the output will be:

production

This shows that we can access the value of the NODE_ENV environment variable using the process.env object.

Conclusion

In conclusion, understanding the process and environment in Node.js is crucial for building scalable and high-performance applications. By using the process object and environment variables, developers can access and manipulate the runtime environment of their programs.

What is a CLI

What is it

A Command-Line Interface (CLI) is a text-based interface that allows users to interact with a computer program or operating system by typing in text commands. A CLI is a powerful tool for developers and system administrators to perform various tasks quickly and efficiently.

What are examples for a CLI command

Using CLI commands in bash is easy. For example, to list the contents of a directory, you can use the ls command. To change directories, you can use the cd command. To remove a file, you can use the rm command. These commands can also be combined with flags and arguments to perform more specific actions.

Flags

Flags are options that modify the behavior of a command.

For example, the -a flag can be used with the ls command to show hidden files. Arguments, on the other hand, are values or inputs that are provided to a command.

For example, the mkdir command requires an argument that specifies the name of the directory to be created.

Creating a CLI in Node.js is a popular choice for developers. It allows you to create a custom CLI that can be used to perform specific tasks. You can use third-party libraries such as commander and yargs to create a CLI in Node.js. These libraries provide an easy way to parse arguments and flags and execute commands.

When creating a CLI, you need to decide whether it should be installed globally or locally. A global installation allows the CLI to be used from any directory on your system, while a local installation requires the user to be in the directory where the CLI is installed.

Installing CLIs is easy. You can use the npm command to install a CLI from the Node.js package manager. For example, to install the create-react-app CLI globally, you can use the following command:

npm install -g create-react-app

CLI basics

We’re now ready to create the CLI for our note taking app. First thing we’re going to do is create a new Node.js project, we can do this by using the npm cli:

npm init

The init command will create a package.json file for us on the root of the directory in which we ran the command. Don’t worry about npm and the package.json, we’ll get to those soon enough.

Next, let's create the file where we'll write our JavaScript for the CLI. Create a file called index.js on the root if you don't have one already. In this file, for now, just add a console.log. We'll create the logic for allowing a user to add a new note to the app. So we need a few things:

  1. A command to use in the terminal
  2. The ability for that command to accept arguments and flags
  3. Using the value of that argument to create a new note

Let’s start by creating a command we can use in our terminal. To do that, we have to register a command name in a package.json:

{
  "name": "note",
  "bin": {
    "note": "./index.js"
  }
}

Under the bin field, we can use pretty much any name we want. This name will end up being the command we use in the terminal as our CLI, so pick a good name! We'll just use note. Notice we add the file path to the js file we created. That tells Node.js that we want to execute that file when the note command is run. We also need to augment that file with a hint for our machine. Our computer might have many language runtimes installed, like Python or Ruby, so it won't know which one should be used to execute our program. We want Node.js to execute our js file. So at the very top, as the first line of our JS file, we add:

#!/usr/bin/env node

This is a hashbang, also known as a shebang. A hashbang is a special comment placed at the top of a script that tells the operating system which interpreter to use when running the script. In the context of a Node.js CLI app, the hashbang tells the machine to use Node.js as the interpreter to execute the JS file.

Next, we need to install this CLI on our machine so we can test it out. We could actually install this globally like it’s a 3rd party module, but instead we’re going to create a symlink.

A symlink, or symbolic link, is a special type of file that acts as a reference to another file or directory. In the context of a Node.js CLI app, creating a symlink allows us to run the CLI from anywhere in our file system, just like a globally installed third-party module. This means that we can run our note command from any directory on our machine, without having to navigate to the directory where the CLI code is stored. Creating a symlink is done using the npm link command, which creates a symlink from the globally installed npm package to the locally stored code. So run the command:

npm link

We should now be able to run the note command in our terminal and see our log output. Try it out!

note

Note logic

Now that we have a working CLI, let's start creating the logic for our note taking app! For now, we want to be able to accept some input from the terminal as a new note. Later we'll figure out what to do with that input, but for now, let's just make sure we can read it. To do so, we can tap into the process object. So in our index.js file:

// index.js
const note = process.argv[2]
const newNote = {
  id: Date.now(),
  note,
}
console.log("your new note", newNote)

The above code assumes anything you place after the note command will be the note. It then takes that note and creates a new object and logs it. Simple. So the command would be:

note "this is my note"

Notice I put the note in quotes so the terminal doesn't treat each word as a new argument; instead, the whole string is one argument.

This is cool, but we need some more features here. We need to be able to save the notes somewhere, and we also need more flexibility in how we create notes: adding tags, a name, a status. Being able to list all notes and even remove a note would be useful too. So we'll need more commands and flags for our CLI. We could parse this ourselves, but there are some handy 3rd party modules we can install that will make this easier. What is a module, you ask? We're covering that next!

Modules in node

What is a module

In Node.js, a module is a self-contained piece of code that performs a specific task. It can be a function, an object, or a piece of functionality that can be used in other parts of your application.

Module types

There are three types of modules in Node.js: internal, user-created, and third-party modules from npm.

Internal Modules

Internal modules are built into Node.js and can be accessed using the require function without any additional installation. For example, the built-in http module is used to create a web server:

const http = require("http")
 
http
  .createServer(function (req, res) {
    res.writeHead(200, { "Content-Type": "text/plain" })
    res.end("Hello World!")
  })
  .listen(8080)

In this example, we use the require() function to load the http module and use it to create a web server that listens on port 8080.

User-created Modules

User-created modules are modules that you create yourself and can be included in your application using the require function. For example, you can create a module that exports a function that adds two numbers:

// math.js
 
function add(a, b) {
  return a + b
}
 
module.exports = add

And then use it in another file:

// index.js
 
const add = require("./math")
 
console.log(add(2, 3)) // output: 5

In this example, we create a module called math.js that exports a function called add. We then use the require() function to load the add function from the math.js module and use it to add two numbers.

Third-party Modules

Third-party modules from npm are modules that are created by other developers and can be downloaded from the npm registry using the npm package manager. For example, you can use the axios module to make HTTP requests:

const axios = require("axios")
 
axios
  .get("https://jsonplaceholder.typicode.com/users")
  .then(function (response) {
    console.log(response.data)
  })
  .catch(function (error) {
    console.log(error)
  })

In this example, we use the require() function to load the axios module and use it to make an HTTP GET request to the jsonplaceholder API.

Exporting Modules

To export a module from a file, you need to use the module.exports or export keyword. This allows you to make the module available to other parts of your application.

To be able to use the ES module import/export syntax, you must add "type": "module" to your package.json configuration:

{
  "name": "intro-node",
  ...
  "type": "module",
   ...
}

CommonJS is the other, older way of importing and exporting modules.

For example, you can create a module that exports an object:

// logger.js
 
const logger = {
  log: function (message) {
    console.log(message)
  },
}
 
module.exports = logger

And then use it in another file:

// index.js
 
const logger = require("./logger")
 
logger.log("Hello, world!") // output: Hello, world!

In this example, we create a module called logger.js that exports an object with a log function. We then use the require() function to load the logger.js module and use it to log a message.

Importing Modules

In the latest versions of Node.js, you can also use the import statement to import a module. This statement is similar to the require() function but has some differences in syntax and behavior. For example, you can import a named export:

// math.js
// named export
export function add(a, b) {
  return a + b
}
 
// default export (only one per file)
export default [1, 2, 3]

And then use it in another file:

// index.js
// named import
import { add } from "./math.js"
// default import (any name works)
import anyName from "./math.js"
 
console.log(add(2, 3)) // output: 5

In this example, we create a module called math.js that exports a named function called add. We then use the import statement to load the add function from the math.js module and use it to add two numbers.

Conclusion

Modules are an essential part of Node.js development, and understanding how to create, import, and export them is crucial to building efficient and scalable applications.

Require vs. Import

In Node.js, require and import are two ways to include external modules in your code. Both of them serve the same purpose, but they have some differences that are worth noting.

Require

require is a built-in function in Node.js that allows you to load external modules. It works synchronously, which means that it blocks the execution of the rest of the code until the module is loaded. Here is an example:

const fs = require("fs")
const data = fs.readFileSync("file.txt", "utf8")
console.log(data)

In this example, we are loading the fs module, which provides a way to interact with the file system. We then use the readFileSync method to read the contents of a file called file.txt. Finally, we log the contents to the console.

Import

import is a newer way to include external modules in your code. It is part of the ES6 (ECMAScript 2015) module specification. Unlike require, static imports are resolved before your code runs, and the module system loads them asynchronously. Here is an example:

import fs from "fs/promises"
async function readFile() {
  const data = await fs.readFile("file.txt", "utf8")
  console.log(data)
}
readFile()

In this example, we are using the import statement to load the fs/promises module, which provides a way to interact with the file system using promises. We then define an async function called readFile, which uses the readFile method to read the contents of a file called file.txt. Finally, we log the contents to the console.
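
Note that a static import like the one above is resolved before any of the file's code runs. The explicitly promise-based form is dynamic import(), which can be called anywhere. A small sketch (this assumes an ES module context, and file.txt is a throwaway file we create for the demo):

```javascript
import { writeFileSync } from "node:fs"

// Create the file so the example is self-contained.
writeFileSync("file.txt", "hello")

// import() returns a promise that resolves to the module's exports.
import("node:fs/promises").then(async (fsPromises) => {
  const data = await fsPromises.readFile("file.txt", "utf8")
  console.log(data) // prints: hello
})
```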

Exporting

In addition to loading external modules, you may also want to export your own code as a module that can be used by other parts of your application. There are several ways to do this, depending on the module system you are using.

CommonJS

CommonJS is the module system used by Node.js. To export a module in CommonJS, you can use the module.exports or exports objects. Here is an example:

// module.js
function hello(name) {
  console.log(`Hello, ${name}!`)
}
module.exports = { hello }
 
// app.js
const myModule = require("./module")
myModule.hello("world")

In this example, we define a function called hello in a module called module.js. We then export the function using module.exports. In app.js, we load the module using require, and we call the hello function.

ES6 Modules

ES6 modules are a newer module system supported natively by modern browsers and by current versions of Node.js (no flag needed when "type": "module" is set in package.json or the file uses the .mjs extension). To export a module in ES6, you can use the export keyword. Here is an example:

// module.js
function hello(name) {
  console.log(`Hello, ${name}!`)
}
export { hello }
 
// app.js
import { hello } from "./module.js"
hello("world")

In this example, we define a function called hello in a module called module.js. We then export the function using the export keyword. In app.js, we load the module using import, and we call the hello function.

Conclusion

In conclusion, require and import are two ways to load external modules in Node.js. They have some differences in their syntax and behavior, but they both serve the same purpose. Additionally, there are different ways to export your own code as a module, depending on the module system you are using.

Thinking in modules

When developing Node.js apps, it's essential to think of modules as separate pieces of functionality that can be easily maintained and reused.

By dividing your code into smaller, self-contained modules, you can create more modular, maintainable, reusable, and efficient code.

What should be a module?

Each module should have a clear and specific purpose, and should not be overly complex or tightly coupled to other modules. This means that you should aim to create modules that are focused on a single task or responsibility, and that can be used in different parts of the app or in different apps altogether.

For example, a module that handles database connections should only be responsible for managing the connection to the database, and should not be responsible for any other tasks, such as data validation or business logic. This allows the module to be reused across different parts of the app or in different apps, without having to modify its code.

On the other hand, a module that handles user authentication should only be responsible for authenticating users, and should not be responsible for any other tasks, such as database access or UI rendering. This allows the module to be easily maintainable, as any changes to the authentication logic can be made without affecting other parts of the app.

Best export and import patterns💡

When exporting from a module, use the export keyword followed by the name of the function, variable, or class you want to export. You can also use export default to export a single value from a module.

When importing a module, use the import keyword followed by the names of the exports you need in curly braces, and then the path to the module. This allows you to selectively import only the functions, variables, or classes that you need from a module, and to avoid polluting the global namespace of your app.

For example:

// modules/myModule.js
export function myFunction() {
  // ...
}
 
export const myVariable = 42
 
// app.js
import { myFunction, myVariable } from "./modules/myModule.js"

Index.js pattern🔥

It’s common to use an index.js file inside a folder to export several modules at once. This allows you to group related modules together, and to provide a clean and simple interface for importing them.

For example, suppose you have two modules, myModule1 and myModule2, that you want to export from a folder called modules. You can create an index.js file inside the modules folder that exports both modules:

// modules/myModule1.js
export function myFunction1() {
  // ...
}
 
// modules/myModule2.js
export function myFunction2() {
  // ...
}
 
// modules/index.js
export { myFunction1 } from "./myModule1.js"
export { myFunction2 } from "./myModule2.js"
 
// app.js
import { myFunction1, myFunction2 } from "./modules/index.js"
// or grab everything under one namespace
import * as utils from "./modules/index.js"

By using the index.js pattern, you can simplify your import statements and make your code more readable.

In conclusion, thinking in modules is a fundamental concept in Node.js development that can help you create more modular, maintainable, and reusable code. By following these guidelines, you can write code that is easier to understand, easier to modify, and easier to scale.

Internal Modules

Internal modules in Node.js refer to built-in modules that are available within the Node.js environment.

These modules are part of the Node.js core and provide a range of functionalities and utilities that can be used in your application without the need to install any external packages.

There are several internal modules available in Node.js, including but not limited to:

  • 🔥 fs:
    • ➡️ used for file system operations
  • 🔥 http:
    • ➡️ used for creating HTTP servers and clients
  • 🔥 path:
    • ➡️ used for working with file paths
  • 🔥 os:
    • ➡️ used for retrieving operating system-related information
  • 🔥 crypto:
    • ➡️ used for cryptographic operations

To use these modules in your Node.js application, you need to import them using the require() function. For example, to use the fs module to read a file in your application, you would write:

const fs = require("fs")
fs.readFile("file.txt", (err, data) => {
  if (err) throw err
  console.log(data)
})

Starting from Node.js version 13, you can also use ES6 module syntax to import internal modules.

import fs from "fs"

Recent versions of Node.js also let you be explicit about importing core, internal modules like so:

import fs from "node:fs"

Using the node: prefix doesn’t change the behavior of the import, it’s just a way to be explicit about this module being a core module and not a 3rd party one.

In summary, internal modules in Node.js are built-in modules that provide a range of functionalities and utilities. You can import them using either the require() function or ES6 module syntax, depending on your Node.js version and the type of module you want to import.

NPM and 3rd party modules

In Node.js, third-party modules are available for use in our code to extend its functionalities. We can manage these modules using Node Package Manager (NPM). NPM is a command-line interface that allows us to install, update, and remove packages.

To start, we need to create a package.json file in our project directory. This file contains metadata about our project and the dependencies required to run it. We can create this file using the command npm init and following the prompts.

Once we have our package.json file set up, we can start installing packages using the command npm install <package-name>. This command installs the package and saves it to our project's node_modules folder. It also adds the package as a dependency to our package.json file.

We can also install packages globally using the -g flag. These packages are installed in a global directory and are available to all projects. However, it is recommended to install packages locally to avoid version compatibility issues.

Running scripts

To run scripts using NPM, we can define them in our package.json file under the scripts field.

For example, we can define a script to start our server using the command node server.js by adding the following line to our package.json file:

"scripts": {
  "start": "node server.js"
}
 

We can then run this script using the command npm run start.

To import modules in our code, we can use the require() function or, in ES module projects, the import statement. For example, if we want to use the express module, we can import it using the following code:

import express from "express"

This imports the express module and assigns it to a variable named express. We can then use the functionalities provided by the express module in our code.

In summary, NPM is a powerful tool that allows us to manage third-party modules in our Node.js projects. We can install packages, manage dependencies, run scripts, and import modules using NPM. With these functionalities, we can extend the capabilities of our Node.js applications and make them more efficient and effective.

Using yargs 📝

We're now going to install and use a 3rd party module to help us create our note taking app. It's called yargs.

npm i yargs

Once you install it, create a src folder with a commands.js file and let’s add some commands.

import yargs from "yargs"
import { hideBin } from "yargs/helpers"
 
yargs(hideBin(process.argv))
  .command(
    "new <note>",
    "create a new note",
    (yargs) => {
      return yargs.positional("note", {
        describe: "The content of the note you want to create",
        type: "string",
      })
    },
    async (argv) => {},
  )
  .option("tags", {
    alias: "t",
    type: "string",
    description: "tags to add to the note",
  })
  .command(
    "all",
    "get all notes",
    () => {},
    async (argv) => {},
  )
  .command(
    "find <filter>",
    "get matching notes",
    (yargs) => {
      return yargs.positional("filter", {
        describe: "The search term to filter notes by, will be applied to note.content",
        type: "string",
      })
    },
    async (argv) => {},
  )
  .command(
    "remove <id>",
    "remove a note by id",
    (yargs) => {
      return yargs.positional("id", {
        type: "number",
        description: "The id of the note you want to remove",
      })
    },
    async (argv) => {},
  )
  .command(
    "web [port]",
    "launch website to see notes",
    (yargs) => {
      return yargs.positional("port", {
        describe: "port to bind on",
        default: 5000,
        type: "number",
      })
    },
    async (argv) => {},
  )
  .command(
    "clean",
    "remove all notes",
    () => {},
    async (argv) => {},
  )
  .demandCommand(1)
  .parse()

We’re using yargs to set up the different commands we can use from our CLI. For example, the new command can be used like this:

note new "this is my new note"

Right now, the handler function for each command is empty; next, we need to make each one actually do something.
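As a hedged sketch of what one of those handlers might do (the db.json filename, the note shape, and the helper names getDb and newNote are illustrative assumptions, not part of the original app), the new command could persist notes to a local JSON file:

```javascript
import fs from "node:fs/promises"

// All names below (DB_PATH, getDb, newNote) are illustrative assumptions.
const DB_PATH = "db.json"

// Read the database file, falling back to an empty list if it doesn't exist yet.
async function getDb() {
  try {
    return JSON.parse(await fs.readFile(DB_PATH, "utf8"))
  } catch {
    return { notes: [] }
  }
}

// Create a note object, append it, and write the file back.
async function newNote(content, tags = []) {
  const db = await getDb()
  const note = { id: Date.now(), content, tags }
  db.notes.push(note)
  await fs.writeFile(DB_PATH, JSON.stringify(db, null, 2))
  return note
}
```

Inside the yargs command, the handler would then call something like newNote(argv.note, argv.tags ? argv.tags.split(",") : []).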

Async operation in node

Asynchronous code is an important concept in Node.js. It allows the program to run non-blocking code, which means that other operations can be performed while waiting for a task to complete.

Callbacks

Callbacks are a traditional way of handling asynchronous code in Node.js. A callback function is passed as an argument to a function that performs an asynchronous operation. When the operation is complete, the callback function is executed.

fs.readFile("file.txt", (err, data) => {
  if (err) {
    console.error(err)
    return
  }
  console.log(data)
})

In the example above, the readFile function reads the contents of file.txt. When the file is read, the callback function is executed. If an error occurs during the operation, the error is passed to the callback function as the first argument.

Promises

Promises provide a cleaner way of handling asynchronous code in Node.js. Promises represent a value that may not be available yet and allow you to chain multiple asynchronous operations together.

const fs = require("fs/promises")
 
fs.readFile("file.txt")
  .then((data) => {
    console.log(data)
  })
  .catch((err) => {
    console.error(err)
  })

In the example above, the readFile function returns a promise. When the promise is resolved, the then function is executed. If the promise is rejected, the catch function is executed.

Async/Await

Async/await is a newer way of handling asynchronous code in Node.js. It provides a more readable and concise way of writing asynchronous code by using the async and await keywords.

const fs = require("fs/promises")
 
async function readFile() {
  try {
    const data = await fs.readFile("file.txt")
    console.log(data)
  } catch (err) {
    console.error(err)
  }
}
 
readFile()

In the example above, the readFile function is marked as async. This allows us to use the await keyword to wait for the readFile function to complete before executing the next line of code. If an error occurs during the operation, it is caught by the try/catch block.

Error Handling

Error handling is an important part of handling asynchronous code in Node.js. In all of the examples above, error handling is performed using a try/catch block or a catch function.

It’s important to handle errors properly to avoid crashing the application. In addition, it’s important to provide meaningful error messages to help with debugging.

The FS module

The FS (File System) module is a built-in module in Node.js that provides an API for interacting with the file system. This module allows developers to perform various file operations such as reading, writing, updating, deleting, and renaming files.

One of the most commonly used methods in the FS module is fs.readFile(), which reads the content of a file asynchronously and passes it to a callback function. Another popular method is fs.writeFile(), which writes data to a file asynchronously.

Other frequently used methods include:

  • 💡 fs.mkdir()
    • ➡️ to create a new directory
  • 💡 fs.readdir()
    • ➡️ to read the contents of a directory
  • 💡 fs.stat()
    • ➡️ to get information about a file
  • 💡 fs.unlink()
    • ➡️ to delete a file
  • 💡 fs.rename()
    • ➡️ to rename a file

The FS module also provides synchronous versions of these methods, which can be useful for simple scripts or small applications where performance is not a major concern. However, it’s generally recommended to use the asynchronous versions of these methods, as they are non-blocking and allow for better performance in larger applications.

Overall, the FS module is a powerful tool for developers working with Node.js, as it enables them to easily interact with the file system and perform a wide range of file operations.

Note: per the Node.js documentation, the callback-based versions of the node:fs module APIs are preferable to the promise APIs when maximal performance (both in execution time and memory allocation) is required.

Directory Operations

Node.js provides a robust set of file system (fs) operations, allowing you to interact with the file system on your machine. Below are directory-related operations, including creating, renaming, moving, deleting, and copying directories.

General Syntax

  • ✅ Importing the File System Module
const fs = require('fs').promises;
  • ✅ Async function declaration
async function exampleFunction() {
  // Directory operation code here
}
  • ✅ Try-catch for error handling
try {
  // Successful operation
} catch (error) {
  // Error handling
}

Commonly Used Options

1. fs.mkdir (Create Directory)

  • Syntax: fs.mkdir(path, [options])
  • Options:
    • recursive: Boolean (default: false). When true, creates parent directories if they do not exist.
    • mode: Integer (default: 0o777). Sets the directory permissions.
async function createDirectory(path) {
  try {
    await fs.mkdir(path, { recursive: true })
    console.log(`Directory created at ${path}`)
  } catch (error) {
    console.error(`Error creating directory: ${error.message}`)
  }
}

2. fs.readdir (Read Directory)

  • Syntax: fs.readdir(path, [options])
  • Options:
    • withFileTypes: Boolean (default: false). When true, the method returns an array of fs.Dirent objects (not just filenames).
async function readDirectory(path) {
  try {
    const files = await fs.readdir(path, { withFileTypes: true })
    files.forEach((file) => {
      console.log(file.name)
    })
  } catch (error) {
    console.error(`Error reading directory: ${error.message}`)
  }
}

3. fs.rmdir (Remove Directory)

  • Syntax: fs.rmdir(path, [options])
  • Options:
    • recursive: Boolean (default: false). When true, performs a recursive directory removal. (Note: the recursive option on fs.rmdir is deprecated in recent Node.js versions; prefer fs.rm(path, { recursive: true }).)
async function deleteDirectory(path) {
  try {
    await fs.rmdir(path, { recursive: true })
    console.log(`Directory deleted at ${path}`)
  } catch (error) {
    console.error(`Error deleting directory: ${error.message}`)
  }
}

4. fs.rename (Rename Directory)

  • Syntax: fs.rename(oldPath, newPath)
  • Options: No additional options for renaming.
async function renameDirectory(oldPath, newPath) {
  try {
    await fs.rename(oldPath, newPath)
    console.log(`Directory renamed from ${oldPath} to ${newPath}`)
  } catch (error) {
    console.error(`Error renaming directory: ${error.message}`)
  }
}

Good practice

  • Use async/await: For cleaner and more readable code.
  • Error Handling: Always implement error handling with try-catch blocks.
  • Check Existence: Use fs.access or fs.stat to check the existence of a directory before performing operations.
  • Path Management: Utilize the path module for handling file paths, ensuring cross-platform compatibility.

Creating Directories

fs.mkdir

  • ➡️ Purpose:
    • To create a new directory.
  • ➡️ Callback style:
    • fs.mkdir(path, [options], callback) for callback style.
  • ➡️ Modern Approach:
    • Use fs.promises.mkdir for a promise-based approach.
Creating directories
import fs from "node:fs/promises"
async function createDirectory(path) {
  try {
    await fs.mkdir(path)
    console.log(`Directory created at ${path}`)
  } catch (error) {
    console.error(`Error creating directory: ${error.message}`)
  }
}
 
createDirectory("./newDirectory")

Renaming Directories

fs.rename

  • Purpose: To rename a directory.
  • Method: fs.rename(oldPath, newPath, callback).
  • Modern Approach: Use fs.promises.rename.
Renaming directory
async function renameDirectory(oldPath, newPath) {
  try {
    await fs.rename(oldPath, newPath)
    console.log(`Directory renamed from ${oldPath} to ${newPath}`)
  } catch (error) {
    console.error(`Error renaming directory: ${error.message}`)
  }
}
 
renameDirectory("./oldDirectory", "./newDirectory")

Deleting Directories

fs.rm

  • Purpose: To remove a file or directory.
  • Method: fs.rm(path, [options], callback).
  • Modern Approach: Use fs.promises.rm with { recursive: true } for directories.
Deleting a directory
async function deleteDirectory(path) {
  try {
    await fs.rm(path, { recursive: true })
    console.log(`Directory deleted at ${path}`)
  } catch (error) {
    console.error(`Error deleting directory: ${error.message}`)
  }
}
 
deleteDirectory("./oldDirectory")

Reading Directories

fs.readdir

  • Purpose: To read the contents of a directory.
  • Method: fs.readdir(path, callback).
  • Modern Approach: Use fs.promises.readdir.
async function readDirectory(path) {
  try {
    const files = await fs.readdir(path)
    console.log(`Contents of ${path}:`, files)
  } catch (error) {
    console.error(`Error reading directory: ${error.message}`)
  }
}
 
readDirectory("./someDirectory")

Moving and Copying Directories

  • Moving: Node.js doesn’t have a built-in method for moving directories. It’s usually done by renaming (if on the same filesystem) or copying to a new location and then deleting the original.
  • Copying: There’s no direct method to copy directories. It involves recursively reading, copying files, and creating directories.
Copying a Directory
const fs = require("fs/promises")
const path = require("path")
 
async function copyDirectory(src, dest) {
  try {
    await fs.mkdir(dest, { recursive: true })
    let entries = await fs.readdir(src, { withFileTypes: true })
 
    for (let entry of entries) {
      let srcPath = path.join(src, entry.name)
      let destPath = path.join(dest, entry.name)
 
      entry.isDirectory()
        ? await copyDirectory(srcPath, destPath)
        : await fs.copyFile(srcPath, destPath)
    }
  } catch (error) {
    console.error(`Error copying directory: ${error.message}`)
  }
}
 
copyDirectory("./sourceDirectory", "./destinationDirectory")

Good Practices

  • Error Handling: Always use try/catch for error handling in async functions.
  • Check Existence: Before performing operations, check if the directory exists to avoid errors.
  • Avoid Blocking the Event Loop: Use non-blocking asynchronous methods.
  • Path Handling: Use the path module to handle file paths, ensuring compatibility across different OS.

File operations

1. Creating Files

fs.writeFile

  • Purpose: To create a new file or replace the contents if the file already exists.
  • Syntax: fs.writeFile(file, data, [options])
  • Options:
    • encoding: String or null (default: 'utf8').
    • mode: Integer (default: 0o666).
    • flag: String (default: 'w').
async function createFile(filePath, content) {
  try {
    await fs.writeFile(filePath, content)
    console.log(`File created at ${filePath}`)
  } catch (error) {
    console.error(`Error creating file: ${error.message}`)
  }
}
 
createFile("example.txt", "Hello, Node.js!")

2. Reading Files

fs.readFile

  • Purpose: To read the contents of a file.
  • Syntax: fs.readFile(path, [options])
  • Options:
    • encoding: String or null (default: null).
    • flag: String (default: 'r').
async function readFile(filePath) {
  try {
    const content = await fs.readFile(filePath, "utf8")
    console.log(`File contents: ${content}`)
  } catch (error) {
    console.error(`Error reading file: ${error.message}`)
  }
}
 
readFile("example.txt")

3. Updating Files

fs.appendFile

  • Purpose: To append data to a file.
  • Syntax: fs.appendFile(path, data, [options])
  • Options: Same as fs.writeFile.
async function appendToFile(filePath, content) {
  try {
    await fs.appendFile(filePath, content)
    console.log(`Appended content to ${filePath}`)
  } catch (error) {
    console.error(`Error appending file: ${error.message}`)
  }
}
 
appendToFile("example.txt", "\nAppended Text")

4. Deleting Files

fs.rm

  • Purpose: To remove a file or directory.
  • Syntax: fs.rm(path, [options])
  • Options:
    • force: Boolean (default: false). When true, ignores the absence of the file.
    • recursive: Boolean (default: false). When true, performs a recursive directory removal.
async function deleteFile(filePath) {
  try {
    await fs.rm(filePath)
    console.log(`File deleted at ${filePath}`)
  } catch (error) {
    console.error(`Error deleting file: ${error.message}`)
  }
}
 
deleteFile("example.txt")

How to Rename Files

Another major part of working with files is renaming an already existing file. Let’s take a look at how that works:

const { rename } = require('fs/promises');
 
async function renameFile(from, to) {
   try {
     await rename(from, to);
     console.log(`Renamed ${from} to ${to}`);
   } catch (error) {
     console.error(`Got an error trying to rename the file: ${error.message}`);
   }
 }
 
renameFile('dummytext.txt', 'changedDummyText.txt');

As you can see, our async function takes two parameters this time: the first one holds the current name of the file we are about to change, and the second one holds the new name we want to use.

We’ve also changed our dummyText.txt to changedDummyText.txt.

How to Delete Files

To delete a file we’ll have to call the unlink method. This time, instead of having two parameters, we’ll have only one: the name, or the path of the file you want to delete.

We’ll also still use our try-catch error handling method to control our errors, and log the name of the file that has been deleted on the console.

const { unlink } = require('fs/promises');
 
async function deleteFile(filePath) {
   try {
     await unlink(filePath);
     console.log(`Deleted ${filePath}`);
   } catch (error) {
     console.error(`Got an error trying to delete the file: ${error.message}`);
   }
 }
 
deleteFile('./dummytext.txt');

Good Practices

  • Use async/await: Ensures cleaner, more readable asynchronous code.
  • Error Handling: Always use try-catch blocks for robust error handling.
  • Data Validation: Validate data before writing to a file.
  • File Existence: Use fs.access or fs.stat to check for the existence of a file before certain operations.
  • Safe File Deletion: Use options like force judiciously to prevent unintended file deletions.

Types of test

In a Node.js application, there are several types of tests that can be written. Understanding the differences between these tests can help you determine which ones to use in different situations. Below are some of the most common types of tests:

Unit Testing

Unit testing is the process of testing individual units or components of code in isolation. This is typically done by creating test cases that focus on specific functions or methods within a module or file. The purpose of unit testing is to identify and fix bugs in small, isolated pieces of code before they can cause larger problems in the application.

Integration Testing

Integration testing is the process of testing how different pieces of code work together. This can involve testing how modules interact with each other, or how an application interacts with a database or external API. The purpose of integration testing is to identify issues that may arise when different parts of the application are combined.

End-to-End (E2E) Testing

End-to-end testing is the process of testing an entire application from start to finish. This includes testing the user interface, as well as the various backend components that make up the application. The purpose of E2E testing is to ensure that the application works as intended in a real-world scenario.

API Testing

API testing is the process of testing the various APIs or endpoints that an application exposes. This can involve testing how the API responds to different inputs, as well as how it interacts with other parts of the application. The purpose of API testing is to ensure that the application’s APIs are working as intended and can handle different types of requests.

Overall, each type of test serves a different purpose and helps ensure that your Node.js application is functioning as intended. By understanding the differences between these tests, you can choose which ones to use in different scenarios and create a more robust testing strategy.

Jest Testing library

Introduction to Jest

Jest is a delightful JavaScript testing framework with a focus on simplicity. It works out of the box for any React project and is widely used due to its ease of setup, excellent mocking capabilities, and minimal configuration.

Why Jest?

  1. Zero Configuration: Jest aims to work out of the box, with minimal setup.
  2. Great Mocking Library: Easy mocking of objects, which is essential for unit testing.
  3. Built-in Coverage Reports: Automatically generates code coverage reports.
  4. Snapshot Testing: Allows testing the change in data structures according to the code changes.
  5. Runs Tests in Parallel: Optimizes performance by running tests in parallel.
  6. Watch Mode: Re-runs tests related to changed files, which is helpful during development.

Getting Started with Jest

Installation

  1. Initialize npm in your project (skip if already done):
npm init
  2. Install Jest:
npm install --save-dev jest

Configuring Jest

For most projects, Jest works without any configuration. However, you can configure Jest by adding a jest section in your package.json file, or by creating a jest.config.js file in your project root.

Example package.json configuration (adding a test script so npm test runs Jest):

{
  "scripts": {
    "test": "jest"
  }
}

Writing Your First Test

Create a file named sum.js:

function sum(a, b) {
  return a + b
}
module.exports = sum

Create a file named sum.test.js:

const sum = require("./sum")
 
test("adds 1 + 2 to equal 3", () => {
  expect(sum(1, 2)).toBe(3)
})

Running Tests

Run your tests using the following command:

npm test

Basic Jest Concepts

Test Suites and Test Cases

  • Test Suites: A collection of related test cases, defined using a describe block.
  • Test Cases: Individual tests, defined using a test or it block.

Matchers

Jest provides “matchers” to let you test values in different ways. Examples include:

  • expect(value).toBe(value): Checks exact equality.
  • expect(value).toEqual(value): Recursively checks the equality of objects and arrays.

Asynchronous Testing

Jest can test asynchronous code by using async/await.

test("async test", async () => {
  await expect(someAsyncFunction()).resolves.toEqual(expectedValue)
})

Mock Functions

Mock functions allow you to test the links between code by erasing the actual implementation, capturing calls to the function, and allowing test-time configuration of return values.
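As a sketch, jest.fn() creates such a mock. This would live in a *.test.js file and run under the Jest runner, not plain node; the function names here are examples, not from the original:

// notification.test.js — runs under Jest
test("notifies the user exactly once", () => {
  // A mock standing in for a real send function.
  const sendEmail = jest.fn().mockReturnValue(true)

  // Code under test would call the mock instead of a real mailer.
  const result = sendEmail("user@example.com", "Welcome!")

  expect(result).toBe(true)
  expect(sendEmail).toHaveBeenCalledTimes(1)
  expect(sendEmail).toHaveBeenCalledWith("user@example.com", "Welcome!")
})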

Snapshot Testing

Snapshot testing is a Jest feature that saves a serialized copy of rendered output and compares it against the stored snapshot on later runs, ensuring your UI does not change unexpectedly.

Best Practices for Using Jest

  1. Organize Tests Properly: Group related tests using describe blocks.
  2. Clear and Concise Test Descriptions: Test descriptions should clearly describe what they are testing.
  3. Test Isolation: Ensure tests do not depend on each other.
  4. Mock External Modules and APIs: Use Jest’s mocking features for external dependencies.
  5. Regularly Update Snapshots: If UI changes are intentional, update snapshots accordingly.