Top 25+ Node.js Interview Questions & Answers | Your Personalized AI Guide

20 Must-Know Node.js Interview Questions Explained

Q1: What is Node.js, and what are its main features?

Node.js is a JavaScript runtime built on Chrome's V8 JavaScript engine. It allows developers to run JavaScript on the server-side. Its main features include:

  • Non-blocking I/O: Uses asynchronous programming, making it efficient for handling multiple operations simultaneously.
  • Event-driven architecture: Relies on events and callbacks to handle actions, which improves performance.
  • Single-threaded: Uses a single-threaded model with event looping, enabling handling of concurrent requests without multiple threads.
  • Cross-platform: Can run on various operating systems like Windows, Linux, and macOS.
  • NPM (Node Package Manager): Provides a vast ecosystem of libraries and modules.
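
As a quick illustration of the non-blocking I/O model listed above, here is a minimal sketch (assuming a file named file.txt exists): the read is handed off to the system, the script keeps running, and the callback fires later when the data is ready.

const fs = require('fs');

// The read is started asynchronously; execution does not wait for it
fs.readFile('file.txt', 'utf8', (err, data) => {
  if (err) return console.error(err);
  console.log('File contents:', data);
});

console.log('This line runs before the file contents are logged');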

Q2: Explain the concept of Event Loop in Node.js.

The Event Loop is a fundamental concept in Node.js that enables non-blocking I/O operations. It works as follows:

  • Call Stack: Executes functions sequentially.
  • Callback Queue: Stores asynchronous callbacks.
  • Event Loop: Continuously checks the call stack and callback queue, pushing callbacks from the queue to the stack for execution when the stack is empty.

This mechanism allows Node.js to handle many operations simultaneously without blocking the main thread.
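
A small sketch of this ordering: the synchronous logs run directly on the call stack, while the setTimeout callback waits in the callback queue until the stack is empty.

console.log('First');   // runs on the call stack

setTimeout(() => {
  console.log('Third');  // queued callback; the event loop runs it once the stack is empty
}, 0);

console.log('Second');   // still on the call stack, so it runs before the timeout callback

// Output: First, Second, Third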

Q3: What is the purpose of the require function in Node.js?

The require function is used to load modules in Node.js. It allows you to include built-in Node.js modules, third-party modules, and your own custom modules. Example usage:

const http = require('http'); // Built-in module
const express = require('express'); // Third-party module
const myModule = require('./myModule'); // Custom module
  • Caching: Modules are cached after the first load, so subsequent require calls are faster.
  • Single Instance: Returns the same instance of a module every time it's required.
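
A quick sketch of the caching behavior, assuming a hypothetical local module ./myModule.js: both calls return the exact same object.

const a = require('./myModule');
const b = require('./myModule');

console.log(a === b); // true – the second call is served from the module cache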

Q4: Describe the difference between module.exports and exports in Node.js.

Both module.exports and exports are used to export modules, but there are key differences:

1. module.exports: The actual object that is returned as the module. You can assign a new object or function to it.

module.exports = function() {
  console.log("Hello World");
};

2. exports: A shorthand reference to module.exports. You can add properties to it, but you cannot reassign it.

exports.sayHello = function() {
  console.log("Hello World");
};
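
One common pitfall worth illustrating: reassigning exports breaks its link to module.exports, so nothing is exported. A minimal sketch:

// myModule.js
exports = function() {          // NOT exported – exports now points to a new object
  console.log('Lost');
};

module.exports = function() {   // exported correctly – this is what require() returns
  console.log('Hello World');
};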

Q5: What are Streams in Node.js, and what are their types?

Streams are objects that allow you to read data from a source or write data to a destination in a continuous manner. They handle data piece by piece, which makes them efficient for handling large amounts of data. Types of streams in Node.js include:

  • Readable: Used for reading data (e.g., fs.createReadStream()).
  • Writable: Used for writing data (e.g., fs.createWriteStream()).
  • Duplex: Both readable and writable (e.g., net.Socket()).
  • Transform: A type of duplex stream that can modify or transform data as it is written and read (e.g., zlib.createGzip()); see the piping sketch after the example below.

Example of using a readable stream:

const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
readableStream.on('data', function(chunk) {
  console.log(chunk);
});
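
Streams can also be chained with pipe(); a short sketch that sends a readable stream through a gzip Transform stream into a writable stream (the file names are illustrative):

const fs = require('fs');
const zlib = require('zlib');

// Readable -> Transform (gzip) -> Writable
fs.createReadStream('file.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('file.txt.gz'));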

Q6: What is the role of the package.json file in a Node.js project?

The package.json file is a crucial part of any Node.js project. It serves several purposes:

  • Metadata: Contains metadata about the project, such as its name, version, description, and author.
  • Dependencies: Lists the project dependencies, specifying which packages and versions are required.
  • Scripts: Defines scripts that can be run with npm, such as start, test, and build scripts.
  • Configuration: Provides a place for configuration options for various modules.
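
An abridged, illustrative package.json tying these pieces together (the name, scripts, and version numbers are placeholders):

{
  "name": "my-app",
  "version": "1.0.0",
  "description": "Example Node.js project",
  "author": "Jane Doe",
  "scripts": {
    "start": "node index.js",
    "test": "mocha"
  },
  "dependencies": {
    "express": "^4.17.1"
  },
  "devDependencies": {
    "mocha": "^10.0.0"
  }
}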

Q7: How does middleware work in Express.js?

Middleware in Express.js consists of functions that execute during the lifecycle of a request to the server. They can perform various tasks, such as logging, authentication, parsing, and more. Middleware functions have access to the request object (req), the response object (res), and the next middleware function in the application's request-response cycle.

Example of middleware:

const express = require('express');
const app = express();

const myMiddleware = (req, res, next) => {
  console.log('Middleware executed');
  next(); // Pass control to the next middleware
};

app.use(myMiddleware);
app.get('/', (req, res) => {
  res.send('Hello World');
});

app.listen(3000, () => {
  console.log('Server running on port 3000');
});

Q8: Explain the difference between process.nextTick() and setImmediate().

Both process.nextTick() and setImmediate() are used to schedule callbacks in Node.js, but they operate differently:

  • process.nextTick(): Schedules a callback to be invoked in the current iteration of the event loop, before any I/O operations.
process.nextTick(() => {
  console.log('Executed in the current iteration');
});
  • setImmediate(): Schedules a callback to be executed in the next iteration of the event loop, after I/O events.
setImmediate(() => {
  console.log('Executed in the next iteration');
});

Q9: What is the purpose of the buffer module in Node.js?

The buffer module in Node.js is used to handle binary data. Buffers are fixed-size chunks of memory outside the V8 heap. They are primarily used for:

  • Manipulating binary data: Allows reading and writing of binary data streams.
  • Handling file I/O: Works efficiently with file and network I/O operations.
  • Encoding conversions: Converts between different encodings, such as UTF-8, Base64, and more.

Example of creating a buffer:

const buffer = Buffer.from('Hello World');
console.log(buffer); // <Buffer 48 65 6c 6c 6f 20 57 6f 72 6c 64>
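
Building on the encoding-conversion bullet above, a short sketch of converting the same data between encodings:

const buf = Buffer.from('Hello World');

console.log(buf.toString('utf8'));   // 'Hello World'
console.log(buf.toString('base64')); // 'SGVsbG8gV29ybGQ='
console.log(buf.toString('hex'));    // '48656c6c6f20576f726c64'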

Q10: What are the different ways to handle asynchronous code in Node.js?

Asynchronous code in Node.js can be handled in several ways:

  • Callbacks: Functions passed as arguments to be executed after a certain operation.
fs.readFile('file.txt', (err, data) => {
  if (err) throw err;
  console.log(data);
});
  • Promises: Objects representing the eventual completion or failure of an asynchronous operation.
fs.promises.readFile('file.txt')
  .then(data => console.log(data))
  .catch(err => console.error(err));
  • Async/Await: Syntactic sugar built on top of promises, making asynchronous code look synchronous.
async function readFile() {
  try {
    const data = await fs.promises.readFile('file.txt');
    console.log(data);
  } catch (err) {
    console.error(err);
  }
}
readFile();

Q11: What are the different types of modules in Node.js?

Node.js modules can be categorized into three types:

  • Core Modules: Built-in modules that come with Node.js, such as http, fs, path, url, etc. They can be included without specifying a path.
const http = require('http');
  • Local Modules: Custom modules created by the user within their project. They are included by specifying a relative path.
const myModule = require('./myModule');
  • Third-party Modules: Modules that are available through npm and can be installed using the npm package manager. They are included by specifying the module name.
const express = require('express');

Q12: Explain how you can manage dependencies in a Node.js project.

Managing dependencies in a Node.js project involves using the Node Package Manager (NPM) or Yarn. Here's how you can manage dependencies:

  1. Initializing a Project: Start by creating a package.json file to keep track of your project dependencies and metadata.
npm init
  2. Installing Dependencies: Use npm install (or npm i) to install dependencies. This command adds the dependencies to the node_modules directory and updates the package.json and package-lock.json files.
npm install express
  3. Saving Dependencies: By default, dependencies are saved in the dependencies section of package.json. Use the --save-dev flag to save them as development dependencies.
npm install mocha --save-dev
  4. Updating Dependencies: Keep your dependencies up-to-date by running the update command.
npm update
  5. Removing Dependencies: Remove a dependency using the npm uninstall command, which also updates the package.json file.
npm uninstall express
  6. Versioning: Specify the version of a dependency in the package.json file. Use semantic versioning to manage versions.
"dependencies": {
  "express": "^4.17.1"
}
  7. Global vs Local Dependencies:
  • Local Dependencies: Installed within the project directory and listed in package.json.
  • Global Dependencies: Installed system-wide and used across multiple projects.
npm install -g nodemon
  8. Lock Files:
  • package-lock.json: Ensures consistent dependency versions across different installations by locking the exact versions of dependencies.

By managing dependencies effectively, you ensure that your Node.js project has all the necessary modules and their correct versions, making it easier to maintain and deploy.

Q13: What is the purpose of the cluster module in Node.js?

The cluster module allows the creation of child processes (workers) that share the same server port. This helps in taking advantage of multi-core systems, improving the performance and scalability of Node.js applications. Each worker can handle incoming requests, balancing the load across multiple processes.

Example of using the cluster module:

const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;

if (cluster.isMaster) { // cluster.isPrimary is the preferred name in Node 16+; isMaster remains a deprecated alias
  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
  cluster.on('exit', (worker, code, signal) => {
    console.log(`Worker ${worker.process.pid} died`);
  });
} else {
  http.createServer((req, res) => {
    res.writeHead(200);
    res.end('Hello World');
  }).listen(8000);
}

Q14: What is the purpose of the path module in Node.js, and how do you use it?

The path module in Node.js provides utilities for working with file and directory paths. It is used to handle and transform file paths across different operating systems.

Common methods in the path module include:

  • path.join([...paths]): Joins multiple path segments into one path.
const path = require('path');
const fullPath = path.join('/users', 'john', 'docs', 'file.txt');
console.log(fullPath); // Output: '/users/john/docs/file.txt'
  • path.resolve([...paths]): Resolves a sequence of paths into an absolute path.
const absolutePath = path.resolve('users', 'john', 'docs', 'file.txt');
console.log(absolutePath); // Output: '/current/working/directory/users/john/docs/file.txt'
  • path.basename(path, [ext]): Returns the last portion of a path.
const fileName = path.basename('/users/john/docs/file.txt');
console.log(fileName); // Output: 'file.txt'
  • path.dirname(path): Returns the directory name of a path.
const dirName = path.dirname('/users/john/docs/file.txt');
console.log(dirName); // Output: '/users/john/docs'
  • path.extname(path): Returns the extension of the path.
const extName = path.extname('/users/john/docs/file.txt');
console.log(extName); // Output: '.txt'

The path module ensures that the file paths are handled correctly, avoiding issues caused by different operating system path conventions.

Q15: What is the difference between synchronous and asynchronous functions in Node.js?

Synchronous Functions: These functions block the execution of the code until they complete. They run sequentially and are simpler but can be inefficient for I/O operations.

const data = fs.readFileSync('file.txt', 'utf8');
console.log(data); // This will block until the file is read

Asynchronous Functions: These functions do not block the execution. They run in the background and use callbacks, promises, or async/await to handle the results. They are more efficient for I/O operations and allow the program to continue executing other tasks while waiting for the operation to complete.

fs.readFile('file.txt', 'utf8', (err, data) => {
  if (err) throw err;
  console.log(data); // This will execute after the file is read
});

Q16: What is the purpose of the eventEmitter module in Node.js?

The EventEmitter class in Node.js is used to handle asynchronous events. It allows objects to emit named events and register listener functions that respond to those events. It is provided by the built-in events module.

Example of using eventEmitter:

const EventEmitter = require('events');
class MyEmitter extends EventEmitter {}

const myEmitter = new MyEmitter();
myEmitter.on('event', () => {
  console.log('An event occurred!');
});
myEmitter.emit('event'); // Outputs: An event occurred!

Q17: What is the difference between Promises and Observables in Angular?

Both Promises and Observables are used to handle asynchronous operations, but they have some key differences:

1. Promises:

  • Single Value: A Promise resolves or rejects a single value.
  • Eager Execution: A Promise starts executing immediately when it is created.
  • Not Cancellable: Once created, a Promise cannot be cancelled.
  • API Methods: then(), catch(), finally().
const promise = new Promise((resolve, reject) => {
  // asynchronous operation
});

promise.then(value => {
  // handle resolved value
}).catch(error => {
  // handle error
});

2. Observables:

  • Multiple Values: An Observable can emit multiple values over time.
  • Lazy Execution: An Observable does not start emitting values until it is subscribed to.
  • Cancellable: Observables can be cancelled using unsubscribe() .
  • API Methods: subscribe(), unsubscribe(), operators like map, filter, merge.
import { Observable } from 'rxjs';

const observable = new Observable(observer => {
  // asynchronous operation
  observer.next(value);
  observer.complete();
});

const subscription = observable.subscribe({
  next(value) {
    // handle emitted value
  },
  error(err) {
    // handle error
  },
  complete() {
    // handle completion
  }
});

// To cancel the subscription
subscription.unsubscribe();

Q18: What is the use of process.env in Node.js?

process.env is an object in Node.js that contains the user environment variables. It is used to access environment variables in a Node.js application. This is useful for storing configuration details such as database credentials, API keys, and other sensitive information outside the source code.

Example of using process.env:

// Set environment variables (usually done in a .env file or server configuration)
process.env.DB_HOST = 'localhost';
process.env.DB_USER = 'root';
process.env.DB_PASS = 'password';

// Access environment variables
const dbHost = process.env.DB_HOST;
const dbUser = process.env.DB_USER;
const dbPass = process.env.DB_PASS;

console.log(`Database host: ${dbHost}`);
console.log(`Database user: ${dbUser}`);
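
In practice, variables are supplied from outside the process and read with a fallback; a small sketch (the PORT variable and default are illustrative):

// Started as: PORT=8080 node app.js
const port = process.env.PORT || 3000; // fall back to 3000 if PORT is not set
console.log(`Listening on port ${port}`);

Many projects also load variables from a local .env file using the third-party dotenv package (require('dotenv').config()).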

Q19: Explain the concept of microservices and how Node.js is suited for it.

Microservices is an architectural style that structures an application as a collection of small, loosely coupled, independently deployable services. Each service is responsible for a specific functionality and communicates with other services over a network, typically through HTTP/REST or messaging protocols.

Node.js is well-suited for microservices because:

  • Lightweight and Fast: Its non-blocking I/O and event-driven architecture make it ideal for handling multiple requests efficiently.
  • Modular: Node.js encourages modular code, making it easier to split functionality into microservices.
  • Scalable: Can easily scale horizontally by adding more instances.
  • Large Ecosystem: The vast number of available modules and tools, like Express, make it easy to develop microservices.
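
As a rough sketch of the idea (service names, ports, and routes are illustrative, and Node 18+ is assumed for the global fetch), two tiny Express services communicating over HTTP:

// orders-service.js – a small, independently deployable service
const express = require('express');
const ordersApp = express();

ordersApp.get('/orders/:userId', (req, res) => {
  res.json([{ id: 1, userId: req.params.userId, total: 42 }]);
});

ordersApp.listen(4001);

// users-service.js – a separate service that calls the orders service over HTTP
const usersApp = require('express')();

usersApp.get('/users/:id/orders', async (req, res) => {
  const response = await fetch(`http://localhost:4001/orders/${req.params.id}`);
  res.json({ userId: req.params.id, orders: await response.json() });
});

usersApp.listen(4000);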

Q20: How do you create a RESTful API using Express in Node.js?

Creating a RESTful API using Express involves defining routes for different HTTP methods (GET, POST, PUT, DELETE) and implementing the corresponding handlers.

Example of creating a simple RESTful API:

const express = require('express');
const app = express();
app.use(express.json()); // Middleware to parse JSON request bodies

// In-memory data store
let items = [{ id: 1, name: 'Item 1' }];

// GET all items
app.get('/items', (req, res) => {
  res.json(items);
});

// GET an item by ID
app.get('/items/:id', (req, res) => {
  const item = items.find(i => i.id === parseInt(req.params.id));
  if (!item) return res.status(404).send('Item not found');
  res.json(item);
});

// POST a new item
app.post('/items', (req, res) => {
  const item = { id: items.length + 1, name: req.body.name };
  items.push(item);
  res.status(201).json(item);
});

// PUT update an item by ID
app.put('/items/:id', (req, res) => {
  const item = items.find(i => i.id === parseInt(req.params.id));
  if (!item) return res.status(404).send('Item not found');
  item.name = req.body.name;
  res.json(item);
});

// DELETE an item by ID
app.delete('/items/:id', (req, res) => {
  const itemIndex = items.findIndex(i => i.id === parseInt(req.params.id));
  if (itemIndex === -1) return res.status(404).send('Item not found');
  items.splice(itemIndex, 1);
  res.status(204).send();
});

app.listen(3000, () => {
  console.log('Server running on port 3000');
});

10 Node.js Coding Tasks | Practical Interview Preparation

Q1: Write a Node.js script to create a simple HTTP server that responds with "Hello World" when accessed.
(Basic)

const http = require('http');

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('Hello World\n');
});

server.listen(3000, '127.0.0.1', () => {
  console.log('Server running at http://127.0.0.1:3000/');
});

Use the Node.js http module to create a server that responds with "Hello World".

Q2: Given the following code, identify and fix the error.
(Basic)

const fs = require('fs');

fs.readFile('nonexistentfile.txt', (err, data) => {
  if (err) throw err;
  console.log(data);
});

Throwing the error directly inside the callback raises an uncaught exception that can crash the application. Instead, handle the error gracefully.

Corrected code:


const fs = require('fs');

fs.readFile('nonexistentfile.txt', (err, data) => {
  if (err) {
    console.error('Error reading file:', err);
    return;
  }
  console.log(data.toString());
});

Q3: Optimize the following code to improve performance by reducing synchronous operations.
(Intermediate)

const fs = require('fs');

const data1 = fs.readFileSync('file1.txt');
const data2 = fs.readFileSync('file2.txt');
console.log(data1);
console.log(data2);

Use asynchronous operations to read the files concurrently, improving performance.

Optimized code:


const fs = require('fs');

fs.readFile('file1.txt', (err, data1) => {
  if (err) throw err;
  console.log(data1.toString());
});

fs.readFile('file2.txt', (err, data2) => {
  if (err) throw err;
  console.log(data2.toString());
});

Q4: What would be the output of the following code?
(Intermediate)

console.log('Start');

setTimeout(() => {
  console.log('Timeout');
}, 0);

Promise.resolve().then(() => {
  console.log('Promise');
});

console.log('End');

The output will be:

Start
End
Promise
Timeout

Explanation: The console.log('Start') and console.log('End') statements run synchronously first. The Promise.resolve().then(...) callback runs before the setTimeout(..., 0) callback because promise callbacks are queued as microtasks, which are drained before macrotasks such as timer callbacks.

Q5: Write a Node.js script that reads a JSON file and logs the content to the console.
(Intermediate)

const fs = require('fs');

fs.readFile('data.json', 'utf8', (err, data) => {
  if (err) {
    console.error('Error reading file:', err);
    return;
  }
  try {
    const jsonData = JSON.parse(data);
    console.log(jsonData);
  } catch (err) {
    console.error('Error parsing JSON:', err);
  }
});

Q6: Debug the following code and explain why it does not work as expected.
(Advanced)

const express = require('express');
const app = express();

app.get('/', (req, res) => {
  res.send('Hello World');
});

app.listen(3000, () => {
  console.log('Server running on port 3000');
});

app.get('/error', (req, res) => {
  throw new Error('This is a forced error');
});

app.use((err, req, res, next) => {
  console.error(err.stack);
  res.status(500).send('Something broke!');
});

The error-handling middleware (the function with four arguments) must be registered after all routes and other middleware so that Express can forward errors from them to it. As a general practice, register all routes and middleware before the app.listen() call.

Corrected code:


const express = require('express');
const app = express();

app.get('/', (req, res) => {
  res.send('Hello World');
});

app.get('/error', (req, res) => {
  throw new Error('This is a forced error');
});

app.use((err, req, res, next) => {
  console.error(err.stack);
  res.status(500).send('Something broke!');
});

app.listen(3000, () => {
  console.log('Server running on port 3000');
});

Q7: Improve the performance of the following code that reads a large file and processes its content.
(Advanced)

const fs = require('fs');

fs.readFile('largefile.txt', 'utf8', (err, data) => {
  if (err) {
    console.error('Error reading file:', err);
    return;
  }
  processFileContent(data);
});

function processFileContent(content) {
  // Perform processing on the file content
  console.log('File processed');
}

Use a readable stream to process the file content in chunks, which is more memory-efficient for large files.

Optimized code:


const fs = require('fs');

const readStream = fs.createReadStream('largefile.txt', 'utf8');

readStream.on('data', (chunk) => {
  processFileContent(chunk);
});

readStream.on('end', () => {
  console.log('File processed');
});

readStream.on('error', (err) => {
  console.error('Error reading file:', err);
});

function processFileContent(content) {
  // Perform processing on the file content
  console.log('Processing chunk:', content);
}

Q8: Write a Node.js script that connects to a MongoDB database, inserts a document, and retrieves all documents from a collection.
(Advanced)

const { MongoClient } = require('mongodb');

async function run() {
  const url = 'mongodb://localhost:27017';
  const dbName = 'mydatabase';
  const client = new MongoClient(url);

  try {
    await client.connect();
    console.log('Connected successfully to server');
    const db = client.db(dbName);
    const collection = db.collection('documents');

    // Insert a document
    const insertResult = await collection.insertOne({ name: 'John Doe', age: 30 });
    console.log('Inserted document id:', insertResult.insertedId); // insertOne returns the new document's _id

    // Find all documents
    const findResult = await collection.find({}).toArray();
    console.log('Found documents:', findResult);
  } catch (err) {
    console.error('Error:', err);
  } finally {
    await client.close();
  }
}

run().catch(console.dir);

Q9: What would be the output of the following code, and why?
(Advanced)

console.log('Start');

setImmediate(() => {
  console.log('Immediate');
});

process.nextTick(() => {
  console.log('Next Tick');
});

console.log('End');

The output will be:

Start
End
Next Tick
Immediate

Explanation: console.log('Start') and console.log('End') run synchronously first. process.nextTick() callbacks run as soon as the current operation completes, before the event loop continues, so Next Tick comes next. setImmediate() callbacks run in the check phase of the next event loop iteration, so Immediate is last.

Q10: Optimize the following code to handle multiple asynchronous operations using Promise.all.
(Advanced)

const fs = require('fs');

fs.readFile('file1.txt', 'utf8', (err, data1) => {
  if (err) {
    console.error('Error reading file1:', err);
    return;
  }
  fs.readFile('file2.txt', 'utf8', (err, data2) => {
    if (err) {
      console.error('Error reading file2:', err);
      return;
    }
    console.log('File 1 content:', data1);
    console.log('File 2 content:', data2);
  });
});

Use Promise.all to handle multiple asynchronous operations concurrently.

Optimized code:


const fs = require('fs').promises;

Promise.all([
  fs.readFile('file1.txt', 'utf8'),
  fs.readFile('file2.txt', 'utf8')
])
  .then(([data1, data2]) => {
    console.log('File 1 content:', data1);
    console.log('File 2 content:', data2);
  })
  .catch(err => {
    console.error('Error reading files:', err);
  });


Overview of Node.js

What is Node.js?

Node.js is an open-source, cross-platform JavaScript runtime built on Chrome's V8 engine. It lets developers run JavaScript outside the browser, most commonly for building servers, APIs, and command-line tools.

What is the history and latest trends in Node.js development?

What are some of the popular frameworks and libraries associated with Node.js?

  • Express.js: A minimal and flexible Node.js web application framework that provides a robust set of features for web and mobile applications.
  • Koa.js: A new web framework designed by the team behind Express, aiming to be a smaller, more expressive, and more robust foundation for web applications and APIs.
  • NestJS: A framework for building efficient, reliable, and scalable server-side applications.
  • Socket.io: A library that enables real-time, bidirectional, and event-based communication between web clients and servers.
  • Hapi.js: A rich framework for building applications and services, known for its powerful plugin system and fine-grained control over request handling.

What are the use cases of Node.js?

  • Web Servers: Creating and managing web servers to handle HTTP requests.
  • APIs: Developing RESTful and GraphQL APIs.
  • Real-Time Applications: Building real-time applications such as chat applications, live updates, and collaborative tools.
  • Microservices: Implementing microservices architecture due to its lightweight and efficient nature.
  • Server-Side Scripting: Using JavaScript for server-side operations, enabling full-stack development with a single language.

What are some of the tech roles associated with expertise in Node.js?

  • Node.js Developer: Focuses on building and maintaining server-side applications using Node.js.
  • Backend Developer: Specializes in server-side logic and integration, often using Node.js for building APIs and services.
  • Full-Stack Developer: Utilizes Node.js for backend development along with front-end technologies.
  • DevOps Engineer: Manages infrastructure and deployment processes, frequently working with Node.js applications.
  • API Developer: Specializes in creating and maintaining APIs, often using Node.js for its efficiency and scalability.

What pay package can be expected with experience in Node.js?


Source: Velvet Jobs & Talent

  • Junior Node.js Developer: Typically earns between $60,000 and $80,000 per year.
  • Mid-Level Node.js Developer: Generally earns from $80,000 to $110,000 per year.
  • Senior Node.js Developer: Often earns between $100,000 and $140,000 per year.
  • Full-Stack Developer with Node.js expertise: Generally earns between $90,000 and $120,000 per year.
  • Backend Developer with Node.js expertise: Typically earns between $85,000 and $110,000 per year.