
Node.js File System (fs) 2026: Complete Guide to Reading, Writing & Managing Files

The file system module (fs) is one of the most heavily used Node.js built-in modules. Whether you are reading configuration files, processing uploads, writing logs, or building a static site generator, you need fs. This guide covers everything from basic reads and writes to advanced patterns like streams, file watching, and atomic writes that you will use in production Node.js applications.

This lesson builds directly on the Node.js basics guide. Make sure you understand how Node.js handles asynchronous operations and the async/await pattern before continuing.

The Three fs APIs

Node.js provides three flavors of the file system module. Understanding when to use each is fundamental:

// 1. Promise-based (RECOMMENDED for most code)
import fs from 'node:fs/promises';
const data = await fs.readFile('config.json', 'utf-8');

// 2. Callback-based (legacy, used in older codebases)
import fs from 'node:fs';
fs.readFile('config.json', 'utf-8', (err, data) => {
  if (err) throw err;
  console.log(data);
});

// 3. Synchronous (only for startup/scripts)
import fs from 'node:fs';
const data = fs.readFileSync('config.json', 'utf-8');

Rule of thumb: Use fs/promises by default. Use synchronous methods only during application startup (loading config before the server starts) or in CLI tools. Never use synchronous methods in request handlers — they block the entire event loop and freeze your server.
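
For a concrete picture of that split, here is a minimal sketch (the file names and the config shape are illustrative assumptions):

import { readFileSync } from 'node:fs';
import fs from 'node:fs/promises';
import http from 'node:http';

// Startup: synchronous is fine, nothing else is running yet
const config = JSON.parse(readFileSync('config.json', 'utf-8'));

// Request handling: always asynchronous so the event loop stays free
const server = http.createServer(async (req, res) => {
  const page = await fs.readFile('index.html', 'utf-8');
  res.end(page);
});

server.listen(config.port);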

Reading Files

Reading files is the most common operation. The encoding parameter determines whether you get a string or a Buffer.

import fs from 'node:fs/promises';

// Read as text (specify encoding)
const text = await fs.readFile('readme.md', 'utf-8');
console.log(text); // String content

// Read as binary (no encoding = Buffer)
const binary = await fs.readFile('image.png');
console.log(binary); // <Buffer 89 50 4e 47 ...>
console.log(binary.length); // File size in bytes

// Read JSON files
const raw = await fs.readFile('package.json', 'utf-8');
const pkg = JSON.parse(raw);
console.log(pkg.name, pkg.version);

// Since Node.js 22: import JSON directly
// import pkg from './package.json' with { type: 'json' };

// Handle missing files gracefully
async function readFileOrDefault(filePath, defaultValue) {
  try {
    return await fs.readFile(filePath, 'utf-8');
  } catch (err) {
    if (err.code === 'ENOENT') return defaultValue; // File not found
    throw err; // Re-throw unexpected errors
  }
}

const config = await readFileOrDefault('config.json', '{}');

The ENOENT error code means “Error NO ENTry” — the file does not exist. Always check err.code rather than parsing error messages, because error messages can change between Node.js versions.

Writing Files

Node.js provides several methods for writing data to files, each suited to different use cases.

import fs from 'node:fs/promises';

// Write a file (creates or overwrites)
await fs.writeFile('output.txt', 'Hello, World!');

// Write with specific encoding
await fs.writeFile('data.csv', 'name,age\nChirag,28\n', 'utf-8');

// Write JSON
const data = { users: [{ name: 'Chirag', role: 'admin' }] };
await fs.writeFile('data.json', JSON.stringify(data, null, 2));

// Append to a file (creates if missing)
await fs.appendFile('access.log',
  `[${new Date().toISOString()}] GET /api/users 200\n`
);

// Write binary data
const buffer = Buffer.from([0x48, 0x65, 0x6c, 0x6c, 0x6f]);
await fs.writeFile('binary.dat', buffer);

// Write with flags
await fs.writeFile('important.txt', 'data', {
  flag: 'wx' // 'wx' = write exclusive: fails if file exists
});

Atomic Writes

In production, a crash during writeFile can corrupt the file (partially written data). The atomic write pattern prevents this:

import fs from 'node:fs/promises';
import path from 'node:path';
import crypto from 'node:crypto';

async function atomicWrite(filePath, data) {
  // Write to a temporary file first
  const tmpPath = filePath + '.' + crypto.randomBytes(6).toString('hex');
  await fs.writeFile(tmpPath, data);

  // Rename is atomic on most file systems
  await fs.rename(tmpPath, filePath);
}

// Usage: even if the process crashes, the original file is never corrupted
await atomicWrite('config.json', JSON.stringify(config, null, 2));

Directory Operations

import fs from 'node:fs/promises';
import path from 'node:path';

// Create a directory
await fs.mkdir('logs');

// Create nested directories
await fs.mkdir('data/backups/2026/may', { recursive: true });

// List directory contents
const entries = await fs.readdir('./src');
console.log(entries); // ['config.js', 'index.js', 'utils'] (names only, no trailing slashes)

// List with file types
const detailed = await fs.readdir('./src', { withFileTypes: true });
for (const entry of detailed) {
  const type = entry.isDirectory() ? 'DIR' : 'FILE';
  console.log(`${type}: ${entry.name}`);
}

// Recursively list all files
async function* walkDir(dir) {
  const entries = await fs.readdir(dir, { withFileTypes: true });
  for (const entry of entries) {
    const fullPath = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      yield* walkDir(fullPath);
    } else {
      yield fullPath;
    }
  }
}

// Usage
for await (const file of walkDir('./src')) {
  console.log(file);
}

// Remove an empty directory
await fs.rmdir('empty-dir');

// Remove a directory and all contents
await fs.rm('old-project', { recursive: true, force: true });

The recursive walkDir generator is one of the most useful patterns in Node.js file handling. It uses async generators to lazily yield file paths without loading all entries into memory at once.

The Path Module

Never concatenate file paths with + or template literals. The path module handles cross-platform differences automatically.

import path from 'node:path';

// Join path segments safely
path.join('users', 'chirag', 'documents', 'file.txt');
// Linux: 'users/chirag/documents/file.txt'
// Windows: 'users\chirag\documents\file.txt'

// Resolve to absolute path
path.resolve('src', 'index.js');
// '/home/chirag/project/src/index.js'

// Extract parts of a path
const filePath = '/home/chirag/project/src/utils.js';
path.dirname(filePath);   // '/home/chirag/project/src'
path.basename(filePath);  // 'utils.js'
path.extname(filePath);   // '.js'
path.basename(filePath, '.js'); // 'utils' (without extension)

// Parse into an object
path.parse(filePath);
// { root: '/', dir: '/home/chirag/project/src',
//   base: 'utils.js', ext: '.js', name: 'utils' }

// Get __dirname equivalent in ES modules
import { fileURLToPath } from 'node:url';
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);

File Metadata & Permissions

import fs from 'node:fs/promises';

const stats = await fs.stat('server.js');

console.log(stats.size);          // Size in bytes
console.log(stats.mtime);         // Last modified
console.log(stats.birthtime);     // Created
console.log(stats.isFile());      // true
console.log(stats.isDirectory()); // false
console.log(stats.isSymbolicLink()); // false

// Check if file exists
async function fileExists(filePath) {
  try {
    await fs.access(filePath);
    return true;
  } catch {
    return false;
  }
}

// Change permissions (Unix)
await fs.chmod('script.sh', 0o755); // rwxr-xr-x

// Copy files
await fs.copyFile('source.txt', 'backup.txt');

// Create symbolic links
await fs.symlink('target.txt', 'link.txt');

// Rename / move files
await fs.rename('old-name.txt', 'new-name.txt');
await fs.rename('file.txt', 'archive/file.txt'); // Move

Streams Deep Dive

When files are too large to fit in memory (or when you want efficient real-time processing), use streams.

import fs from 'node:fs';
import { pipeline } from 'node:stream/promises';
import { Transform } from 'node:stream';

// Read stream: processes file in chunks
const readStream = fs.createReadStream('access.log', {
  encoding: 'utf-8',
  highWaterMark: 64 * 1024, // 64KB chunks (default)
});

// Write stream
const writeStream = fs.createWriteStream('errors.log');

// Custom transform stream: filter error lines
const errorFilter = new Transform({
  transform(chunk, encoding, callback) {
    // Note: this simple version assumes log lines do not span chunk boundaries
    const lines = chunk.toString().split('\n');
    const errors = lines
      .filter(line => line.includes('ERROR'))
      .join('\n');
    if (errors) this.push(errors + '\n');
    callback();
  }
});

// Pipeline: read → filter → write
await pipeline(readStream, errorFilter, writeStream);
console.log('Error log extracted');

// Stream a file as HTTP response
import http from 'node:http';

const server = http.createServer((req, res) => {
  if (req.url === '/download') {
    res.writeHead(200, {
      'Content-Type': 'application/octet-stream',
      'Content-Disposition': 'attachment; filename="data.csv"',
    });
    fs.createReadStream('large-data.csv').pipe(res);
  }
});

Using pipeline() instead of .pipe() is critical — it properly handles errors and destroys streams when something goes wrong. A leaked stream from .pipe() without error handling can cause memory leaks that crash your server hours or days later.
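
Applied to the HTTP download handler above, the same advice looks roughly like this (a sketch, assuming the same large-data.csv file):

import fs from 'node:fs';
import http from 'node:http';
import { pipeline } from 'node:stream/promises';

const server = http.createServer(async (req, res) => {
  if (req.url === '/download') {
    res.writeHead(200, {
      'Content-Type': 'application/octet-stream',
      'Content-Disposition': 'attachment; filename="data.csv"',
    });
    try {
      // pipeline() destroys both streams and rejects if either side fails
      await pipeline(fs.createReadStream('large-data.csv'), res);
    } catch (err) {
      // Headers are already sent, so just log and tear down the connection
      console.error('Download failed:', err);
      res.destroy();
    }
  }
});

server.listen(3000);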

Watching Files for Changes

File watching enables live-reload, auto-compile, and monitoring features.

import fs from 'node:fs/promises';

// Modern API: fs.watch (recursive supported since Node 19+)
const watcher = fs.watch('./src', { recursive: true });

for await (const event of watcher) {
  console.log(`${event.eventType}: ${event.filename}`);
  // 'change: index.js'
  // 'rename: new-file.js' (also fires on create/delete)
}

// With debouncing (avoid duplicate events)
let timeout;
const watcher2 = fs.watch('./src', { recursive: true });

for await (const event of watcher2) {
  clearTimeout(timeout);
  timeout = setTimeout(() => {
    console.log(`Rebuilding after: ${event.filename}`);
    rebuild();
  }, 100);
}

File watching has quirks across operating systems — macOS, Linux, and Windows each handle filesystem events differently. For production use, consider libraries like chokidar that normalize these differences. The built-in fs.watch has improved significantly in recent Node.js versions but still has edge cases.

Real-World Patterns

Rotating Log Files

import fs from 'node:fs/promises';

async function rotateLog(logPath, maxSize = 5 * 1024 * 1024) {
  try {
    const stats = await fs.stat(logPath);
    if (stats.size > maxSize) {
      const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
      await fs.rename(logPath, `${logPath}.${timestamp}`);
      await fs.writeFile(logPath, ''); // Fresh log file
    }
  } catch (err) {
    if (err.code !== 'ENOENT') throw err;
  }
}

Safe Config File Loading

import fs from 'node:fs/promises';

async function loadConfig(configPath) {
  const raw = await fs.readFile(configPath, 'utf-8');
  const config = JSON.parse(raw);

  // Validate required fields
  const required = ['port', 'database', 'secret'];
  for (const key of required) {
    if (!(key in config)) {
      throw new Error(`Missing required config: ${key}`);
    }
  }

  return Object.freeze(config); // Prevent accidental mutation
}

Processing CSV Line by Line

import { createReadStream } from 'node:fs';
import { createInterface } from 'node:readline';

async function processCSV(filePath) {
  const rl = createInterface({
    input: createReadStream(filePath),
    crlfDelay: Infinity,
  });

  let lineNum = 0;
  let headers = [];

  for await (const line of rl) {
    if (lineNum === 0) {
      headers = line.split(',');
    } else {
      const values = line.split(',');
      const row = Object.fromEntries(
        headers.map((h, i) => [h.trim(), values[i]?.trim()])
      );
      // Process each row
      console.log(row);
    }
    lineNum++;
  }
  console.log(`Processed ${lineNum - 1} rows`);
}

Common Pitfalls

Path traversal vulnerabilities: If your server reads files based on user input, an attacker could request ../../etc/passwd. Always sanitize paths and resolve them relative to a safe base directory.

import path from 'node:path';

const SAFE_DIR = '/app/uploads';

function safePath(userInput) {
  const resolved = path.resolve(SAFE_DIR, userInput);
  // Compare against the base directory plus a separator so that
  // sibling paths like '/app/uploads-evil' are also rejected
  if (resolved !== SAFE_DIR && !resolved.startsWith(SAFE_DIR + path.sep)) {
    throw new Error('Path traversal detected');
  }
  return resolved;
}

Race conditions with file existence checks: Checking if a file exists before reading it creates a race condition — the file could be deleted between the check and the read. Instead, just try the operation and handle the error.
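
A minimal sketch of the difference, reusing fs from node:fs/promises and the fileExists helper above (data.json is an illustrative file name):

// Racy: data.json can disappear between the check and the read
if (await fileExists('data.json')) {
  const raw = await fs.readFile('data.json', 'utf-8'); // may still throw ENOENT
}

// Better: attempt the read and handle the error, as in readFileOrDefault above
let raw;
try {
  raw = await fs.readFile('data.json', 'utf-8');
} catch (err) {
  if (err.code !== 'ENOENT') throw err;
  raw = '{}'; // fall back to a default
}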

Not closing file handles: When using fs.open(), always close the handle in a finally block. Leaked file handles eventually hit the OS limit and crash your application.
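
A sketch of that discipline, using fs.open() to read just the first bytes of a file:

import fs from 'node:fs/promises';

async function readHeader(filePath) {
  const handle = await fs.open(filePath, 'r');
  try {
    const buffer = Buffer.alloc(64);
    const { bytesRead } = await handle.read(buffer, 0, 64, 0);
    return buffer.subarray(0, bytesRead);
  } finally {
    await handle.close(); // Runs even if read() throws
  }
}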

Forgetting to handle encoding: Reading without specifying encoding returns a Buffer, not a string. This is a common source of bugs when you expect readFile to return text: the Buffer prints as <Buffer 48 65 ...> in logs and fails string comparisons. Either pass 'utf-8' to readFile or decode the Buffer yourself.
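
A quick sketch of both fixes, assuming a text file named notes.txt:

import fs from 'node:fs/promises';

const buf = await fs.readFile('notes.txt');             // Buffer
const text1 = buf.toString('utf-8');                    // Decode explicitly
const text2 = await fs.readFile('notes.txt', 'utf-8');  // Or ask for a string up front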

The Node.js file system module is your gateway to building CLI tools, static site generators, log processors, file-based databases, and any application that interacts with the local file system. Master these patterns and you can handle any file-related task that comes your way.
