Memory leak in Next.js SSR under bun --bun next start — JSC GC fails to reclaim heap after concurrent requests #29302

@Playys228

Description

What version of Bun is running?

1.3.13 (also reproduced on 1.3.12 and 1.3.11)

What platform is your computer?

Linux 6.19.9-zen1-1-zen Arch Linux, 32 GB RAM

What steps can reproduce the bug?

1. Create the project

mkdir bun-nextjs-memleak && cd bun-nextjs-memleak
bun init -y
bun add next@16.2.3 react@19.2.5 react-dom@19.2.5

2. next.config.ts

import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  reactCompiler: true,
  images: { unoptimized: true },
};

export default nextConfig;

3. app/layout.tsx

export const metadata = { title: "Memleak repro" };

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return <html><body>{children}</body></html>;
}

4. app/products/[brandSlug]/[slug]/page.tsx — Server Component with per-request fetch

// Every request fetches unique JSON (~5-50 KB) from the mock API.
// No React cache() wrapper — keeps the repro minimal.

interface Props {
  params: Promise<{ brandSlug: string; slug: string }>;
}

export default async function ProductPage({ params }: Props) {
  const { brandSlug, slug } = await params;

  const res = await fetch(
    `http://localhost:4000/products/${brandSlug}/${slug}`,
    { cache: "no-store" }
  );
  const product = await res.json(); // ~5-50 KB per product

  return (
    <div>
      <h1>{product.title}</h1>
      <p>{product.description}</p>
      <ul>
        {product.specs?.map((s: { label: string; value: string }, i: number) => (
          <li key={i}>{s.label}: {s.value}</li>
        ))}
      </ul>
    </div>
  );
}

5. mock-api.ts — Lightweight API server that returns unique JSON per URL

// Run with: bun run mock-api.ts
// Returns deterministic but unique ~10-30 KB JSON per {brandSlug, slug} combo

const generateProduct = (brandSlug: string, slug: string) => {
  const id = `${brandSlug}-${slug}`;
  return {
    id,
    title: `Product ${slug} by ${brandSlug}`,
    description: "A".repeat(5000 + (slug.charCodeAt(0) % 20) * 1000), // 5-25 KB string
    price: Math.round(Math.random() * 100000),
    brand: { name: brandSlug, slug: brandSlug },
    images: Array.from({ length: 5 }, (_, i) => `https://example.com/${id}/img${i}.jpg`),
    specs: Array.from({ length: 15 }, (_, i) => ({
      label: `Spec ${i}`,
      value: `Value ${i} for ${id}`,
    })),
    variants: Array.from({ length: 3 }, (_, i) => ({
      id: `${id}-v${i}`,
      name: `Variant ${i}`,
      price: Math.round(Math.random() * 50000),
      isActive: true,
      properties: Array.from({ length: 5 }, (_, j) => ({
        name: `Prop ${j}`,
        value: `Val ${j}`,
      })),
    })),
  };
};

Bun.serve({
  port: 4000,
  fetch(req) {
    const url = new URL(req.url);
    const parts = url.pathname.split("/").filter(Boolean);
    // /products/:brandSlug/:slug
    if (parts[0] === "products" && parts.length === 3) {
      return Response.json(generateProduct(parts[1], parts[2]));
    }
    return new Response("Not Found", { status: 404 });
  },
});

console.log("Mock API running on http://localhost:4000");

6. load-test.ts — Drives 25,000 unique-URL requests in batches of 100

// Run with: bun run load-test.ts
// Generates 25,000 unique {brandSlug, slug} pairs and hits the Next.js server

const TOTAL = 25_000;
const BATCH_SIZE = 100;
const BASE_URL = "http://localhost:3000";

// Generate unique slugs
const products = Array.from({ length: TOTAL }, (_, i) => ({
  brandSlug: `brand-${Math.floor(i / 100)}`,
  slug: `product-${i}`,
}));

const startTime = Date.now();
let totalFetched = 0;
let errors = 0;

for (let i = 0; i < products.length; i += BATCH_SIZE) {
  const batch = products.slice(i, i + BATCH_SIZE);
  const batchNum = Math.floor(i / BATCH_SIZE) + 1;

  const results = await Promise.allSettled(
    batch.map((p) =>
      fetch(`${BASE_URL}/products/${p.brandSlug}/${p.slug}`)
        .then((r) => {
          if (!r.ok) throw new Error(`HTTP ${r.status}`);
          return r.text();
        })
    )
  );

  const fulfilled = results.filter((r) => r.status === "fulfilled").length;
  const rejected = results.filter((r) => r.status === "rejected").length;
  totalFetched += fulfilled;
  errors += rejected;

  const elapsed = ((Date.now() - startTime) / 1000).toFixed(1);
  console.log(
    `[batch ${batchNum}/${Math.ceil(TOTAL / BATCH_SIZE)}] ` +
    `OK: ${fulfilled}, FAIL: ${rejected}, ` +
    `Total: ${totalFetched}/${TOTAL}, Errors: ${errors}, ` +
    `Elapsed: ${elapsed}s`
  );

  // If an entire batch fails, the server is likely dead
  if (rejected === BATCH_SIZE) {
    console.error("Entire batch failed — server is likely unresponsive. Stopping.");
    break;
  }
}

console.log(`\nDone. ${totalFetched} OK, ${errors} errors out of ${TOTAL} total.`);

7. Build and run

# Terminal 1 — mock API
bun run mock-api.ts

# Terminal 2 — build and start Next.js with Bun runtime
bun --bun next build
bun --bun next start -H 127.0.0.1 -p 3000

# Terminal 3 — fire the load test
bun run load-test.ts

8. Monitor memory (optional, in a 4th terminal)

# Watch the Next.js server process RSS in real-time
while true; do
  PID=$(pgrep -f "next start" | head -1)
  [ -z "$PID" ] && echo "Process dead" && break
  RSS=$(ps -o rss= -p "$PID" | awk '{printf "%.0f", $1/1024}')
  echo "$(date +%H:%M:%S) PID=$PID RSS=${RSS}MB"
  sleep 2
done

What is the expected behavior?

RSS should stay bounded: the GC should reclaim the heap between batches, so memory usage of the bun process (and any inner Next.js worker processes) should plateau at a reasonable level rather than grow with every unique URL served.

What do you see instead?

  1. RSS jumps to ~377 MB and heap to ~204 MB after the first batch
  2. Bun.gc(true) frees 0 bytes — the heap stays frozen at exactly 786,130,199 bytes
  3. Repeated Bun.gc(true) calls continue to free 0 bytes each time
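The GC figures above were sampled in-process. A minimal, runtime-agnostic sketch of such a probe (not part of the repro scripts): Bun exposes Bun.gc(true) for a forced full collection, while under Node the closest equivalent is gc() with --expose-gc.

```typescript
// In-process memory probe: sample RSS and heap, force a full GC if the
// runtime exposes one, then sample again. (Illustrative sketch only;
// Bun.gc is Bun-specific, gc() requires node --expose-gc.)

function mb(bytes: number): number {
  return Math.round(bytes / 1024 / 1024);
}

function sampleMemory(label: string) {
  const { rss, heapUsed } = process.memoryUsage();
  console.log(`${label}: rss=${mb(rss)}MB heapUsed=${mb(heapUsed)}MB`);
  return { rss, heapUsed };
}

const before = sampleMemory("before gc");

const g = globalThis as any;
if (typeof g.Bun?.gc === "function") {
  g.Bun.gc(true); // Bun: synchronous full collection
} else if (typeof g.gc === "function") {
  g.gc(); // Node started with --expose-gc
}

const after = sampleMemory("after gc");
console.log(`heapUsed delta: ${mb(before.heapUsed - after.heapUsed)}MB`);
```

On a healthy runtime the delta is positive after a garbage-heavy workload; in the failure mode above it is 0 no matter how many times the collection is forced.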

Live process investigation (node next start — comparison baseline)

Captured while running the production app under Node.js via bun run --bun --max-old-space-size=2048 next start -H 127.0.0.1 -p 3000.

Important observation: bun run --bun spawns Node, not Bun

The --bun flag does not appear to take effect for Next.js. The process tree shows:

bun(1564414)───bun(1564415)───node(1564416)─┬─{node}(1564418)
                                             ├─{node}(1564430)
                                             ├─ ... (35 threads total)

Next.js's binary (node_modules/.bin/next) has #!/usr/bin/env node as its shebang, so even under bun run --bun, the actual server runs as a Node.js process. This means the SIGTRAP crash and GC failure reported above happen when Bun does successfully inject its runtime (on earlier versions / different configs), not in this particular Node fallback mode.
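To make the shebang mechanism concrete, here is a small hypothetical helper (for illustration only, not part of the repro) that extracts the interpreter a bin script requests:

```typescript
// Extract the interpreter requested by a script's shebang line.
// (Hypothetical helper; the real bin script is whatever ships in
// node_modules/.bin/next.)
function readShebang(source: string): string | null {
  if (!source.startsWith("#!")) return null;
  const end = source.indexOf("\n");
  return source.slice(2, end === -1 ? undefined : end).trim();
}

// A bin script starting with this line is handed to Node by the OS
// (and by any launcher that honors the shebang):
console.log(readShebang("#!/usr/bin/env node\nconsole.log('next cli')"));
// → /usr/bin/env node
```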

Node.js process snapshot (26 minutes uptime, after load test)

PID       %CPU  %MEM  VSZ (KB)      RSS (KB)     ELAPSED  THREADS
1564416   20.8  24.1  95,522,648    7,774,100    26:28    35

RSS: 7.6 GB (24.1% of 32 GB system RAM) — under Node.js, the process survives but still consumes enormous memory.

/proc/<pid>/status memory breakdown

VmPeak:   134,353,996 kB   (~128 GB peak virtual — likely mmap'd regions)
VmSize:    95,522,648 kB   (~91 GB current virtual)
VmRSS:     7,774,100 kB   (~7.4 GB resident)
RssAnon:   7,741,472 kB   (~7.4 GB anonymous — heap + stacks)
RssFile:      32,628 kB   (~32 MB file-backed pages)
VmData:   91,625,128 kB   (~87 GB data segment)
VmSwap:      71,228 kB   (~70 MB swapped out)
Threads:  35
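The breakdown above can be collected programmatically; a sketch of a parser for the Vm*/Rss* fields of a /proc/&lt;pid&gt;/status dump (illustrative helper, assuming the field names shown above):

```typescript
// Parse the Vm*/Rss* lines of a /proc/<pid>/status dump into kB values.
// Commas are tolerated so the formatted dump above parses too; the real
// /proc file uses plain digits.
function parseStatusKb(status: string): Record<string, number> {
  const out: Record<string, number> = {};
  for (const line of status.split("\n")) {
    const m = line.match(/^(Vm\w+|Rss\w+):\s+([\d,]+)\s+kB/);
    if (m) out[m[1]] = Number(m[2].replace(/,/g, ""));
  }
  return out;
}

const sample =
  "VmRSS:     7,774,100 kB\nRssAnon:   7,741,472 kB\nRssFile:      32,628 kB";
const kb = parseStatusKb(sample);
console.log(`anonymous share: ${((kb.RssAnon / kb.VmRSS) * 100).toFixed(1)}%`);
// → anonymous share: 99.6%
```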

/proc/<pid>/smaps_rollup (aggregated memory map)

Rss:             7,774,100 kB
Pss:             7,758,280 kB    (nearly all private — no sharing)
Pss_Dirty:       7,741,472 kB    (almost all dirty anonymous)
Pss_Anon:        7,741,472 kB
Pss_File:           16,808 kB
Private_Dirty:   7,741,472 kB
Shared_Clean:       27,564 kB
Swap:               71,228 kB

OOM score

oom_score:     765 / 1000   (very high — kernel will kill this process first under pressure)
oom_score_adj: 0            (no manual adjustment)

Key takeaway from process data

Nearly all resident memory is private, dirty, anonymous pages (Pss_Anon ≈ Private_Dirty ≈ 7.7 GB, with only ~32 MB file-backed): this is live heap the runtime never returns to the OS, not shared libraries or page cache.

Additional information

  • The issue does not occur with node next start — only with bun --bun next start
  • Note: bun run --bun next start may silently fall back to Node.js due to Next.js's #!/usr/bin/env node shebang — verify with pstree or /proc/<pid>/exe which runtime is actually running
  • Adding cacheMaxMemorySize: 25 * 1024 * 1024 to next.config.ts does not help
  • The issue is triggered specifically by many unique URLs — a smaller number of URLs with repeated cache hits does not exhibit this behavior
  • The production app uses bun run --bun --max-old-space-size=2048 next start which also crashes
  • The crash is deterministic and happens within the first 1000 requests
  • Under Node.js fallback, the process survives but still reaches 7.6 GB RSS (OOM score 765/1000) — the load pattern is inherently memory-heavy, but V8 copes while JSC does not
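Per the pstree / /proc/&lt;pid&gt;/exe note above, the effective runtime can also be verified from inside the server process itself; a sketch, assuming Linux (not part of the repro scripts):

```typescript
import { readlinkSync } from "node:fs";
import { basename } from "node:path";

// On Linux, /proc/self/exe resolves to the binary actually executing
// this script ("bun" vs "node"), regardless of which launcher spawned it.
// (Illustrative check only.)
function currentRuntime(): string {
  try {
    return basename(readlinkSync("/proc/self/exe"));
  } catch {
    // Non-Linux fallback: report what the runtime claims to be.
    return typeof (globalThis as any).Bun !== "undefined" ? "bun" : "node";
  }
}

console.log(`serving under: ${currentRuntime()}`);
```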
