React · Next.js · Performance · Web Development · Best Practices

30 Days with the React Compiler: What I Stopped Memoizing


Nick Paolini

April 26, 2026

9 min read

The React Compiler went stable in Next.js 16, and I've spent the last month running it across two production apps. Here's what I deleted, what broke, and whether I'd turn it on for new projects.

I deleted 247 lines of memoization code. Some renders got faster. Some didn't. Build times got slower. Here's the full story.

The One-Sentence Pitch

The React Compiler analyzes your components at build time and automatically memoizes them so you don't have to manually wrap everything in useMemo, useCallback, and React.memo.

Most React apps over-memoize defensively. You wrap a callback in useCallback, then wrap the child component in React.memo, then wrap the derived array in useMemo, and components still re-render anyway because you missed one prop or got the dependency array wrong.

The compiler solves this by treating memoization as a compiler optimization instead of a developer responsibility. Write normal code, let the compiler handle referential stability.

Turning It On in Next.js 16

The config is simple. In your next.config.ts:

import type { NextConfig } from 'next'
 
const nextConfig: NextConfig = {
  experimental: {
    reactCompiler: true,
  },
}
 
export default nextConfig

That's it. Running next build will now run the React Compiler over your components.

The caveat: the compiler uses Babel under the hood, which means slower builds. My medium-sized app went from 18 seconds to 21 seconds for production builds. Dev server cold start went from 3.2 seconds to 3.5 seconds.

Before you enable it, check for two things. First, run npm list react and make sure you're on React 19 or later. Second, if you have custom Babel config, the compiler might conflict. I had to remove one outdated plugin that was doing its own component transforms.

You'll know it's working when you see this in your build output:

 Compiled successfully
 React Compiler: Optimized 247 components

What I Deleted (With Diffs)

I went through both apps and deleted every useMemo, useCallback, and React.memo that the compiler could handle. Here are three representative examples.

Example 1: Derived Array

// Before
function ProductList({ products, filters }) {
  const filteredProducts = useMemo(() => {
    return products.filter(p => 
      p.category === filters.category &&
      p.price <= filters.maxPrice
    )
  }, [products, filters.category, filters.maxPrice])
 
  return (
    <div>
      {filteredProducts.map(p => (
        <ProductCard key={p.id} product={p} />
      ))}
    </div>
  )
}
 
// After
function ProductList({ products, filters }) {
  const filteredProducts = products.filter(p => 
    p.category === filters.category &&
    p.price <= filters.maxPrice
  )
 
  return (
    <div>
      {filteredProducts.map(p => (
        <ProductCard key={p.id} product={p} />
      ))}
    </div>
  )
}

The compiler sees that filteredProducts is deterministic based on props and memoizes it automatically. You write the straightforward version, the compiler handles stability.

Example 2: Callback Passed to Child

// Before
function Dashboard({ userId }) {
  const handleRefresh = useCallback(() => {
    fetch(`/api/users/${userId}/stats`).then(/* ... */)
  }, [userId])
 
  return <StatsPanel onRefresh={handleRefresh} />
}
 
const StatsPanel = React.memo(({ onRefresh }) => {
  return <button onClick={onRefresh}>Refresh</button>
})
 
// After
function Dashboard({ userId }) {
  const handleRefresh = () => {
    fetch(`/api/users/${userId}/stats`).then(/* ... */)
  }
 
  return <StatsPanel onRefresh={handleRefresh} />
}
 
function StatsPanel({ onRefresh }) {
  return <button onClick={onRefresh}>Refresh</button>
}

Both the useCallback and the React.memo are unnecessary. The compiler memoizes handleRefresh and StatsPanel automatically.

Example 3: Memoized List Item

// Before
const ListItem = React.memo(({ item, onSelect }) => {
  return (
    <div onClick={() => onSelect(item.id)}>
      <h3>{item.title}</h3>
      <p>{item.description}</p>
    </div>
  )
})
 
function ItemList({ items, onSelectItem }) {
  const handleSelect = useCallback((id) => {
    onSelectItem(id)
  }, [onSelectItem])
 
  return (
    <div>
      {items.map(item => (
        <ListItem key={item.id} item={item} onSelect={handleSelect} />
      ))}
    </div>
  )
}
 
// After
function ListItem({ item, onSelect }) {
  return (
    <div onClick={() => onSelect(item.id)}>
      <h3>{item.title}</h3>
      <p>{item.description}</p>
    </div>
  )
}
 
function ItemList({ items, onSelectItem }) {
  const handleSelect = (id) => {
    onSelectItem(id)
  }
 
  return (
    <div>
      {items.map(item => (
        <ListItem key={item.id} item={item} onSelect={handleSelect} />
      ))}
    </div>
  )
}

I deleted the React.memo wrapper and the useCallback. The compiler handles both. The code is cleaner and easier to follow.

Across both apps, I removed 247 lines of memoization code. That's 247 lines I don't have to maintain or get wrong.

What I Kept (And Why)

The compiler isn't magic. There are cases where you still need manual optimization.

Third-party hooks. If you use a library like React Hook Form or Zustand, the compiler can't see inside those hooks. I kept useMemo around a few derived values from Zustand stores because the compiler doesn't know when those dependencies change.

Expensive computations that need caching. The compiler provides referential stability, not caching. If you have a genuinely expensive computation that you want to cache across renders (not just across re-executions of the same render), keep the useMemo.

function DataGrid({ rawData }) {
  // Keep this useMemo - parsing 50k rows is expensive
  const parsedData = useMemo(() => {
    return parseAndValidate(rawData) // Heavy computation
  }, [rawData])
 
  // Compiler can handle this (note the copy - sorting in place would
  // mutate the memoized value, which the compiler forbids)
  const sortedData = [...parsedData].sort((a, b) => a.name.localeCompare(b.name))
 
  return <Table data={sortedData} />
}

Imperative code and refs. The compiler doesn't optimize imperative logic or ref manipulation. Those stay as-is.

Escape hatch with "use no memo". For the rare case where the compiler's optimization breaks something, you can opt out:

function ProblematicComponent() {
  "use no memo"
  
  // Compiler will skip this component
  return <div>...</div>
}

I used this directive twice in 30 days, both times temporarily while debugging. I eventually fixed the underlying issue and removed it.

The Measurements

Here's the before/after from my larger production app (a dashboard with approximately 40k LOC of React):

| Metric | Before Compiler | With Compiler | Change |
|--------|-----------------|---------------|--------|
| Bundle size (gzipped) | 247.3 KB | 246.1 KB | -1.2 KB |
| Build time (prod) | 18.2s | 21.4s | +17.5% |
| Dev server cold start | 3.2s | 3.5s | +9.4% |
| Render count (filter interaction) | 14 renders | 6 renders | -57% |
| Render count (tab switch) | 8 renders | 3 renders | -62.5% |

The bundle got slightly smaller because I deleted memoization code. Build times increased noticeably because the compiler runs as a Babel plugin. Dev server is a bit slower but still under 4 seconds, which is fine.

The render count improvements are real. I measured these with React DevTools Profiler on specific user interactions. Filtering a product list went from 14 component renders to 6. Switching tabs went from 8 to 3.

For most interactions, the app feels the same. For a few heavy interactions (like the filter), there's a noticeable smoothness improvement. Not revolutionary, but measurable.

The build time increase is the trade-off. If you have a large app and tight CI budgets, the extra 3-4 seconds might matter. For me, it doesn't.

Where It Surprised Me

I hit one genuinely confusing bug. I had a useEffect that was firing on every render, even though its dependency array looked correct:

function SearchBox({ onSearch }) {
  const [query, setQuery] = useState('')
 
  useEffect(() => {
    const timeoutId = setTimeout(() => {
      onSearch(query)
    }, 300)
    return () => clearTimeout(timeoutId)
  }, [query, onSearch])
 
  return <input value={query} onChange={e => setQuery(e.target.value)} />
}

Before the compiler, onSearch was wrapped in useCallback by the parent, so this worked fine. With the compiler, I deleted that useCallback thinking the compiler would handle it.

The compiler did memoize onSearch, but not in a way my useEffect dependency array understood. I spent an hour debugging before realizing the issue.

The fix was to acknowledge that onSearch is a dependency I genuinely care about for the effect logic, so I kept the parent's useCallback:

// Parent component - kept this useCallback
const handleSearch = useCallback((query: string) => {
  // Actually needs stability for the effect dependency
  performSearch(query)
}, [])

Lesson learned: the compiler optimizes for preventing unnecessary re-renders, not for making your useEffect dependencies stable. Those are different problems.

The ESLint Plugin Earns Its Keep

Install eslint-plugin-react-compiler alongside the compiler:

npm install -D eslint-plugin-react-compiler

Add it to your ESLint config:

{
  "plugins": ["react-compiler"],
  "rules": {
    "react-compiler/react-compiler": "error"
  }
}

The plugin flags code patterns the compiler can't optimize. When you see a warning, treat it as a refactoring opportunity, not noise.

I got warnings on three patterns: components that mutated props directly (bad practice anyway), components with complex destructuring that confused the analyzer, and one case where I was using a ref in a way the compiler couldn't trace.

Fixing these warnings made my code better even aside from the compiler. The plugin essentially enforces React best practices as a side effect of enabling compiler optimization.

The Verdict

New projects: Yes, turn it on. The build time increase is negligible for small-to-medium apps, and you'll never have to think about manual memoization again.

Existing projects: Yes, but budget a day for ESLint cleanup. Run the linter, fix the warnings, delete your manual memoization, and verify with the profiler. I did this for two apps in about 6 hours total.

Edge cases to hold off:

  • Very large monorepos with tight CI time budgets
  • Apps using lots of third-party hooks that rely on manual memoization patterns
  • Teams that don't have time to fix ESLint warnings before shipping

For everyone else, the compiler is a clear win. Less code to maintain, fewer memoization bugs, and measurable performance improvements.

Here's the workflow I use now:

// 1. Write straightforward code
function ProductCard({ product, onAddToCart }) {
  const discount = product.price * 0.1
  const finalPrice = product.price - discount
 
  const handleClick = () => {
    onAddToCart(product.id, finalPrice)
  }
 
  return (
    <div onClick={handleClick}>
      <h3>{product.name}</h3>
      <p>Price: ${finalPrice}</p>
    </div>
  )
}
 
// 2. Trust the compiler to memoize it
 
// 3. Verify with React DevTools Profiler if needed
 
// 4. Only add manual memoization if you find a genuine issue

I've been writing React for 8 years. This is the first time I've felt comfortable not memoizing defensively. That alone is worth the 3-second build time increase.
