At 2 a.m. one night last year, I was staring at a Chrome DevTools flame chart for a client’s Next.js app that took 12 seconds to load on first visit. The bundle size was 1.8MB of JavaScript — way too big. That’s when I went back to basics and rebuilt their code splitting strategy from scratch. The result? Initial load dropped to 1.2 seconds. No magic tricks here — just disciplined use of Next.js’s built-in tools and a few hard-learned lessons from shipping 40+ apps since 2016.
When Dynamic Imports Aren’t Enough
Next.js’s dynamic() is the obvious starting point, but it’s easy to misuse. I learned this the hard way on a UAE e-commerce app where over-eager dynamic imports broke the browser cache completely. The fix:
- Use `ssr: false` only when you actually need to exclude server-side rendering
- For components used in multiple places (like a shared `ProductCard`), import them dynamically once in a parent layout file
- Preload important components by using `useEffect` to fetch them in the background during user idle time
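The idle-time preload from the last bullet can be sketched as a small helper. This is a minimal sketch, assuming a `requestIdleCallback`-with-timeout-fallback approach; the helper name and the 2s fallback delay are my own, not from any library:

```typescript
// Sketch: warm up a dynamic chunk while the user is idle.
// `loader` stands in for something like `() => import('../components/ProductCard')`.
type ChunkLoader = () => Promise<unknown>

export function preloadWhenIdle(loader: ChunkLoader, fallbackDelayMs = 2000): void {
  const g = globalThis as { requestIdleCallback?: (cb: () => void) => void }
  if (typeof g.requestIdleCallback === 'function') {
    // let the browser pick a quiet moment
    g.requestIdleCallback(() => { void loader() })
  } else {
    // older browsers (and the server) fall back to a plain timeout
    setTimeout(() => { void loader() }, fallbackDelayMs)
  }
}
```

Call it from a `useEffect` in a layout component; the browser caches the module, so the later `dynamic()` render resolves instantly.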
The real win comes from combining this with Webpack's magic comments. I always add `webpackChunkName` to dynamic imports so I can track each chunk by name in DevTools. Example:
```tsx
const ProductDetail = dynamic(
  () =>
    import(/* webpackChunkName: "product-detail" */ '../components/ProductDetail').then(
      (mod) => mod.ProductDetail
    ),
  {
    ssr: false,
    loading: () => <Loader />
  }
)
```

Route-Based Splitting for Multi-Language Sites
One of my recent projects (a bilingual Dubai construction company site) needed Arabic translations that added 300KB to the bundle. The key was splitting language resources via route-based lazy loading:
- Keep language JSON files under `/public/locales/[lang]/[page].json`
- Create an `/ar/[...slug].js` file that loads the Arabic translations before rendering the page
- Use `next.config.js` rewrites to avoid repeating logic:
```js
// next.config.js
module.exports = {
  i18n: {
    locales: ['en', 'ar'],
    defaultLocale: 'en'
  },
  rewrites: async () => {
    return [
      {
        source: '/ar/:path*',
        destination: '/api/translate?lang=ar&path=:path*'
      }
    ]
  }
}
```

This reduced the main bundle size by 22% for English users. For more language-specific SEO tips, I wrote How I Build Accessible Arabic Websites.
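The "load the Arabic translations before rendering" step above can be sketched as a small fetch helper. The path shape follows the `/public/locales/[lang]/[page].json` convention from the list; the function name and the injectable `fetcher` parameter (for testability) are my own assumptions:

```typescript
// Sketch: fetch one page's strings for the active locale.
// Files under /public are served from the site root, so
// /public/locales/ar/home.json is reachable at /locales/ar/home.json.
type Fetcher = (url: string) => Promise<{ ok: boolean; json: () => Promise<Record<string, string>> }>

export async function loadTranslations(
  lang: string,
  page: string,
  fetcher: Fetcher = fetch
): Promise<Record<string, string>> {
  const res = await fetcher(`/locales/${lang}/${page}.json`)
  if (!res.ok) throw new Error(`Missing translations for ${lang}/${page}`)
  return res.json()
}
```

Because the JSON lives outside the JavaScript bundle, English visitors never download a byte of Arabic strings.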
The Catch with Third-Party Libraries
Lazy loading external components like react-pdf or Stripe Elements often causes more problems than it solves. Once, I spent 6 hours debugging a Next.js 14 app where a dynamically imported PDF viewer library caused hydration mismatches. The solution was simpler than I expected: wrap the dynamic library component in a client-side-only context provider. Here’s the pattern:
```tsx
// pdf-viewer-wrapper.tsx
import { useEffect, useState } from 'react'
// placeholder: import the actual viewer component from your PDF library
import { LibraryPDFViewer } from 'your-pdf-library'

export default function PDFViewerWrapper({ fileUrl }: { fileUrl: string }) {
  const [showViewer, setShowViewer] = useState(false)
  useEffect(() => {
    // flips only after mount, so the server renders nothing for this subtree
    setShowViewer(true)
  }, [])
  if (!showViewer) return null
  return <LibraryPDFViewer fileUrl={fileUrl} />
}
```

```tsx
// page.tsx
import dynamic from 'next/dynamic'

const PDFViewer = dynamic(() => import('../components/PDFViewerWrapper'))
```

This pattern avoids server-rendering issues while still lazy-loading the heaviest part. For similar library patterns, check my Docker for Next.js and Laravel Dev Setup — Docker debugging saved my sanity during those 6 hours.
Handling Dynamic Routes in Large-Scale Apps
On a recent UAE property platform with 15K+ listings (Reach Home Properties), we implemented dynamic code splitting for route parameters. The key was leveraging [id].tsx files with dynamic imports based on URL patterns:
```tsx
// pages/properties/[id].tsx
import dynamic from 'next/dynamic'
import { useRouter } from 'next/router'

// defined once at module scope, so React doesn't remount the tree on every render
const CommercialDetails = dynamic(() => import('../../components/CommercialPropertyDetails'))
const StandardDetails = dynamic(() => import('../../components/StandardPropertyDetails'))

export default function PropertyPage() {
  const router = useRouter()
  const isCommercialPage = router.query.id?.includes('commercial-')
  const PageContent = isCommercialPage ? CommercialDetails : StandardDetails
  return <PageContent />
}
```

This reduced cold start bundle size by splitting commercial property templates from residential ones. Important detail: we used Redis caching to handle the increased render requests — each dynamic route needed ~100ms extra server-side processing until the Vercel cache warmed up.
Caching Strategy That Actually Works
I’ve wasted too much time debugging stale dynamic imports. Now I always:
- Add content hashes to bundle URLs using the Webpack `output` config:
```js
// next.config.js
module.exports = {
  webpack: (config) => {
    config.output.filename = 'static/chunks/[name].[contenthash].js'
    return config
  }
}
```

- Set aggressive Cloudflare caching for bundles (`CF-Cache-Status: HIT` should be your friend)
- Use a `_cacheBuster` query param only when critical fixes are deployed:
```ts
// dynamic-import-wrapper.ts
const loadComponent = (componentPath: string) => {
  const cacheBuster = process.env.NEXT_PUBLIC_CACHE_BUSTER || ''
  return import(`../components/${componentPath}?cb=${cacheBuster}`)
}
```

A logistics client's admin dashboard in Dubai had major cache invalidation issues before adding this setup — now their monthly deployments clear caches automatically without touching Redis manually.
SEO Considerations (Because Google Isn’t Your Only User)
One thing Googlebot still hates: invisible content loaded way too late. On a UAE restaurant booking app (built with Tawasul Limo’s stack), I fixed bad Lighthouse SEO scores by combining dynamic imports with skeleton states that actually rendered the semantic content structure.
Yes, that meant duplicating a bit of CSS and DOM structure in the loading component. Yes, it felt hacky. But Googlebot indexes the essential metadata correctly now, and actual users get fast interaction times.
Frequently Asked Questions
### Does dynamic import work with Server Components in Next.js 14?
Yes, but with caveats. You can't use `dynamic()` with `ssr: false` inside a Server Component — Next.js will throw. The workaround I use: wrap the dynamic import in a Suspense boundary and keep the `ssr: false` lazy-loaded `use client` parts inside client-only subtrees.
### How does code splitting affect FCP and LCP metrics?
Significantly — I've seen initial load FCP drop from 8.5s to 1.7s by splitting analytics scripts and non-critical UI. LCP gets trickier though: if your largest contentful paint element relies on a dynamically imported component, it might delay LCP until after the component loads.
### Can I lazy load third-party libraries like React PDF?
Technically yes, but be careful with dependencies. The PDF viewer I mentioned earlier had transitive dependencies that caused duplicate React instances when loaded dynamically. The fix? Use Webpack's `splitChunks` config to ensure a single React version:
```js
// next.config.js
module.exports = {
  webpack: (config) => {
    // merge into the existing optimization settings rather than replacing
    // the whole object, or Next.js's own defaults get wiped out
    config.optimization.splitChunks = {
      cacheGroups: {
        defaultVendors: {
          test: /[\\/]node_modules[\\/]/,
          name: 'vendors',
          chunks: 'all',
          enforce: true
        }
      }
    }
    return config
  }
}
```

### Should I use dynamic() for API calls?
No. Lazy loading code for API routes won't improve performance. API code should stay in /pages/api/ and use Serverless Function optimizations. The dynamic imports in Next.js exist for UI code specifically, not backend logic.
If you’re dealing with slow Next.js applications that feel stuck in 2020, let’s talk. I’ve spent 7 years optimizing web apps for UAE businesses from Abu Dhabi to Doha, and I know exactly how to balance performance with real-world deadlines. Book a free consultation or get in touch directly if you're ready to ship faster apps.