Last month, a client in Abu Dhabi asked me to speed up their Next.js API for a luxury vehicle booking platform built with Laravel + Next.js. Testing revealed their user profile fetch endpoint consistently took ~580ms. Not a disaster, but slow enough to be a bottleneck. One specific call hit the database 18 times per request to fetch nested relations. After implementing Redis caching, we dropped that to 260ms average. No framework magic — just smart use of in-memory storage.
Why (and When) Redis Is Worth the Effort
Let’s cut the theoretical BS — Redis isn’t always the answer. For my current project, it made sense because:
- The data was static or rarely changing (e.g., branch locations, pricing tiers)
- The original endpoint had multiple DB queries per request
- The client's traffic spikes happened predictably (Eid holidays, Dubai events)
I went with Redis 7.0 for its better stream handling and ioredis as the Node.js client. Yes, I spent 90 minutes fighting to bind it to a custom Redis port on Vercel’s Edge Runtime — turns out Vercel uses 127.0.0.1 for localhost but the default Redis port gets blocked unless you add it to their allowlist. You’ll need to email their support to request it.
Here’s the actual setup:
Server-side Caching with Redis and Next.js API Routes
We cached responses where query parameters determined output — think /api/branches?location=dubai. First request hits the DB, subsequent calls return cached JSON.
```javascript
// pages/api/branches.js
import { Redis } from 'ioredis';

// Create the client once at module scope so requests reuse the connection
// instead of opening a new one on every call.
const redis = new Redis(process.env.REDIS_URL);

export default async function handler(req, res) {
  const cacheKey = `branches-${req.query.location || 'all'}`;

  try {
    const cached = await redis.get(cacheKey);
    if (cached) {
      res.setHeader('X-Cache', 'HIT');
      return res.status(200).json(JSON.parse(cached));
    }

    // Real DB fetch
    const results = await fetchFromDatabase(); // Hypothetical
    await redis.setex(cacheKey, 3600, JSON.stringify(results)); // 1-hour expiry

    res.setHeader('X-Cache', 'MISS');
    res.status(200).json(results);
  } catch (error) {
    console.error(error);
    res.status(500).json({ error: 'Server error' });
  }
}
```

This cut the average response time by 320ms on their branch listing page during load testing. The key trick: cache by URL + query params, and set short expiries (30 minutes to 2 hours) for dynamic data.
The Gotchas No One Warns You About
Caching isn’t just throw-data-into-Redis-and-call-it-a-day. Here’s what actually tripped me up:
- JSON serialization issues
When caching nested objects from ORM relations (like Laravel Eloquent models), I had to explicitly convert them to plain objects before JSON.stringify. Otherwise, I got circular reference errors.
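The circular reference problem looks like this in miniature. The shapes below (`toPlainBranch`, the `branch`/`car` objects) are illustrative stand-ins for the ORM models, not the actual client data:

```javascript
// Sketch: ORM models often carry back-references (car.branch → branch),
// which makes JSON.stringify throw "Converting circular structure to JSON".
// Copying only the fields you need breaks the cycle.
function toPlainBranch(model) {
  return {
    id: model.id,
    name: model.name,
    // Map the relation to plain objects too, dropping back-references
    cars: (model.cars || []).map((c) => ({ id: c.id, plate: c.plate })),
  };
}

// Demo with a hand-built circular structure:
const branch = { id: 1, name: 'Marina', cars: [] };
const car = { id: 7, plate: 'D-12345', branch }; // back-reference → cycle
branch.cars.push(car);

// JSON.stringify(branch) would throw; the plain copy serializes safely:
const json = JSON.stringify(toPlainBranch(branch));
```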
- Cache invalidation is a pain point
A support ticket came in at 3 a.m. about outdated pricing. Lesson learned: we now trigger cache invalidation via webhooks whenever admins update critical data.
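The webhook-driven invalidation can be sketched like this. `invalidateBranchCache` and the key names are hypothetical; in production this logic sits inside an API route that the admin panel POSTs to after a save, and `redis` is an ioredis client (it's injected here so the logic is easy to test):

```javascript
// Hypothetical invalidation step: instead of waiting for the TTL to expire,
// drop the affected keys the moment an admin updates critical data.
// `redis` only needs a del() method, so any client (or test stub) works.
async function invalidateBranchCache(redis, location) {
  const keys = [`branches-${location}`, 'branches-all'];
  await Promise.all(keys.map((k) => redis.del(k)));
  return keys; // handy for logging which entries were purged
}
```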
- Local vs. production consistency
During testing, cacheKey generation had a toLowerCase() call on one environment but not the other. Suddenly, Dubai and DUBAI were different keys. Fixed once I standardized query param sanitization.
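The fix was to route every param through one shared normalizer before it touches a cache key. A minimal sketch (`sanitizeParam` is a hypothetical name):

```javascript
// Sketch: normalize query params in one place so every environment
// produces identical cache keys (" DUBAI " and "dubai" → "dubai").
function sanitizeParam(value, fallback = 'all') {
  if (typeof value !== 'string' || value.trim() === '') return fallback;
  return value.trim().toLowerCase();
}
```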
One specific client — a Dubai-based limo service (similar to Tawasul Limo) — required Arabic/English versions of location data. I added language code to the cache key:
```javascript
const cacheKey = `branches-${req.query.location || 'all'}-${req.locale}`;
```

When to Use (and When Not to Use) This Approach
This worked because the client’s backend hit predictable, parameter-driven routes with high traffic. For other scenarios:
- Avoid Redis if your data needs to be 100% real-time. Think live chat or trading platforms — Redis adds complexity where WebSockets or server-sent events work better.
- Use Redis when you have:
- Repeated queries with same parameters across users
- Data that doesn’t change mid-session
- Heavy calculations (like pricing algorithms) that can be cached post-calculation
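The "cache post-calculation" pattern from the last bullet can be sketched as a compute-once wrapper. `getPrice` and the key scheme are hypothetical; `cache` only needs `get`/`setex`, so it's an ioredis client in production and a stub in tests:

```javascript
// Sketch: run the expensive pricing calculation once, store the result,
// and serve the cached copy until it expires.
async function getPrice(cache, carClass, days, computePrice) {
  const key = `price:${carClass}:${days}`;
  const hit = await cache.get(key);
  if (hit !== null && hit !== undefined) return JSON.parse(hit);

  const price = computePrice(carClass, days); // the heavy part
  await cache.setex(key, 1800, JSON.stringify(price)); // 30-min expiry
  return price;
}
```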
One exception? I tried this on a real estate app (inspired by Reach Home Properties) that listed apartment units. Problem: filters changed constantly as new users saved favorites. Redis caching caused stale data complaints. We switched to in-client pagination with Apollo Cache instead.
Frequently Asked Questions
How do I handle Redis authentication in a Next.js API route?
Use environment variables for REDIS_URL with the full credentials, like redis://default:password123@redis-host:6379. Store credentials in .env.local and never commit them. For Vercel, you’ll need them in the production config too.
Can I use Redis caching with statically generated (SSG) Next.js pages?
Partially. SSG pages generate at build time, but you can still use Redis to cache the data used during generation. For example:
```javascript
import { Redis } from 'ioredis';

export async function getStaticProps() {
  const redis = new Redis(process.env.REDIS_URL);
  const cached = await redis.get('homepage_data');
  // Cached values are stored as JSON strings, so parse before passing to props
  const data = cached ? JSON.parse(cached) : await fetchFreshData();
  return { props: { data } };
}
```

What happens if Redis goes down?
Your app should fall back to direct DB queries. Wrap Redis calls in try/catch blocks to prevent cascading failures. I’ve had this happen on a UAE logistics client — their inventory lookups slowed down, but didn’t crash.
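That fallback pattern can be wrapped in one helper so every endpoint treats Redis as best-effort. `getWithFallback` is a hypothetical name, not from the client's code:

```javascript
// Sketch: if Redis is unreachable, log it and fall through to the database
// instead of failing the whole request.
async function getWithFallback(redis, key, fetchFromDb) {
  try {
    const cached = await redis.get(key);
    if (cached) return JSON.parse(cached);
  } catch (err) {
    console.error('Redis unavailable, falling back to DB:', err.message);
  }
  return fetchFromDb(); // cache miss or Redis outage: hit the DB directly
}
```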
How do I test Redis performance improvements locally?
Use wrk or artillery.io to simulate concurrent users hitting your endpoint. For local Redis testing, spin up an instance in Docker:
```shell
docker run --name myredis -d -p 6379:6379 redis:7.0
```

Want to Speed Up Your App? Let's Build Something Together
The Tawasul Limo team initially resisted caching because developers had told them it's easy to mess up cache invalidation. But after I showed how we could cut their server queries by 38%, they came around.
If your app’s performance is holding you back — whether you’re building a UAE e-commerce store or an Arabic/English B2B tool — I’ve probably hit that same wall before. Let’s book a free consultation to fix it together.