What you'll build
A Redis cache layer that cuts your API response times by serving frequently queried data from memory. You'll implement a generic cache wrapper that works with any async function, automatic time-based expiration (TTL), manual invalidation when data changes, and a pattern for caching API responses. When you're done, responses that took seconds will return in milliseconds, reducing load on your database and improving the user experience.
Step 1: Ask an AI for the implementation
I need to implement a cache with Redis:
- Fetch wrapper with cache
- Time-based invalidation (TTL)
- Manual invalidation
- Pattern for API routes
- TypeScript + ioredis
Give me the complete code.
Code
import Redis from 'ioredis'

// Assumes REDIS_URL is set, e.g. redis://localhost:6379
const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379')

// Generic cache wrapper: return the cached value if the key exists,
// otherwise run the fetcher, store the result with a TTL, and return it.
export async function cached<T>(
  key: string,
  fetcher: () => Promise<T>,
  ttl = 3600 // seconds
): Promise<T> {
  const hit = await redis.get(key)
  if (hit) return JSON.parse(hit) as T

  const data = await fetcher()
  await redis.setex(key, ttl, JSON.stringify(data))
  return data
}
// Usage: the first call hits the database; later calls within the TTL
// are served from Redis.
const users = await cached(
  'users:all',
  () => prisma.user.findMany(),
  300 // 5 minutes
)
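The same wrapper gives you the API-route pattern from the prompt above. Here's a minimal sketch assuming an Express handler (Express and the /api/users path are assumptions on my part, since the tutorial doesn't name a framework; adapt it to yours):

import express from 'express'

const app = express()

// Route handler sketch (assumes Express; cached() and prisma come from your
// project as shown above). Repeat requests within the TTL are served from
// Redis and never touch the database.
app.get('/api/users', async (_req, res) => {
  const users = await cached(
    'users:all',
    () => prisma.user.findMany(),
    300
  )
  res.json(users)
})

app.listen(3000)

Keying the cache per route (and per query string, if the route takes parameters) keeps invalidation simple: deleting users:* clears every variant at once.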
Invalidate
// Invalidate a single key
await redis.del('users:all')

// Invalidate by pattern
// Note: KEYS scans the entire keyspace and blocks Redis while it runs,
// so it's fine for small datasets but avoid it in production.
const keys = await redis.keys('users:*')
if (keys.length) await redis.del(...keys)
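Because KEYS blocks Redis while it scans, a production setup usually walks the keyspace incrementally with SCAN instead. A minimal sketch, reusing the redis client from Step 1 (invalidateByPattern is a name introduced here, not something the tutorial defines):

// Non-blocking pattern invalidation: SCAN walks the keyspace in small batches
// instead of locking Redis the way KEYS does. Call it right after a mutation,
// e.g. await invalidateByPattern('users:*') after creating or updating a user.
export async function invalidateByPattern(pattern: string): Promise<void> {
  let cursor = '0'
  do {
    const [next, keys] = await redis.scan(cursor, 'MATCH', pattern, 'COUNT', 100)
    if (keys.length) await redis.del(...keys)
    cursor = next
  } while (cursor !== '0')
}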
Next step
→ Arduino + MQTT