API Gateway with Redis
The Coderz Stack uses Nginx as an API Gateway combined with Redis as a caching and rate-limiting backend. This sits in front of all backend APIs and handles:

- Routing — directs requests to the correct backend service
- Caching — stores responses in Redis to avoid redundant backend calls
- Rate Limiting — prevents abuse and protects backend services
- Request Logging — logs every request with full context for Kibana
- Load Balancing — distributes traffic across multiple API replicas
Why an API Gateway?
Without a gateway, each client talks directly to the backend APIs. This means:

- No caching — every request hits the database
- No rate limiting — a single client can overwhelm the API
- No unified logging — logs are scattered across services
- No central security enforcement
Redis: What It Does
Redis (Remote Dictionary Server) is an in-memory data store used for:

| Use Case | How |
|---|---|
| Response Cache | Store API responses with a TTL (e.g., 60s) |
| Rate Limiting | Count requests per IP, block when threshold exceeded |
| Session Storage | Store user sessions (optional) |
| Queue | Background job queuing (optional) |
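The rate-limiting row above relies on Redis atomic counters: INCR the per-client key, set an EXPIRE on first use, block once the count passes the threshold. A minimal in-process sketch of that fixed-window logic (plain Python standing in for Redis; class and parameter names are illustrative, not part of the stack):

```python
import time

class FixedWindowLimiter:
    """Emulates the Redis pattern: INCR a counter, EXPIRE on first hit, block over threshold."""

    def __init__(self, limit: int, window_seconds: int):
        self.limit = limit
        self.window = window_seconds
        self.counters = {}  # client key -> (window_start, count)

    def allow(self, client_ip: str) -> bool:
        now = time.monotonic()
        start, count = self.counters.get(client_ip, (now, 0))
        if now - start >= self.window:      # window elapsed -> reset (EXPIRE semantics)
            start, count = now, 0
        count += 1                          # INCR semantics
        self.counters[client_ip] = (start, count)
        return count <= self.limit

limiter = FixedWindowLimiter(limit=3, window_seconds=60)
results = [limiter.allow("203.0.113.7") for _ in range(5)]
print(results)  # first 3 requests allowed, then blocked
```

In Redis the INCR/EXPIRE pair is what makes this safe across multiple gateway workers, since the counter lives in one shared place.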
Cache Hit vs Miss
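On a hit, the gateway answers straight from Redis and the backend is never touched; on a miss, it forwards the request upstream, stores the response with a TTL, and returns it. A cache-aside sketch of that flow (a plain Python dict stands in for Redis, and `fetch_from_backend` is an assumed stand-in for the proxied upstream call):

```python
import time

CACHE = {}          # key -> (expires_at, body); stands in for Redis SETEX/GET
TTL_SECONDS = 60

def fetch_from_backend(path: str) -> str:
    # Stand-in for the real upstream call (the slow part the cache avoids).
    return f"response for {path}"

def handle(path: str) -> tuple[str, str]:
    """Returns (cache_status, body), analogous to Nginx's $upstream_cache_status."""
    now = time.monotonic()
    entry = CACHE.get(path)
    if entry and entry[0] > now:             # HIT: fresh entry, skip the backend
        return "HIT", entry[1]
    body = fetch_from_backend(path)          # MISS: go upstream
    CACHE[path] = (now + TTL_SECONDS, body)  # store with TTL (SETEX semantics)
    return "MISS", body

print(handle("/api/items"))  # first call is a MISS
print(handle("/api/items"))  # second call within the TTL is a HIT
```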
Nginx Gateway Configuration
Add to /opt/coderz/configs/nginx/nginx.conf:
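A sketch of the gateway's http block (upstream name, ports, cache path, and zone sizes are assumptions to adapt). Note that `proxy_cache` here is Nginx's built-in disk-backed cache; caching responses in Redis itself requires OpenResty or the srcache module, as in the Lua section below:

```nginx
# /opt/coderz/configs/nginx/nginx.conf (sketch)
http {
    # Load balancing across API replicas
    upstream backend_api {
        least_conn;
        server api-1:8000;
        server api-2:8000;
    }

    # 60s response cache (disk-backed proxy_cache)
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=api_cache:10m
                     max_size=256m inactive=10m;

    # 10 req/s per client IP, with a small burst allowance
    limit_req_zone $binary_remote_addr zone=per_ip:10m rate=10r/s;

    server {
        listen 80;

        location /api/ {
            limit_req zone=per_ip burst=20 nodelay;
            limit_req_status 429;

            proxy_cache api_cache;
            proxy_cache_valid 200 60s;
            proxy_cache_methods GET HEAD;            # never cache writes
            add_header X-Cache-Status $upstream_cache_status;

            proxy_pass http://backend_api;
        }
    }
}
```

The `X-Cache-Status` header exposes HIT/MISS per response, which is handy when verifying the cache from a client.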
Adding Redis to Docker Compose
Add to /opt/coderz/docker-compose.yml:
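A sketch of the Redis service (the image tag, memory cap, and volume name are assumptions; pin them to whatever the stack standardizes on):

```yaml
services:
  redis:
    image: redis:7-alpine
    command: >
      redis-server
      --maxmemory 256mb
      --maxmemory-policy allkeys-lru
    ports:
      - "6379:6379"        # expose only if host access is needed
    volumes:
      - redis-data:/data
    restart: unless-stopped

volumes:
  redis-data:
```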
maxmemory-policy allkeys-lru means that when Redis reaches its memory limit, it evicts the least recently used keys first — exactly the behavior you want from a cache.
Using Redis for Rate Limiting with Nginx + Lua
For advanced Redis-backed rate limiting (using OpenResty or the nginx-lua module):
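A sketch using lua-resty-redis inside an access_by_lua_block (the `redis` hostname, the 100-requests-per-60s threshold, the key prefix, and the `backend_api` upstream name are all assumptions):

```nginx
location /api/ {
    access_by_lua_block {
        local redis = require "resty.redis"
        local red = redis:new()
        red:set_timeout(100)  -- ms; keep Redis off the critical path

        local ok, err = red:connect("redis", 6379)
        if not ok then
            ngx.log(ngx.ERR, "redis connect failed: ", err)
            return  -- fail open rather than blocking all traffic
        end

        local key = "ratelimit:" .. ngx.var.remote_addr
        local count = red:incr(key)
        if count == 1 then
            red:expire(key, 60)  -- start a 60s window on the first request
        end
        red:set_keepalive(10000, 100)  -- return the connection to the pool

        if count > 100 then
            return ngx.exit(429)
        end
    }
    proxy_pass http://backend_api;
}
```

Because the counter lives in Redis rather than in each worker's memory, the limit holds across all Nginx workers and replicas, which the built-in limit_req zone cannot do on its own.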
Redis CLI — Checking Cache
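Useful commands for inspecting the cache from the redis-cli (key names such as cache:GET:/api/items are assumptions; match them to whatever key scheme the gateway writes):

```shell
# List cache keys without blocking the server (avoid KEYS in production)
redis-cli --scan --pattern 'cache:*'

# Inspect one entry and its remaining TTL in seconds
redis-cli GET 'cache:GET:/api/items'
redis-cli TTL 'cache:GET:/api/items'

# Hit/miss counters and memory usage
redis-cli INFO stats | grep -E 'keyspace_(hits|misses)'
redis-cli INFO memory | grep used_memory_human

# Clear everything in the current DB -- use with care
redis-cli FLUSHDB
```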
Nginx Log Format for Kibana
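A sketch of a JSON log_format for this purpose (requires the escape=json parameter, available since Nginx 1.11.8; the format name, field names, and log path are assumptions to match your Kibana index):

```nginx
log_format kibana_json escape=json
  '{'
    '"time":"$time_iso8601",'
    '"remote_addr":"$remote_addr",'
    '"method":"$request_method",'
    '"uri":"$request_uri",'
    '"status":$status,'
    '"request_time":$request_time,'
    '"upstream_time":"$upstream_response_time",'
    '"cache_status":"$upstream_cache_status",'
    '"user_agent":"$http_user_agent"'
  '}';

access_log /var/log/nginx/api_access.json kibana_json;
```

Emitting JSON directly means the log shipper can index each record without grok parsing.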
Use a custom Nginx log format that captures all the fields needed for Kibana API monitoring: timestamp, client IP, method, URI, status, request time, upstream response time, and cache status.

Rate Limit Response
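A sketch of the response a throttled client sees (the 429 status line and Retry-After header are standard; the Retry-After value and the JSON body shape are illustrative assumptions):

```http
HTTP/1.1 429 Too Many Requests
Content-Type: application/json
Retry-After: 42

{"error": "rate_limit_exceeded", "message": "Too many requests, slow down."}
```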
When a client exceeds the rate limit, the gateway returns HTTP 429 Too Many Requests; the client should back off and retry after the window resets.

Performance Impact
With Redis caching enabled:

| Scenario | Without Cache | With Cache (Hit) |
|---|---|---|
| GET /api/items | ~150ms (DB query) | ~1ms |
| GET /api/products | ~200ms (DB query) | ~1ms |
| POST /api/orders | ~250ms (write) | Not cached (write ops) |