
API Rate Limiting with Redis: Sliding Window Implementation

Master Redis sliding window rate limiting for robust API security. Learn implementation patterns, real-world examples, and best practices for scalable systems.

📖 13 min read 📅 April 1, 2026 ✍ By PropTechUSA AI

In today's interconnected PropTech ecosystem, APIs serve as the backbone connecting property management systems, listing platforms, and third-party integrations. However, without proper rate limiting, these critical endpoints become vulnerable to abuse, leading to service degradation and potential security breaches. Redis-based sliding window rate limiting offers a sophisticated solution that balances security with user experience, providing granular control over API access patterns.

Understanding API Rate Limiting in Modern Architecture

The Critical Role of Rate Limiting

API rate limiting serves multiple purposes beyond simple traffic control. It protects against malicious attacks, ensures fair resource allocation among users, and maintains service quality under varying load conditions. For PropTech applications handling sensitive property data and financial transactions, robust rate limiting becomes essential for maintaining trust and compliance.

Traditional rate limiting approaches often fall short when dealing with complex usage patterns. Simple token bucket algorithms may allow burst traffic that overwhelms downstream services, while fixed window counters create artificial boundaries that users can exploit.

Why Redis for Rate Limiting

Redis excels as a rate limiting backend due to its atomic operations, built-in expiration mechanisms, and exceptional performance characteristics. Its single-threaded execution model eliminates race conditions, while features like pipelining and Lua scripting enable sophisticated rate limiting logic with minimal latency overhead.

The distributed nature of modern applications makes Redis particularly valuable. Multiple application instances can share rate limiting state seamlessly, ensuring consistent enforcement across horizontally scaled deployments.

Sliding Window Advantages

Sliding window algorithms provide more granular control compared to fixed window approaches. Instead of resetting counters at arbitrary time boundaries, sliding windows maintain a continuous view of request patterns, preventing users from exploiting window boundaries to exceed intended limits.

This approach proves especially valuable for PropTech APIs where usage patterns vary significantly between different user types and time periods.

Core Concepts of Sliding Window Rate Limiting

Algorithm Mechanics

The sliding window algorithm maintains a time-ordered log of requests within a specified time frame. For each incoming request, the system removes expired entries from the window, counts remaining requests, and determines whether to allow or reject the current request.

The "sliding" nature ensures that the time window moves continuously rather than jumping at fixed intervals. This provides smoother rate limiting behavior and more accurate enforcement of intended limits.
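The mechanics can be sketched as a minimal in-memory limiter before any Redis is involved (class and parameter names here are illustrative; a single-process version like this is exactly what the Redis-backed implementation later in the article generalizes across instances):

```typescript
// Minimal in-memory sliding window log, for illustration only.
class InMemorySlidingWindow {
  private timestamps: number[] = [];

  constructor(private windowSizeMs: number, private maxRequests: number) {}

  allow(now: number): boolean {
    const windowStart = now - this.windowSizeMs;
    // Drop entries that have slid out of the window
    this.timestamps = this.timestamps.filter((t) => t > windowStart);
    if (this.timestamps.length >= this.maxRequests) return false;
    this.timestamps.push(now);
    return true;
  }
}

// 3 requests allowed per 1000 ms
const demo = new InMemorySlidingWindow(1000, 3);
console.log(demo.allow(0));    // true
console.log(demo.allow(100));  // true
console.log(demo.allow(200));  // true
console.log(demo.allow(300));  // false - window still holds 3 requests
console.log(demo.allow(1150)); // true - the t=0 entry has slid out
```

The filter-then-count step is the whole algorithm; the Redis sorted-set commands below perform the same two operations atomically in shared state.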

Redis Data Structures for Implementation

Redis sorted sets provide an ideal data structure for sliding window implementation. Each request timestamp serves as the score, while unique identifiers act as members. This structure enables efficient range operations for cleaning expired entries and counting recent requests.

```
# Basic sorted set structure for rate limiting
#   Key:    rate_limit:<user_id>
#   Score:  request timestamp (milliseconds)
#   Member: unique request ID
ZADD rate_limit:user123 1634567890123 request_abc123
```

Alternatively, Redis lists combined with expiration can provide simpler implementations for less demanding scenarios, though they lack the precision and efficiency of sorted sets.

Time Window Considerations

Choosing appropriate time windows requires balancing user experience with system protection. Shorter windows provide more responsive rate limiting but require more frequent cleanup operations. Longer windows smooth out burst behavior but may delay reaction to abuse patterns.

For PropTech applications, consider different window sizes for different operation types. Property searches might use 60-second windows, while API key authentication could employ 24-hour windows for security purposes.
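These suggestions can be captured in a small policy table (a sketch; the operation names and the fallback numbers are assumptions to adapt to your own traffic):

```typescript
interface RateLimitPolicy {
  windowMs: number;
  maxRequests: number;
}

// Per-operation policies mirroring the windows suggested above
const operationPolicies: Record<string, RateLimitPolicy> = {
  'property.search': { windowMs: 60_000, maxRequests: 100 },      // 60-second window
  'auth.api_key':    { windowMs: 86_400_000, maxRequests: 1_000 } // 24-hour window
};

function policyFor(operation: string): RateLimitPolicy {
  // Fall back to a conservative default for unlisted operations
  return operationPolicies[operation] ?? { windowMs: 60_000, maxRequests: 30 };
}
```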

Implementation Patterns and Code Examples

Basic Sliding Window Implementation

Here's a comprehensive sliding window rate limiter implementation using Redis and Node.js:

```typescript
import Redis from 'ioredis';
import { v4 as uuidv4 } from 'uuid';

class SlidingWindowRateLimiter {
  private redis: Redis;
  private windowSizeMs: number;
  private maxRequests: number;

  constructor(redis: Redis, windowSizeMs: number, maxRequests: number) {
    this.redis = redis;
    this.windowSizeMs = windowSizeMs;
    this.maxRequests = maxRequests;
  }

  async checkLimit(identifier: string): Promise<RateLimitResult> {
    const now = Date.now();
    const windowStart = now - this.windowSizeMs;
    const key = `rate_limit:${identifier}`;
    const requestId = uuidv4();

    const pipeline = this.redis.pipeline();

    // Remove expired entries
    pipeline.zremrangebyscore(key, 0, windowStart);
    // Count requests already in the window
    pipeline.zcard(key);
    // Add the current request
    pipeline.zadd(key, now, requestId);
    // Expire the key so idle identifiers are cleaned up automatically
    pipeline.expire(key, Math.ceil(this.windowSizeMs / 1000));

    const results = await pipeline.exec();
    const currentCount = results?.[1]?.[1] as number;
    const allowed = currentCount < this.maxRequests;

    if (!allowed) {
      // Roll back the entry we optimistically added, since it was rejected
      await this.redis.zrem(key, requestId);
    }

    return {
      allowed,
      count: currentCount + (allowed ? 1 : 0),
      remaining: Math.max(0, this.maxRequests - currentCount - (allowed ? 1 : 0)),
      resetTime: now + this.windowSizeMs
    };
  }
}

interface RateLimitResult {
  allowed: boolean;
  count: number;
  remaining: number;
  resetTime: number;
}
```

Lua Script Optimization

For high-throughput scenarios, Lua scripts ensure atomicity while reducing network round trips:

```lua
-- sliding_window_rate_limit.lua
-- KEYS[1]: rate limit key
-- ARGV: window size (ms), max requests, current time (ms), unique request id
local key = KEYS[1]
local window_size = tonumber(ARGV[1])
local max_requests = tonumber(ARGV[2])
local now = tonumber(ARGV[3])
local request_id = ARGV[4]

local window_start = now - window_size

-- Remove expired entries
redis.call('ZREMRANGEBYSCORE', key, 0, window_start)

-- Count current requests
local current_count = redis.call('ZCARD', key)

-- Check if the request should be allowed
if current_count < max_requests then
  -- Add the request and refresh the key's expiration (EXPIRE takes seconds)
  redis.call('ZADD', key, now, request_id)
  redis.call('EXPIRE', key, math.ceil(window_size / 1000))
  return {1, current_count + 1, max_requests - current_count - 1}
else
  return {0, current_count, 0}
end
```
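With ioredis, this script is typically registered once via `defineCommand` and invoked with the key plus the four arguments; its `[allowed, count, remaining]` reply then needs translating back into the `RateLimitResult` shape used earlier. A small illustrative helper for that translation (the function name is an assumption, not an ioredis API):

```typescript
interface RateLimitResult {
  allowed: boolean;
  count: number;
  remaining: number;
  resetTime: number;
}

// The Lua script replies with [allowedFlag, count, remaining].
function fromScriptReply(
  reply: [number, number, number],
  now: number,
  windowSizeMs: number
): RateLimitResult {
  const [allowedFlag, count, remaining] = reply;
  return {
    allowed: allowedFlag === 1,
    count,
    remaining,
    resetTime: now + windowSizeMs
  };
}
```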

Integration with Express.js

Here's how to integrate the rate limiter into an Express.js application:

```typescript
import express from 'express';
import Redis from 'ioredis';

const app = express();
const redis = new Redis(process.env.REDIS_URL);

// Factory producing per-route rate limiting middleware
const createRateLimiter = (windowMs: number, maxRequests: number) => {
  const limiter = new SlidingWindowRateLimiter(redis, windowMs, maxRequests);

  return async (req: express.Request, res: express.Response, next: express.NextFunction) => {
    const identifier = req.ip || 'unknown';

    try {
      const result = await limiter.checkLimit(identifier);

      // Set rate limit headers
      res.set({
        'X-RateLimit-Limit': maxRequests.toString(),
        'X-RateLimit-Remaining': result.remaining.toString(),
        'X-RateLimit-Reset': result.resetTime.toString()
      });

      if (result.allowed) {
        next();
      } else {
        res.status(429).json({
          error: 'Rate limit exceeded',
          retryAfter: Math.ceil((result.resetTime - Date.now()) / 1000)
        });
      }
    } catch (error) {
      console.error('Rate limiting error:', error);
      // Fail open - allow the request if the rate limiter itself fails
      next();
    }
  };
};

// Apply different limits to different routes
app.use('/api/search', createRateLimiter(60000, 100)); // 100 requests per minute
app.use('/api/auth', createRateLimiter(900000, 5));    // 5 requests per 15 minutes
```

Advanced Multi-Tier Rate Limiting

For complex PropTech platforms, implement multiple rate limiting tiers:

```typescript
class MultiTierRateLimiter {
  private limiters: Map<string, SlidingWindowRateLimiter>;

  constructor(redis: Redis) {
    this.limiters = new Map([
      ['per_second', new SlidingWindowRateLimiter(redis, 1000, 10)],
      ['per_minute', new SlidingWindowRateLimiter(redis, 60000, 300)],
      ['per_hour', new SlidingWindowRateLimiter(redis, 3600000, 5000)]
    ]);
  }

  async checkAllLimits(identifier: string): Promise<RateLimitResult & { tier?: string }> {
    for (const [tier, limiter] of this.limiters) {
      const result = await limiter.checkLimit(`${tier}:${identifier}`);
      if (!result.allowed) {
        // Report which tier rejected the request
        return { ...result, tier };
      }
    }
    return { allowed: true, count: 0, remaining: 0, resetTime: 0 };
  }
}
```

💡 Pro Tip: Use different Redis key prefixes for different rate limiting policies to avoid conflicts and enable easier monitoring.

Best Practices and Production Considerations

Performance Optimization Strategies

Redis performance remains critical for rate limiting systems that handle thousands of requests per second. Implement connection pooling to manage Redis connections efficiently, and consider using Redis Cluster for horizontal scaling when single-instance performance becomes insufficient.

Monitor Redis memory usage carefully, as sliding window implementations can accumulate significant data over time. Implement aggressive cleanup policies and consider using Redis's memory optimization features like compression.
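A back-of-envelope way to reason about that accumulation (the per-entry byte figure below is a rough assumption covering member string, score, and sorted-set overhead, not a measured Redis number; verify with `MEMORY USAGE` against real keys):

```typescript
// Rough assumption: bytes consumed per sorted-set entry
const BYTES_PER_ENTRY = 128;

function estimateMemoryBytes(
  activeIdentifiers: number,
  avgRequestsInWindow: number
): number {
  // Each active identifier holds roughly one entry per in-window request
  return activeIdentifiers * avgRequestsInWindow * BYTES_PER_ENTRY;
}

// e.g. 50,000 active users averaging 100 in-window requests
// -> estimateMemoryBytes(50_000, 100) = 640,000,000 bytes (~640 MB)
```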

```typescript
// Redis configuration tuned for rate limiting workloads
const redis = new Redis({
  host: process.env.REDIS_HOST,
  port: parseInt(process.env.REDIS_PORT || '6379'),
  maxRetriesPerRequest: 3,
  lazyConnect: true,
  keepAlive: 30000, // TCP keep-alive initial delay (ms)
  family: 4,        // force IPv4
  db: 0
});
```

Error Handling and Fallback Strategies

Robust rate limiting systems must handle Redis failures gracefully. Implement circuit breaker patterns to detect Redis outages and fall back to in-memory rate limiting or fail-open policies during infrastructure issues.

```typescript
// Note: CircuitBreaker is an assumed abstraction here - any implementation
// exposing isOpen/recordSuccess/recordFailure will work.
class ResilientRateLimiter {
  private primaryLimiter: SlidingWindowRateLimiter;
  private fallbackLimiter: Map<string, any>;
  private circuitBreaker: CircuitBreaker;

  async checkLimit(identifier: string): Promise<RateLimitResult> {
    try {
      if (this.circuitBreaker.isOpen()) {
        return this.fallbackCheck(identifier);
      }

      const result = await this.primaryLimiter.checkLimit(identifier);
      this.circuitBreaker.recordSuccess();
      return result;
    } catch (error) {
      this.circuitBreaker.recordFailure();
      console.error('Primary rate limiter failed:', error);
      return this.fallbackCheck(identifier);
    }
  }

  private fallbackCheck(identifier: string): RateLimitResult {
    // Implement simple in-memory rate limiting here,
    // or a fail-open policy, depending on requirements
    return { allowed: true, count: 0, remaining: 1000, resetTime: Date.now() + 60000 };
  }
}
```
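The `CircuitBreaker` referenced above is left abstract; one minimal sketch (the threshold and timeout values are illustrative, and the injectable clock exists purely to make the behavior testable):

```typescript
class CircuitBreaker {
  private failures = 0;
  private openedAt = 0;

  constructor(
    private failureThreshold = 5,
    private resetTimeoutMs = 30_000,
    private now: () => number = Date.now
  ) {}

  isOpen(): boolean {
    if (this.failures < this.failureThreshold) return false;
    // After the reset timeout elapses, allow a trial request (half-open)
    return this.now() - this.openedAt < this.resetTimeoutMs;
  }

  recordSuccess(): void {
    this.failures = 0;
  }

  recordFailure(): void {
    this.failures++;
    if (this.failures >= this.failureThreshold) {
      this.openedAt = this.now();
    }
  }
}
```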

Security and Abuse Prevention

Rate limiting serves as a first line of defense against various attack vectors. Implement different strategies for different types of identifiers - IP addresses for anonymous users, API keys for authenticated services, and user IDs for logged-in users.

Consider implementing progressive penalties for repeated violations, where subsequent violations result in longer lockout periods. This approach helps deter persistent attackers while minimizing impact on legitimate users experiencing temporary issues.
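One sketch of such escalation is an exponential lockout schedule (base duration and cap are illustrative assumptions); the violation count itself could be tracked in Redis with `INCR` plus `EXPIRE`:

```typescript
// Progressive lockout: each violation doubles the penalty, capped
// so a bug or misconfiguration can't lock a user out indefinitely.
function lockoutMs(
  violations: number,
  baseMs = 60_000,         // first offence: 1 minute (assumption)
  maxMs = 24 * 3_600_000   // hard cap: 24 hours (assumption)
): number {
  if (violations <= 0) return 0;
  return Math.min(maxMs, baseMs * 2 ** (violations - 1));
}

// violation 1 -> 1 min, 2 -> 2 min, 3 -> 4 min, ... up to the cap
```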

⚠️ Warning: Never rely solely on client-provided identifiers for rate limiting. Always include server-side identifiers like IP addresses to prevent trivial bypasses.

Monitoring and Observability

Implement comprehensive monitoring for rate limiting systems to understand usage patterns and detect potential issues before they impact users. Track metrics like rate limit hit rates, Redis performance, and false positive rates.

```typescript
// `metrics` is assumed to be your application's metrics client
// (StatsD, Prometheus, etc.) exposing histogram/counter helpers.
class MonitoredRateLimiter extends SlidingWindowRateLimiter {
  async checkLimit(identifier: string): Promise<RateLimitResult> {
    const startTime = Date.now();

    try {
      const result = await super.checkLimit(identifier);

      // Record latency and outcome metrics
      metrics.histogram('rate_limiter.duration', Date.now() - startTime);
      metrics.counter('rate_limiter.requests.total').inc();
      metrics.counter('rate_limiter.requests.allowed').inc(result.allowed ? 1 : 0);

      return result;
    } catch (error) {
      metrics.counter('rate_limiter.errors').inc();
      throw error;
    }
  }
}
```

At PropTechUSA.ai, our API infrastructure leverages similar sliding window rate limiting patterns to ensure reliable service delivery across our property data and analytics platforms, maintaining sub-millisecond latency while protecting against abuse.

Scaling and Advanced Implementation Patterns

Distributed Rate Limiting Architecture

As PropTech platforms scale across multiple regions and availability zones, distributed rate limiting becomes essential. Redis Cluster provides horizontal scaling capabilities, but requires careful key distribution strategies to avoid hotspots.

Implement consistent hashing for rate limiting keys to ensure even distribution across cluster nodes. Consider using Redis Streams for audit logging and real-time monitoring of rate limiting decisions across distributed systems.

```typescript
class DistributedRateLimiter {
  private hashRing: ConsistentHashRing;

  constructor(
    clusters: Redis.Cluster[],
    private windowSizeMs: number,
    private maxRequests: number
  ) {
    this.hashRing = new ConsistentHashRing(clusters);
  }

  async checkLimit(identifier: string): Promise<RateLimitResult> {
    // Route each identifier to a stable cluster node
    const cluster = this.hashRing.getNode(identifier);
    const limiter = new SlidingWindowRateLimiter(cluster, this.windowSizeMs, this.maxRequests);
    return limiter.checkLimit(identifier);
  }
}
```
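The `ConsistentHashRing` above is left abstract; a minimal sketch follows (the virtual-node count and md5-based hashing are illustrative choices - production systems typically reach for a vetted library). Virtual nodes smooth the key distribution, and removing a node remaps only the keys on that node's arcs rather than reshuffling everything:

```typescript
import { createHash } from 'node:crypto';

class ConsistentHashRing<T = unknown> {
  private ring: { point: number; node: T }[] = [];

  constructor(nodes: T[], label?: (node: T) => string, virtualNodes = 64) {
    nodes.forEach((node, i) => {
      const name = label ? label(node) : `node-${i}`;
      // Place several virtual points per node for smoother balance
      for (let v = 0; v < virtualNodes; v++) {
        this.ring.push({ point: this.hash(`${name}#${v}`), node });
      }
    });
    this.ring.sort((a, b) => a.point - b.point);
  }

  private hash(s: string): number {
    // First 8 hex chars of an md5 digest as a 32-bit point on the ring
    return parseInt(createHash('md5').update(s).digest('hex').slice(0, 8), 16);
  }

  getNode(key: string): T {
    const point = this.hash(key);
    // First virtual node at or clockwise after the key's point, wrapping around
    const entry = this.ring.find((e) => e.point >= point) ?? this.ring[0];
    return entry.node;
  }
}
```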

Integration with API Gateways

Modern PropTech architectures often employ API gateways for centralized request handling. Integrate rate limiting at the gateway level to provide consistent protection across all backend services while reducing individual service complexity.

Consider implementing rate limiting policies that vary based on authentication status, user tiers, or request complexity. Property listing APIs might allow higher rates for premium subscribers while maintaining stricter limits for anonymous users.
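Tier-dependent policies can be expressed as a small lookup (the tier names and numbers below are assumptions to adapt to your own product):

```typescript
type UserTier = 'anonymous' | 'standard' | 'premium';

interface TierPolicy {
  windowMs: number;
  maxRequests: number;
}

const tierPolicies: Record<UserTier, TierPolicy> = {
  anonymous: { windowMs: 60_000, maxRequests: 30 },
  standard:  { windowMs: 60_000, maxRequests: 120 },
  premium:   { windowMs: 60_000, maxRequests: 600 }
};

function policyForTier(tier: UserTier | undefined): TierPolicy {
  // Unauthenticated or unknown users get the strictest policy
  return tierPolicies[tier ?? 'anonymous'];
}
```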

Real-World PropTech Use Cases

Different PropTech operations require tailored rate limiting strategies. Property search APIs benefit from burst-friendly limits that accommodate user browsing patterns, while financial transaction endpoints require strict, conservative limits to prevent abuse.

Property image upload endpoints might implement size-aware rate limiting, where larger files consume more quota than smaller ones. This approach provides fair resource allocation while preventing abuse through large file uploads.
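A sketch of such a cost function (the one-unit-per-mebibyte granularity is an assumption; with a sorted-set limiter, a size-aware variant would record `uploadCost(bytes)` entries, or use a weighted counter, instead of one entry per request):

```typescript
const UNIT_BYTES = 1_048_576; // 1 MiB per quota unit (assumption)

// Every upload costs at least one unit; larger files cost proportionally more
function uploadCost(bytes: number): number {
  return Math.max(1, Math.ceil(bytes / UNIT_BYTES));
}
```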

💡 Pro Tip: Implement rate limiting granularity that matches your business logic. Per-property rate limits can prevent data scraping while per-user limits ensure fair access.

Conclusion and Implementation Roadmap

Redis-based sliding window rate limiting provides a robust foundation for protecting PropTech APIs while maintaining excellent user experience. The implementation patterns discussed here offer scalable solutions that can grow with your platform while providing the granular control necessary for complex property technology ecosystems.

Start with basic sliding window implementation for critical endpoints, then gradually expand to multi-tier limiting and advanced monitoring as your platform scales. Remember that effective rate limiting balances security requirements with user experience - overly restrictive limits can harm legitimate users while insufficient protection leaves systems vulnerable to abuse.

The key to successful rate limiting lies in understanding your specific usage patterns and implementing policies that reflect real-world user behavior. Monitor rate limiting effectiveness continuously and adjust policies based on observed patterns and user feedback.

Ready to implement robust rate limiting for your PropTech platform? Begin with the basic sliding window implementation provided here, customize the parameters for your specific use cases, and gradually add advanced features as your requirements evolve. Your APIs and users will benefit from the improved reliability and security that proper rate limiting provides.
