The iRacing Data Client SDK supports opt-in caching via pluggable stores powered by @http-client-toolkit/core. When a CacheStore is provided, S3 responses are cached automatically with TTL derived from the presigned URL’s expiry — no manual TTL management needed.
Pass a CacheStore implementation via the stores option:
```ts
import { IRacingDataClient } from 'iracing-data-client';

const iracing = new IRacingDataClient({
  auth: { /* ... */ },
  stores: {
    cache: myCacheStore,
  },
});
```
```ts
// First call: network request, response is cached
const cars1 = await iracing.car.get();

// Second call within the cache period: returned from the store
const cars2 = await iracing.car.get();
```

When caching is enabled, the TTL for each entry is derived automatically from the presigned URL's `expires` field, so no manual TTL management is needed.

Cache keys are derived from the request URL and parameters, so different parameters produce different cache entries:
```ts
await iracing.member.get({ custIds: [123456] }); // Cache key 1
await iracing.member.get({ custIds: [789012] }); // Cache key 2
await iracing.member.get({ custIds: [123456] }); // Hits cache key 1
```

Different endpoints have different cache durations, determined by the iRacing API's presigned URL expiry times:
| Data type | Typical cache duration |
| --- | --- |
| Static Data (Long Cache) | ~24 hours |
| Dynamic Data (Short Cache) | 5–15 minutes |
| Real-time Data (No Cache) | Not cached |

Static data includes car data, track information, series details, and constants.
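The client derives these TTLs from the presigned URLs themselves, so you never set them by hand. As a rough illustration of the idea, here is a sketch of reading the remaining lifetime off an AWS-style presigned URL; `ttlFromPresignedUrl` is a hypothetical helper for explanation only, not part of the SDK:

```typescript
// Hypothetical helper: derive a cache TTL (in seconds) from an AWS
// signature v4 presigned URL, which carries an `X-Amz-Date` issue time
// and an `X-Amz-Expires` lifetime in seconds.
function ttlFromPresignedUrl(rawUrl: string, now: Date = new Date()): number {
  const url = new URL(rawUrl);
  const issuedAt = url.searchParams.get('X-Amz-Date'); // e.g. 20240101T120000Z
  const lifetime = url.searchParams.get('X-Amz-Expires'); // e.g. 86400
  if (!issuedAt || !lifetime) return 0;

  // Convert the compact timestamp into ISO 8601 so Date.parse accepts it.
  const issuedMs = Date.parse(
    issuedAt.replace(
      /^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/,
      '$1-$2-$3T$4:$5:$6Z'
    )
  );
  const expiresMs = issuedMs + Number(lifetime) * 1000;

  // Clamp at zero: an already-expired URL yields no cache time.
  return Math.max(0, Math.floor((expiresMs - now.getTime()) / 1000));
}
```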
Check the @http-client-toolkit organisation for available store packages:
```ts
import { IRacingDataClient } from 'iracing-data-client';
import { RedisCacheStore } from '@http-client-toolkit/store-redis'; // example

const iracing = new IRacingDataClient({
  auth: { /* ... */ },
  stores: {
    cache: new RedisCacheStore({ url: 'redis://localhost:6379' }),
  },
});
```

A CacheStore implements the following interface:
```ts
interface CacheStore {
  get(key: string): Promise<Response | undefined> | Response | undefined;
  set(key: string, value: Response, ttl?: number): Promise<void> | void;
  delete(key: string): Promise<void> | void;
}
```

Simple in-memory example:
```ts
class InMemoryCacheStore {
  private cache = new Map<string, { response: Response; expires: number }>();

  get(key: string): Response | undefined {
    const entry = this.cache.get(key);
    if (!entry) return undefined;
    if (entry.expires < Date.now()) {
      this.cache.delete(key);
      return undefined;
    }
    return entry.response.clone();
  }

  set(key: string, value: Response, ttl?: number): void {
    this.cache.set(key, {
      response: value.clone(),
      expires: Date.now() + (ttl ?? 300) * 1000,
    });
  }

  delete(key: string): void {
    this.cache.delete(key);
  }
}

const iracing = new IRacingDataClient({
  auth: { /* ... */ },
  stores: {
    cache: new InMemoryCacheStore(),
  },
});
```

If you need caching logic beyond what the store provides (e.g. custom TTLs per endpoint, conditional caching), you can layer your own cache on top:
```ts
class CachedIRacingService {
  private cache = new Map<string, { data: unknown; expires: number }>();

  constructor(private iracing: IRacingDataClient) {}

  async getMemberInfo(
    custId: number,
    cacheDuration = 5 * 60 * 1000 // 5 minutes
  ) {
    const cacheKey = `member:${custId}`;
    const cached = this.cache.get(cacheKey);

    if (cached && cached.expires > Date.now()) {
      return cached.data;
    }

    const data = await this.iracing.member.get({ custIds: [custId] });

    this.cache.set(cacheKey, {
      data,
      expires: Date.now() + cacheDuration,
    });

    return data;
  }

  clearCache() {
    this.cache.clear();
  }
}
```

Warm up the cache on application start:
```ts
async function preloadCache(iracing: IRacingDataClient) {
  await Promise.all([
    iracing.car.get(),
    iracing.track.get(),
    iracing.series.get(),
    iracing.constants.categories(),
    iracing.constants.divisions(),
  ]);
}

await preloadCache(iracing);
```

Combine requests to reduce cache entries and API calls:
```ts
// Instead of fetching members one at a time
const ids = [123, 456, 789];

// Single batched request
const members = await iracing.member.get({ custIds: ids });
```

Best practices:

- **Use Pluggable Stores**: pass a `CacheStore` via `stores.cache` for automatic, TTL-aware caching with no extra code.
- **Cache Static Data**: aggressively cache data that rarely changes (cars, tracks, series).
- **Monitor Performance**: track cache hit rates to validate your caching strategy.
- **Don't Over-Cache**: avoid caching real-time data that needs to be fresh.
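The hit-rate monitoring advice above can be made concrete by wrapping whatever store you use with counters. A minimal sketch, assuming the `CacheStore` interface shown earlier; `InstrumentedCacheStore` is a hypothetical name, not an SDK export:

```typescript
// Minimal shape of the CacheStore interface shown earlier.
interface CacheStore {
  get(key: string): Promise<Response | undefined> | Response | undefined;
  set(key: string, value: Response, ttl?: number): Promise<void> | void;
  delete(key: string): Promise<void> | void;
}

// Hypothetical wrapper: delegates to any inner CacheStore while
// counting hits and misses, so the hit rate can be checked against
// your caching strategy.
class InstrumentedCacheStore implements CacheStore {
  hits = 0;
  misses = 0;

  constructor(private inner: CacheStore) {}

  async get(key: string): Promise<Response | undefined> {
    const value = await this.inner.get(key);
    if (value === undefined) {
      this.misses++;
    } else {
      this.hits++;
    }
    return value;
  }

  set(key: string, value: Response, ttl?: number) {
    return this.inner.set(key, value, ttl);
  }

  delete(key: string) {
    return this.inner.delete(key);
  }

  hitRate(): number {
    const total = this.hits + this.misses;
    return total === 0 ? 0 : this.hits / total;
  }
}
```

Passing `new InstrumentedCacheStore(realStore)` as `stores.cache` and logging `hitRate()` periodically gives a quick read on whether your cache configuration is paying off.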