# Caching
Implement caching strategies to improve performance and reduce API load. Different data types have different caching requirements based on update frequency.
## Why caching matters
- Performance: Faster pages and better UX for riders and dealers.
- Stability: Fewer spikes and retries during peak traffic.
- Fair usage: Caching keeps your request volume within reasonable limits; HLC reserves the right to throttle customers who abuse the API.
Use case: Large dealers sync catalog data daily but refresh inventory every few minutes to keep availability accurate.
## What to cache (and for how long)
| Data | Suggested TTL | Why |
|---|---|---|
| Brands | 24 hours | Changes infrequently |
| Categories | 24 hours | Changes infrequently |
| Products | 24 hours | Stable product data |
| Inventory | 1–10 minutes | Changes often |
| Prices | 5–15 minutes | Changes occasionally |
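The TTLs in the table above can be centralized in a single configuration object so every call site uses consistent values. A minimal sketch (the object name and the specific midpoint values chosen for inventory and prices are illustrative, not part of the HLC API):

```js
// Suggested TTLs in seconds, mirroring the table above
const TTL = {
  brands: 24 * 60 * 60,     // 24 hours: changes infrequently
  categories: 24 * 60 * 60, // 24 hours: changes infrequently
  products: 24 * 60 * 60,   // 24 hours: stable product data
  inventory: 5 * 60,        // within the 1–10 minute range: changes often
  prices: 10 * 60,          // within the 5–15 minute range
}
```

Tune the short-TTL values to your traffic: larger dealers with fast-moving stock may prefer the low end of each range.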
## Required headers (legacy vs new)
Header requirements depend on whether you’re using the legacy v3.0 flow or the unified v4.x experience.
| Header | Legacy v3.0 | Unified v4.x | Notes |
|---|---|---|---|
| ApiKey | Required | Supported | Access token auth |
| Authorization: Bearer <JWT> | Not supported | Supported | JWT auth |
| language | Optional | Optional | en or fr |
| callerName | Optional | Optional | App identifier |
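The two auth styles from the table can be sketched as plain header objects (the environment variable names and the `callerName` value are placeholders; substitute your own credentials):

```js
// Legacy v3.0: access-token auth via the ApiKey header
const legacyHeaders = {
  ApiKey: process.env.HLC_API_KEY,
  language: 'en',       // optional: en or fr
  callerName: 'my-app', // optional app identifier
}

// Unified v4.x: JWT auth via the Authorization header
const unifiedHeaders = {
  Authorization: `Bearer ${process.env.HLC_JWT}`,
  language: 'en',
  callerName: 'my-app',
}
```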
Tip: Use a single cache layer for static data, and a short‑TTL cache for inventory/prices.
## Example: cached fetch pattern
```js
const baseUrl = 'https://api.hlc.bike/us/v3.0'
const apiKey = process.env.HLC_API_KEY

// Cache wrapper over a minimal in-memory store; in production, back this
// with Redis, a KV store, or your platform's cache instead
const store = new Map()
const cache = {
  async get(key) {
    const entry = store.get(key)
    return entry && entry.expires > Date.now() ? entry.value : null
  },
  async set(key, value, ttlSeconds) {
    store.set(key, { value, expires: Date.now() + ttlSeconds * 1000 })
  },
}

// Return the cached value when fresh; otherwise fetch, store, and return it
async function cached(key, ttlSeconds, fetcher) {
  const hit = await cache.get(key)
  if (hit) return JSON.parse(hit)
  const fresh = await fetcher()
  await cache.set(key, JSON.stringify(fresh), ttlSeconds)
  return fresh
}

// Brands change rarely, so a 24-hour TTL (86,400 seconds) is appropriate
const brands = await cached('brands', 86400, async () => {
  const res = await fetch(`${baseUrl}/Catalog/Brands`, {
    headers: { ApiKey: apiKey },
  })
  if (!res.ok) throw new Error(`HTTP ${res.status}`)
  return res.json()
})
```
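To see the cache-hit behavior in isolation, here is a self-contained condensation of the pattern (the names `memStore` and `cachedInMemory` and the demo payload are illustrative, not part of the HLC API). A second call with the same key inside the TTL is served from the cache, so the fetcher runs only once:

```js
const memStore = new Map()

// Condensed TTL cache: return a fresh entry if present, else fetch and store
async function cachedInMemory(key, ttlSeconds, fetcher) {
  const entry = memStore.get(key)
  if (entry && entry.expires > Date.now()) return entry.value // cache hit
  const value = await fetcher()
  memStore.set(key, { value, expires: Date.now() + ttlSeconds * 1000 })
  return value
}

// Demo: the fetcher runs once; the second call within the TTL is a hit
async function demo() {
  let calls = 0
  const fetcher = async () => { calls += 1; return { qty: 4 } }
  await cachedInMemory('inventory:SKU123', 300, fetcher) // miss: fetches
  const hit = await cachedInMemory('inventory:SKU123', 300, fetcher) // hit
  return { calls, qty: hit.qty }
}
```

The same short-TTL approach suits prices; only the key and TTL change.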