A major monumental complex hosted thousands of events annually — concerts, exhibitions, conferences. Their online ticketing platform was their lifeline. Then the bots came.
The Problem: Bots, Outages, and Lost Revenue
Every time tickets went on sale, the system collapsed. Not from legitimate demand — from bots. Scalpers used automated scripts to buy thousands of tickets in milliseconds, then resell them at 10x markup on secondary markets.
The impact was devastating:
- 30 requests per second crashed the origin servers
- 2+ second delays triggered HTTP 408 timeout errors
- Daily outages frustrated legitimate customers
- Revenue loss: hundreds of thousands, as tickets were diverted to secondary markets instead of selling at face value
Traditional rate limiting didn't work. Scalpers rotated IP addresses, spoofed User-Agents, and distributed their attacks across botnets. The venue needed a smarter defense.
The Solution: Multi-Layer Bot Mitigation
We implemented a comprehensive bot detection and mitigation strategy at the CDN edge, before traffic even reached the origin.
Layer 1: Origin Restriction — Reduce Load by 50%
First, we placed a CDN layer in front of the origin. The cache absorbed legitimate requests, reducing direct origin hits by 50%. Bots that expected to hit the origin directly were now hitting the edge instead.
Layer 2: DDoS Protection — Rate Limiting and HTTP 429
We implemented strict rate limiting:
- Maximum 10 requests per second per IP address
- Excess traffic triggers HTTP 429 (Too Many Requests) responses
- Automatic IP blocking after 3 consecutive 429 errors
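The actual edge configuration is platform-specific, but the logic above can be sketched with a sliding one-second window per IP. The thresholds come from the rules listed; the class and method names are ours:

```python
import time
from collections import defaultdict, deque

LIMIT_RPS = 10        # max requests per second per IP
BLOCK_AFTER = 3       # consecutive 429s before a hard block

class RateLimiter:
    def __init__(self):
        self.windows = defaultdict(deque)   # ip -> timestamps of recent requests
        self.strikes = defaultdict(int)     # ip -> consecutive 429 count
        self.blocked = set()

    def check(self, ip, now=None):
        """Return an HTTP-like status: 200 (allow), 429 (throttle), 403 (blocked)."""
        now = time.monotonic() if now is None else now
        if ip in self.blocked:
            return 403
        window = self.windows[ip]
        # drop timestamps older than the one-second window
        while window and now - window[0] >= 1.0:
            window.popleft()
        if len(window) >= LIMIT_RPS:
            self.strikes[ip] += 1
            if self.strikes[ip] >= BLOCK_AFTER:
                self.blocked.add(ip)
            return 429
        self.strikes[ip] = 0   # an allowed request resets the strike count
        window.append(now)
        return 200
```

The key design choice is that the strike counter resets on any allowed request, so a legitimate user who briefly bursts past the limit is throttled but never hard-blocked.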
Layer 3: IP Scoring — Reputation Analysis
Not all IPs are equal. We analyzed each IP address for:
- Historical abuse patterns: IPs known for DDoS, spam, or credential stuffing
- Geographic anomalies: sudden traffic from data centers or VPNs
- Botnet membership: correlation with known malicious networks
- Low-reputation IPs: automatically blocked for 12 hours
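A minimal scoring sketch, assuming a simple additive model: each signal above subtracts from a neutral score, and anything below a cutoff is blocked for 12 hours. The weights and threshold are illustrative, not the production values:

```python
# Weights are illustrative; real reputation feeds assign their own scores.
SIGNAL_WEIGHTS = {
    "history_of_abuse": -40,   # seen in DDoS / spam / credential-stuffing feeds
    "datacenter_or_vpn": -25,  # traffic from hosting ranges or VPN exit nodes
    "botnet_member": -50,      # correlated with a known malicious network
}
BLOCK_THRESHOLD = 50           # below this, block the IP
BLOCK_SECONDS = 12 * 3600      # 12-hour block, per the rule above

def score_ip(signals):
    """Start from a neutral score of 100 and subtract for each bad signal."""
    score = 100
    for name, present in signals.items():
        if present:
            score += SIGNAL_WEIGHTS.get(name, 0)
    return max(score, 0)

def decide(signals):
    if score_ip(signals) < BLOCK_THRESHOLD:
        return ("block", BLOCK_SECONDS)
    return ("allow", 0)
```

Note that a single weak signal (for example, a VPN exit) is not enough to block on its own; blocking requires a combination of signals, which keeps false positives low.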
Layer 4: JavaScript Challenges — Block Automated Access
Legitimate browsers execute JavaScript; most simple bots don't. We served a JavaScript challenge that required solving a proof-of-work calculation. Bots failed silently. Real users never noticed.
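The challenge itself runs as JavaScript in the browser, but the mechanics translate directly: the server issues a random challenge, the client brute-forces a counter until a hash meets a difficulty target, and the server verifies the result cheaply. This sketch uses SHA-256 with a deliberately tiny difficulty; the function names are ours:

```python
import hashlib
import secrets

DIFFICULTY = 2  # required leading zero hex digits (kept tiny for illustration)

def issue_challenge():
    """Server side: hand the client a random challenge string."""
    return secrets.token_hex(8)

def solve(challenge):
    """What the in-browser JavaScript does: brute-force a counter."""
    counter = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{counter}".encode()).hexdigest()
        if digest.startswith("0" * DIFFICULTY):
            return counter
        counter += 1

def verify(challenge, counter):
    """Server side: one hash to check, regardless of how long solving took."""
    digest = hashlib.sha256(f"{challenge}:{counter}".encode()).hexdigest()
    return digest.startswith("0" * DIFFICULTY)
```

The asymmetry is the point: solving costs the client hundreds of hash attempts (invisible to a real browser, expensive at scale for a botnet), while verification costs the server exactly one.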
Layer 5: Geolocation Filters — Regional Verification
The venue had specific sales rules for different regions. We implemented JavaScript-based geolocation verification to ensure:
- Presale only available to registered members in specific countries
- General sale available worldwide but required geolocation matching
- Suspicious geographic anomalies (e.g., requests from 10 countries in 1 second) were blocked
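The third rule above can be sketched as a per-session detector that counts distinct origin countries inside a one-second window. The threshold of 10 countries comes from the rule; the class name is ours:

```python
from collections import deque

WINDOW_SECONDS = 1.0
MAX_COUNTRIES = 10   # distinct countries in one second, per the rule above

class GeoAnomalyDetector:
    def __init__(self):
        self.events = deque()   # (timestamp, country) pairs inside the window

    def observe(self, country, now):
        """Record a request; return True if the session should be blocked."""
        self.events.append((now, country))
        # evict events older than the window
        while self.events and now - self.events[0][0] > WINDOW_SECONDS:
            self.events.popleft()
        countries = {c for _, c in self.events}
        return len(countries) >= MAX_COUNTRIES
```

A real user physically cannot trip this rule; a botnet rotating exit nodes trips it almost immediately.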
Layer 6: Crawling Detection — Stop Recon Traffic
Scalper bots perform reconnaissance — scanning URLs, testing payloads, mapping the system. We detected and blocked:
- 100+ requests per minute from a single IP: automatic 24-hour block
- Sequential parameter enumeration: attempts to discover hidden endpoints
- Payload injection attempts: XSS, SQLi, or command injection attempts
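Sequential parameter enumeration is the easiest of these to illustrate: a recon bot walks resource IDs in order (`id=100`, `id=101`, `id=102`, ...), which almost never happens organically. A sketch of the detection, with an assumed run-length threshold:

```python
RUN_LENGTH = 20   # consecutive sequential IDs treated as enumeration (assumed threshold)

def looks_like_enumeration(ids, run_length=RUN_LENGTH):
    """Flag a client whose recent resource IDs form a strictly sequential run,
    which is typical recon behaviour rather than organic browsing."""
    run = 1
    for prev, cur in zip(ids, ids[1:]):
        run = run + 1 if cur == prev + 1 else 1
        if run >= run_length:
            return True
    return False
```

In production this would feed the same blocklist as the request-rate rule; a client that trips either gets the 24-hour block.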
Layer 7: Anomaly Detection — Behavioral Analysis
We built anomaly detection rules based on historical baseline behavior:
| Anomaly | Response |
|---|---|
| 10x normal per-IP request rate | 12-hour IP block |
| 5x normal traffic volume | 4-hour User-Agent Mitigation (UAM) |
| 2x normal bandwidth spike | 4-hour UAM |
| Sudden traffic from new country | 4-hour UAM |
| Known crawler User-Agent | 24-hour IP block |
UAM (User-Agent Mitigation) meant: serve a JavaScript challenge before granting access. Bots fail; humans pass transparently.
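The table above reduces to a rule engine that compares observed metrics against historical baselines. A sketch, using the multipliers and responses from the table (metric names and the data structure are ours):

```python
# (metric, multiplier over baseline, response) drawn from the table above
RULES = [
    ("per_ip_rate",   10, ("ip_block", 12 * 3600)),   # 10x per-IP rate -> 12h block
    ("traffic_volume", 5, ("uam", 4 * 3600)),          # 5x volume -> 4h UAM
    ("bandwidth",      2, ("uam", 4 * 3600)),          # 2x bandwidth -> 4h UAM
]

def evaluate(baseline, observed):
    """Compare observed metrics against baselines; return triggered responses."""
    triggered = []
    for metric, multiplier, response in RULES:
        if metric in observed and observed[metric] >= baseline[metric] * multiplier:
            triggered.append((metric, response))
    return triggered
```

The baselines themselves come from historical traffic, which is why the "understand normal first" lesson below matters: without an accurate baseline, these multipliers either fire constantly or never.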
Layer 8: WAF Protection — Input Validation
Beyond bot detection, we deployed Web Application Firewall (WAF) rules for:
- URL parameter validation: reject requests with malformed parameters
- Regex blocking: detect common attack patterns (`../../../etc/passwd`, `union select`, etc.)
- Content-Type enforcement: only accept expected content types for each endpoint
- HTTP method restriction: GET for reads, POST for purchases, reject DELETE/PUT on sensitive paths
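A condensed sketch of the regex-blocking and method-restriction rules. The patterns are the standard signatures named above; the path-to-method map uses hypothetical endpoint names:

```python
import re

# Common attack signatures, per the regex-blocking rule above (illustrative, not exhaustive)
ATTACK_PATTERNS = [
    re.compile(r"\.\./"),               # path traversal (../../../etc/passwd)
    re.compile(r"(?i)union\s+select"),  # SQL injection
    re.compile(r"(?i)<script\b"),       # reflected XSS
]

# Hypothetical endpoints: GET for reads, POST for purchases
ALLOWED_METHODS = {"/events": {"GET"}, "/purchase": {"POST"}}

def waf_check(method, path, query):
    """Return an HTTP-like status: 405 for a disallowed method, 403 for an
    attack signature in the query string, 200 otherwise."""
    if path in ALLOWED_METHODS and method not in ALLOWED_METHODS[path]:
        return 405
    for pattern in ATTACK_PATTERNS:
        if pattern.search(query):
            return 403
    return 200
```

Real WAF rulesets are far larger and also inspect headers and bodies, but the evaluation order shown (method check first, then signatures) is the cheap-to-expensive ordering an edge deployment wants.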
Layer 9: VCL Logic — Advanced Request Handling
We implemented custom VCL (Varnish Configuration Language) logic to detect bot patterns:
```vcl
# Restrict access from Go or Python HTTP clients (common bot languages)
if (req.http.User-Agent ~ "(?i)(go-http-client/|python-requests)") {
    call deny_request;
}

# Block NoName057(16) traffic based on its malformed Accept header (note the trailing comma)
if (req.http.Accept == "text/html,application/xhtml+xml,application/xml,") {
    call deny_request;
}
```
Layer 10: Whitelist Management — Trusted Partners
We maintained whitelists for:
- Search engines: Google, Bing, DuckDuckGo bots (verified via reverse DNS)
- Official mobile apps: Specific User-Agent strings for iOS/Android ticket apps
- Enterprise integrators: Partners with guaranteed request rates
- CDN partners: Other CDNs pre-approved for cache-to-cache traffic
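Whitelisting by User-Agent alone is spoofable, which is why the search-engine entries are verified via reverse DNS. The standard two-step check: the PTR record must end in a trusted crawler domain, and the forward lookup of that hostname must resolve back to the original IP. A sketch with injectable resolvers so it can be tested offline:

```python
import socket

# Verified crawler domains for Google and Bing
TRUSTED_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def verify_crawler(ip, reverse_lookup=None, forward_lookup=None):
    """Two-step crawler verification: PTR must end in a trusted domain, and the
    forward lookup of that hostname must return the original IP (anti-spoofing)."""
    reverse_lookup = reverse_lookup or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_lookup = forward_lookup or socket.gethostbyname
    try:
        hostname = reverse_lookup(ip)
    except OSError:
        return False
    if not hostname.endswith(TRUSTED_SUFFIXES):
        return False
    try:
        return forward_lookup(hostname) == ip
    except OSError:
        return False
```

The forward-confirmation step is what defeats spoofed PTR records: an attacker can control reverse DNS for their own IP range, but not the forward zone of googlebot.com.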
Results: From Daily Outages to Year-Long Stability
- 60% traffic reduction: Bot traffic eliminated, legitimate traffic unaffected
- 99.99% uptime: Zero downtime for 12+ consecutive months
- Sub-second response times: Even during peak sales, latency remained under 200ms
- Revenue protection: Scalpers now unable to mass-purchase tickets
- Customer satisfaction: Legitimate users experienced zero friction — JavaScript challenges are transparent to browsers
Lessons Learned
Bot mitigation isn't a single tool — it's a layered strategy:
- Defense in depth: multiple overlapping layers catch what single defenses miss
- Behavioral analysis: understanding the baseline (normal traffic) is key to spotting anomalies
- Zero-friction for humans: legitimate users should never notice the defenses
- Continuous adaptation: bots evolve; your defenses must evolve faster
- CDN intelligence: blocking at the edge is far more efficient than handling attacks at the origin
Today, the venue operates ticket sales with confidence. Bots are blocked before they ever reach the origin. Legitimate fans can purchase tickets quickly and reliably. And the business is protected from both revenue loss and infrastructure collapse.
Need to strengthen your web security? Our technical team can help you design the perfect protection strategy for your use case.
Get started