26 Aug 2025 5 min read

How We Stopped Ticket Scalping on a Major Venue Website

A major monumental complex hosted thousands of events annually — concerts, exhibitions, conferences. Their online ticketing platform was their lifeline. Then the bots came.

The Problem: Bots, Outages, and Lost Revenue

Every time tickets went on sale, the system collapsed. Not from legitimate demand — from bots. Scalpers used automated scripts to buy thousands of tickets in milliseconds, then resell them at 10x markup on secondary markets.

The impact was devastating: on-sale pages collapsed under bot load, genuine fans were left empty-handed, and revenue leaked to the secondary market.

Traditional rate limiting didn't work. Scalpers rotated IP addresses, spoofed User-Agents, and distributed their attacks across botnets. The venue needed a smarter defense.

The Solution: Multi-Layer Bot Mitigation

We implemented a comprehensive bot detection and mitigation strategy at the CDN edge, before traffic even reached the origin.

Layer 1: Origin Restriction — Reduce Load by 50%

First, we placed a CDN layer in front of the origin. The cache absorbed legitimate requests, reducing direct origin hits by 50%. Bots that expected to hit the origin directly were now hitting the edge instead.

Layer 2: DDoS Protection — Rate Limiting and HTTP 429

We implemented strict rate limiting at the edge: clients that exceeded per-IP thresholds received an HTTP 429 response instead of ever reaching the origin.
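
A sliding-window counter is one common way to enforce this kind of per-IP limit. The sketch below is illustrative; the window size and threshold are hypothetical values, not the venue's real tuning.

```python
import time
from collections import defaultdict, deque

# Hypothetical thresholds -- the real limits were tuned per endpoint.
WINDOW_SECONDS = 10
MAX_REQUESTS = 20

class RateLimiter:
    """Sliding-window counter per client IP; over-limit requests get HTTP 429."""

    def __init__(self, window=WINDOW_SECONDS, limit=MAX_REQUESTS):
        self.window = window
        self.limit = limit
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def check(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Drop timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return 429  # Too Many Requests
        q.append(now)
        return 200

limiter = RateLimiter()
# A burst of 25 requests from one IP: the first 20 pass, the rest are throttled.
statuses = [limiter.check("203.0.113.7", now=i * 0.01) for i in range(25)]
```

In production this state lives at the CDN edge rather than in one process, but the decision logic is the same.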

Layer 3: IP Scoring — Reputation Analysis

Not all IPs are equal. We scored each address on its reputation, drawing on known abuse signals and its prior behavior on the platform, and treated low-scoring clients with suspicion.
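
Conceptually, reputation scoring reduces to a weighted sum over signals, with thresholds deciding the response. The signal names, weights, and thresholds below are hypothetical; the real system drew on threat-intelligence feeds and historical traffic.

```python
# Hypothetical reputation signals and weights, for illustration only.
WEIGHTS = {
    "on_blocklist": 50,    # appears on a shared abuse blocklist
    "datacenter_asn": 20,  # originates from a cloud/datacenter ASN, not residential
    "prior_429s": 15,      # previously tripped our rate limits
    "no_js_solved": 15,    # has never passed a JavaScript challenge
}
BLOCK_THRESHOLD = 60
CHALLENGE_THRESHOLD = 30

def score_ip(signals):
    """Sum the weights of every signal present for this IP."""
    return sum(w for name, w in WEIGHTS.items() if signals.get(name))

def decide(signals):
    s = score_ip(signals)
    if s >= BLOCK_THRESHOLD:
        return "block"
    if s >= CHALLENGE_THRESHOLD:
        return "challenge"
    return "allow"
```

The useful property of this shape is that no single signal condemns an IP; it is the combination that pushes a client over a threshold.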

Layer 4: JavaScript Challenges — Block Automated Access

Legitimate browsers execute JavaScript; most simple bots don't. We served a JavaScript challenge that required solving a small proof-of-work calculation. Bots failed silently, while real users never noticed.
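
A proof-of-work challenge of this kind typically works hashcash-style: the server issues a random token, and the client must find a nonce whose hash has a required number of leading zero bits. The sketch below shows both sides in Python (the client side would actually run as JavaScript in the browser); the difficulty value is an illustrative assumption.

```python
import hashlib
import itertools
import os

DIFFICULTY_BITS = 12  # leading zero bits required; low enough to solve in milliseconds

def issue_challenge():
    """Server side: hand the client a random token."""
    return os.urandom(8).hex()

def verify(challenge, nonce, bits=DIFFICULTY_BITS):
    """Server side: check that sha256(challenge:nonce) starts with `bits` zero bits."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - bits) == 0

def solve(challenge, bits=DIFFICULTY_BITS):
    """Client side (done by the browser's JavaScript): brute-force a valid nonce."""
    for nonce in itertools.count():
        if verify(challenge, nonce, bits):
            return nonce
```

The cost is negligible for one human page load but becomes real when multiplied across the thousands of requests a scalping script needs to fire.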

Layer 5: Geolocation Filters — Regional Verification

The venue had specific sales rules for different regions. We implemented JavaScript-based geolocation verification to ensure those regional rules were actually enforced.
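
The enforcement side of such rules can be as simple as a lookup table mapping each sale to the regions allowed to buy. The event names and country codes below are invented for illustration.

```python
# Hypothetical regional sales rules: event -> set of country codes allowed to purchase.
SALES_REGIONS = {
    "summer-festival": {"IT", "FR", "DE"},
    "local-exhibition": {"IT"},
}

def region_allowed(event, country_code):
    """Return True if a buyer from this country may purchase for this event."""
    allowed = SALES_REGIONS.get(event)
    if allowed is None:
        return True  # no regional restriction configured for this event
    return country_code in allowed
```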

Layer 6: Crawling Detection — Stop Recon Traffic

Scalper bots perform reconnaissance before an attack: scanning URLs, testing payloads, and mapping the system. We detected and blocked this crawling traffic before it could reveal anything useful.
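
One telltale of reconnaissance is a single client requesting many distinct paths with a high error ratio, since probing guessed URLs produces mostly 404s, while a fan buying a ticket touches a handful of pages that all return 200. The thresholds below are hypothetical.

```python
from collections import defaultdict

# Hypothetical thresholds: many distinct paths plus a high error ratio
# looks like URL scanning, not ticket buying.
MAX_DISTINCT_PATHS = 30
MAX_ERROR_RATIO = 0.5

class CrawlDetector:
    def __init__(self):
        self.paths = defaultdict(set)   # ip -> distinct paths requested
        self.total = defaultdict(int)   # ip -> total requests
        self.errors = defaultdict(int)  # ip -> requests that returned >= 400

    def observe(self, ip, path, status):
        self.paths[ip].add(path)
        self.total[ip] += 1
        if status >= 400:
            self.errors[ip] += 1

    def is_crawler(self, ip):
        if len(self.paths[ip]) <= MAX_DISTINCT_PATHS:
            return False
        return self.errors[ip] / self.total[ip] > MAX_ERROR_RATIO
```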

Layer 7: Anomaly Detection — Behavioral Analysis

We built anomaly detection rules based on historical baseline behavior:

Anomaly                         | Response
10x normal per-IP request rate  | 12-hour IP block
5x normal traffic volume        | 4-hour User-Agent Mitigation (UAM)
2x normal bandwidth spike       | 4-hour UAM
Sudden traffic from new country | 4-hour UAM
Known crawler User-Agent        | 24-hour IP block

UAM (User-Agent Mitigation) meant: serve a JavaScript challenge before granting access. Bots fail; humans pass transparently.
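
The multiplier-based rows of the table above can be expressed as data, which keeps the rules auditable and easy to tune. The metric names here are illustrative; the multipliers and responses are taken from the table.

```python
# Each rule: (metric name, multiple of historical baseline, response).
# Multipliers and responses come from the anomaly table; metric names are assumed.
RULES = [
    ("per_ip_request_rate", 10, "12h IP block"),
    ("traffic_volume",       5, "4h UAM"),
    ("bandwidth",            2, "4h UAM"),
]

def evaluate(metric, current, baseline):
    """Return the configured response if `current` breaches its baseline multiple."""
    for name, multiple, response in RULES:
        if name == metric and current >= multiple * baseline:
            return response
    return None
```

Categorical rules from the table, such as traffic from a previously unseen country or a known crawler User-Agent, would sit alongside these as simple set-membership checks.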

Layer 8: WAF Protection — Input Validation

Beyond bot detection, we deployed Web Application Firewall (WAF) rules to validate incoming requests and block malicious input at the edge.
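
At its core, a WAF rule is a pattern matched against request input before the request is served. The two signatures below are deliberately crude illustrations; production rule sets such as the OWASP Core Rule Set are far more thorough.

```python
import re

# Two toy signatures, for illustration only -- real WAF rule sets are much larger.
RULES = [
    re.compile(r"(?i)\bunion\b.+\bselect\b"),  # crude SQL-injection signature
    re.compile(r"(?i)<script\b"),              # crude cross-site-scripting signature
]

def inspect(query_string):
    """Return True if the request input matches a rule and should be blocked."""
    return any(rule.search(query_string) for rule in RULES)
```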

Layer 9: VCL Logic — Advanced Request Handling

We implemented custom VCL (Varnish Configuration Language) logic to detect bot patterns:

# deny_request is a custom subroutine, defined elsewhere, that rejects the request

# Restrict access from Go and Python HTTP clients, common bot languages
if (req.http.User-Agent ~ "(?i)(go-http-client/|python-requests)") {
    call deny_request;
}

# Block NoName(057) bots, fingerprinted by their malformed Accept header
# (note the trailing comma, which no mainstream browser sends)
if (req.http.Accept == "text/html,application/xhtml+xml,application/xml,") {
    call deny_request;
}

Layer 10: Whitelist Management — Trusted Partners

We maintained whitelists so that trusted partners and their automated integrations were never challenged or blocked.
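
A whitelist check runs before any of the challenge or blocking layers. The network range and partner User-Agent below are invented placeholders, not the venue's real allowlist.

```python
import ipaddress

# Hypothetical allowlist: partner networks and integrations that must never be challenged.
TRUSTED_NETWORKS = [ipaddress.ip_network(n) for n in ("198.51.100.0/24",)]
TRUSTED_AGENTS = {"PartnerTicketSync/1.0"}

def is_trusted(ip, user_agent):
    """Return True if this client should bypass all bot-mitigation layers."""
    addr = ipaddress.ip_address(ip)
    if any(addr in net for net in TRUSTED_NETWORKS):
        return True
    return user_agent in TRUSTED_AGENTS
```

The important design point is ordering: trusted traffic short-circuits past every layer, so a partner's ticket-sync job is never mistaken for a bot.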

Results: From Daily Outages to Year-Long Stability

Lessons Learned

Bot mitigation isn't a single tool; it's a layered strategy in which each layer catches what the previous one missed.

Today, the venue operates ticket sales with confidence. Bots are blocked before they ever reach the origin. Legitimate fans can purchase tickets quickly and reliably. And the business is protected from both revenue loss and infrastructure collapse.

Need to strengthen your web security? Our technical team can help you design the perfect protection strategy for your use case.

Get started