What 60-80% of Your B2B Traffic Really Is (And How to Stop It)

Security · 18 Mar 2025

Your Platform Is Under Siege

You look at your analytics dashboard and see thousands of daily visitors. Traffic is up. The servers are busy. But your conversion rate is dropping, page load times are creeping up, and your support team is fielding complaints about slow browsing.

Here is the uncomfortable truth: on a typical B2B e-commerce platform, 60 to 80 percent of traffic is not human buyers. It is bots, scrapers, vulnerability scanners, and automated attack tools.

We know this because we measure it. Across the Intershop platforms we manage, we have detailed access logs that show exactly who is visiting and what they are doing. The pattern is consistent and alarming.

Real Numbers From a Production Platform

Here are actual metrics from a B2B platform with over 250 active shops, measured over a single month:

Metric                               Value
Users affected by slow browsing      7 out of 10
Inbound traffic that was malicious   97%
Attacks blocked in a single day      159,131

These are not theoretical risks. These are production numbers from a real platform serving real buyers.

What the Bots Are Doing

Scraping Your Catalogue

Competitor crawlers systematically harvest your product data — prices, stock levels, descriptions, images. We have recorded 49,210 scraping requests in a single day from a single bot network, with 21,598 failed requests that still consumed server resources. Your infrastructure pays the cost whether the scraper succeeds or not.

Probing for Vulnerabilities

Automated scanners from botnets cycle through known exploits, testing your platform for SQL injection, XSS, path traversal, and authentication bypasses. These are not targeted attacks — they are industrial-scale sweeps that hit every internet-facing service indiscriminately.
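
You can spot these sweeps in your own access logs by counting requests for paths no legitimate buyer ever asks for. A minimal sketch over a sample log (the probe signatures and log format are illustrative; point the grep at your real access log instead):

```shell
# Count vulnerability-probe requests in an access log.
# The sample log below stands in for a real one such as
# /var/log/nginx/access.log (format illustrative).
log=$(mktemp)
cat > "$log" <<'EOF'
203.0.113.7 - - [18/Mar/2025:10:01:02 +0000] "GET /product/42 HTTP/1.1" 200 5120
198.51.100.9 - - [18/Mar/2025:10:01:03 +0000] "GET /wp-login.php HTTP/1.1" 404 153
198.51.100.9 - - [18/Mar/2025:10:01:04 +0000] "GET /.env HTTP/1.1" 404 153
198.51.100.9 - - [18/Mar/2025:10:01:05 +0000] "GET /index.php?id=1%27%20OR%201=1 HTTP/1.1" 404 153
EOF
# Typical probe signatures: CMS login pages, dotfiles, SQL-injection payloads.
probes=$(grep -c -E 'wp-login|\.env|%27|UNION|\.\./' "$log")
echo "probe requests: $probes"   # → probe requests: 3
```

Note that three of the four sample requests are probes — roughly the ratio the production numbers above describe.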

Burning Your Infrastructure Budget

Every bot request consumes CPU, memory, bandwidth, and database connections. When bots account for the majority of your traffic, you are effectively paying to serve attackers. Your legitimate buyers experience the consequences: slow page loads, failed searches, timeouts during checkout.

Where the Attacks Come From

We analysed the geographic origin of bot traffic across our managed platforms. The pattern is clear:

Your B2B buyers are concentrated in Europe — typically 77% or more of legitimate traffic comes from the EU, with the majority from the platform’s home country. Meanwhile, the top sources of malicious traffic are countries with no business relationship to the platform:

Country       Action    Reason
China         Blocked   #1 source of scanning bots
Russia        Blocked   Brute force and exploit attempts
North Korea   Blocked   State-sponsored probing
Iran          Blocked   Automated vulnerability scanners

After blocking these sources, 14,000+ malicious IPs were removed from the traffic stream. Zero false positives on real buyers.

Why Cloud WAFs Fall Short

The standard response to bot traffic is a cloud-based Web Application Firewall (WAF). Services like Azure WAF or AWS WAF inspect HTTP traffic at Layer 7, applying heuristic rules to identify malicious requests.

The problem is that cloud WAFs operate at the application layer. Every request — legitimate or malicious — still reaches your infrastructure, gets decrypted, inspected, and processed before the WAF decides whether to block it. This adds latency to every request, generates false positives that block real buyers, and costs a premium per gateway.

For a B2B platform where the threat is primarily IP-based (known botnets, known scanner networks, entire countries with no legitimate buyers), Layer 7 inspection is overkill. You do not need to examine the content of a request from a known botnet — you need to drop it before it reaches your application.

Kernel-Level Blocking: A Different Approach

This is why we built Nullgate. Instead of inspecting HTTP requests at the application layer, Nullgate drops malicious traffic in the Linux kernel using iptables and ipset. Packets from blocked IPs are discarded before they ever reach your web server, your application, or your database.

The difference is fundamental:

                  Cloud WAF                          Nullgate
Layer             L7 (application)                   Kernel
Latency           Adds processing to every request   Zero — blocked packets never reach the app
False positives   Heuristic-based, common            Zero — IP-based, deterministic
Threat feeds      Manual rules                       Auto-updated from multiple intelligence sources
Cost              Per gateway, per month             Flat fee per server
Setup             Complex configuration              60 seconds

Nullgate maintains blocklists from multiple threat intelligence feeds, automatically refreshed on schedule. GeoIP blocking drops entire countries. ASN blocking removes known bulletproof hosting providers and botnet infrastructure. Manual bans take effect instantly at the kernel level.
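
Under the hood, this kind of kernel-level drop can be sketched with standard ipset and iptables commands. In the sketch below the set name and addresses are illustrative (TEST-NET ranges), and the commands are built as strings and echoed rather than executed, so it is safe to run without root; drop the echoes and run as root to apply them for real:

```shell
# Sketch: kernel-level IP blocking with ipset + iptables.
# Set name and addresses are illustrative, not Nullgate's actual config.
blocklist=nullgate-demo

create_cmd="ipset create $blocklist hash:net maxelem 200000"
add_cmd="ipset add $blocklist 203.0.113.0/24"   # example botnet range (TEST-NET-3)
# One iptables rule matches the entire set, so packets from any listed
# source are dropped before the web server ever sees them:
drop_cmd="iptables -I INPUT -m set --match-set $blocklist src -j DROP"

echo "$create_cmd"
echo "$add_cmd"
echo "$drop_cmd"
```

Because iptables matches the whole set with a single rule, the blocklist can grow to hundreds of thousands of entries without lengthening the rule chain or slowing packet processing.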

The result: 148,000+ malicious IPs blocked. CPU usage stabilised. Page load times returned to normal. Legitimate buyers experienced the fast, responsive platform they expect.

The Other Bot Problem: The Good Bots Cannot See Your Site

While you are blocking malicious bots, there is a second problem most B2B platforms overlook: the bots you actually want — Googlebot, Bingbot, ChatGPT, Gemini, Claude, Perplexity — cannot read your site either.

Modern e-commerce platforms built with JavaScript frameworks (React, Angular, Vue, Nuxt) look great to human visitors. But most AI crawlers do not execute JavaScript at all, and even search engines that can render it, such as Googlebot, do so in a delayed second pass. When a non-rendering bot visits your site, it sees an empty page — a blank HTML shell with no product data, no descriptions, no prices, no content.

This means your products are invisible, or indexed late and incompletely, in Google. Your pages do not appear in AI-generated recommendations. Competitors who serve crawlable static HTML rank above you, even when your products and content are superior.

The Solution: Serve the Right Content to the Right Visitor

The answer is prerendering — a dual-path routing strategy:

  • Human visitors receive the fast, dynamic JavaScript experience they expect
  • Search engines and AI bots receive fully-rendered static HTML that they can read, index, and recommend

When a request arrives, the server checks the user agent. If it is Googlebot, ChatGPT, or any other known crawler, it serves a pre-rendered HTML snapshot of the page — complete with all product data, descriptions, images, and metadata. If it is a human visitor, it serves the standard JavaScript application.
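
The user-agent check described above can be sketched as a simple routing function. The bot list here is a small illustrative subset; a production deployment would match many more crawler tokens:

```shell
# Sketch of the dual-path routing decision (bot list illustrative).
route_for_ua() {
  case "$1" in
    *Googlebot*|*bingbot*|*GPTBot*|*ClaudeBot*|*PerplexityBot*)
      echo prerendered-html ;;   # serve the static HTML snapshot
    *)
      echo js-application ;;     # serve the normal JavaScript app
  esac
}

route_for_ua "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
# → prerendered-html
route_for_ua "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0"
# → js-application
```

In practice this decision usually lives in the reverse proxy or CDN layer, so the application itself never needs to know which path was taken.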

The result: your site scores 100/100 on Lighthouse, your products appear in Google search results, and AI assistants can recommend your products to potential buyers. No compromise on performance or user experience.

AI Bots Are Already Visiting

Check your access logs. You will find crawlers from ChatGPT, Claude, Perplexity, and Gemini already visiting your site. If your platform is JavaScript-rendered, these AI bots are seeing nothing. Every visit is a missed opportunity to appear in AI-generated product recommendations — a channel that is growing rapidly and that your competitors are already optimising for.
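
A quick grep over a sample log shows what this looks like. The crawler tokens below (GPTBot, ClaudeBot, PerplexityBot) are the user-agent names these services publish; adapt the path to your real log:

```shell
# Count AI-crawler hits in an access log. The sample log stands in for a
# real one such as /var/log/nginx/access.log.
log=$(mktemp)
cat > "$log" <<'EOF'
20.15.240.1 - - [18/Mar/2025:09:00:01 +0000] "GET /catalog HTTP/1.1" 200 812 "-" "Mozilla/5.0 (compatible; GPTBot/1.0)"
52.70.10.2 - - [18/Mar/2025:09:00:05 +0000] "GET /p/widget HTTP/1.1" 200 790 "-" "Mozilla/5.0 (compatible; ClaudeBot/1.0)"
52.70.10.2 - - [18/Mar/2025:09:00:09 +0000] "GET /p/gadget HTTP/1.1" 200 802 "-" "Mozilla/5.0 (compatible; ClaudeBot/1.0)"
93.184.216.3 - - [18/Mar/2025:09:00:12 +0000] "GET /p/widget HTTP/1.1" 200 15320 "-" "Mozilla/5.0 Chrome/120.0"
EOF
ai_hits=$(grep -c -E 'GPTBot|ClaudeBot|PerplexityBot' "$log")
echo "AI crawler requests: $ai_hits"   # → AI crawler requests: 3
```

Note the response sizes in the sample: the AI crawlers received ~800 bytes (an empty shell) while the human browser received 15 KB of rendered content — exactly the gap prerendering closes.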

Two Sides of the Same Problem

Bot traffic management is not just about blocking attackers. It is about controlling who gets access to your platform and what they see:

  1. Block the bad bots — scrapers, scanners, and botnets that consume resources and threaten security. Nullgate handles this at the kernel level.

  2. Serve the good bots — search engines and AI crawlers that drive organic traffic and product discovery. Prerendering handles this at the application level.

Most platforms do neither. They serve JavaScript to every visitor, which means attackers keep consuming server resources unchecked while search engines still cannot index the content. The worst of both worlds.

What You Should Do

Start by measuring your bot traffic. Check your access logs for the past 30 days. Look at the geographic distribution. Look at the user agents. Look at the request patterns.
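
A minimal sketch of that first measurement, run here against a sample log (point it at your real access log instead; a handful of addresses generating most of the requests is the typical bot signature):

```shell
# Find the busiest client IPs in an access log.
log=$(mktemp)
cat > "$log" <<'EOF'
198.51.100.9 - - [18/Mar/2025:10:00:01 +0000] "GET /a HTTP/1.1" 200 100
198.51.100.9 - - [18/Mar/2025:10:00:02 +0000] "GET /b HTTP/1.1" 200 100
198.51.100.9 - - [18/Mar/2025:10:00:03 +0000] "GET /c HTTP/1.1" 404 100
203.0.113.7 - - [18/Mar/2025:10:00:04 +0000] "GET /product/42 HTTP/1.1" 200 5120
EOF
# Requests per source IP, busiest first; field 1 is the client address.
awk '{print $1}' "$log" | sort | uniq -c | sort -rn
top_ip=$(awk '{print $1}' "$log" | sort | uniq -c | sort -rn | head -1 | awk '{print $2}')
echo "busiest client: $top_ip"   # → busiest client: 198.51.100.9
```

The same pipeline with the user-agent field instead of field 1 gives the crawler breakdown, and feeding the IP list to a GeoIP lookup gives the geographic distribution.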

You will find two things: malicious bots consuming the majority of your resources, and legitimate crawlers leaving empty-handed because they cannot render your JavaScript.

If you want to solve both problems, get in touch. We will analyse your traffic, show you exactly what is hitting your servers, and demonstrate how Nullgate and prerendering work together to protect your platform and maximise your visibility.