🛡️ Advanced Bot Detection & Content Protection 🤖
Automatically detects bots and scrapers using sophisticated heuristics, then scrambles text content and obfuscates media to protect your website while maintaining full accessibility for human users.
Smart bot detection that protects your content from scrapers while allowing search engines to index your site properly:
- ✅ Google, Bing, Yahoo → See real content for indexing
- ✅ Facebook, Twitter, LinkedIn → See real content for social previews
- ✅ SEO tools (SEMrush, Ahrefs) → See real content for analysis
- ❌ Scrapers & headless browsers → See scrambled content
- ❌ AI training bots → See scrambled content
- ❌ Content harvesters → See obfuscated media
SEO protection is enabled by default - no configuration needed!
Install via npm for maximum flexibility and version control:
npm install @4884org/jumble
Quick setup via CDN for rapid prototyping:
<script type="module" src="https://unpkg.com/@4884org/jumble@latest/dist/jumble/jumble.esm.js"></script>
Add the loader script to `<head>`, `<jumble-head>` as the first element in `<body>`, and the critical CSS/class to `<html>`.
<!DOCTYPE html>
<html lang="en" class="jumble-init">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Protected Content</title>
<!-- Critical: Hide content during bot detection (100ms) -->
<style>
html.jumble-init * {
visibility: hidden !important;
}
</style>
<!-- Jumble Web Component Loader -->
<script type="module" src="https://unpkg.com/@4884org/jumble@latest/dist/jumble/jumble.esm.js"></script>
</head>
<body>
<!-- Place jumble-head as the first element in body -->
<jumble-head></jumble-head>
<main>
<h1>Your content will be protected automatically</h1>
<p>This text will be scrambled for bots but readable for humans.</p>
<img src="image.jpg" alt="This image will be hidden from bots">
<video src="video.mp4" controls></video>
</main>
</body>
</html>
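The `jumble-init` class and the critical CSS rule exist so that nothing is visible during the brief detection window. Conceptually, once the component has classified the visitor it reveals the page again. The sketch below is illustrative only (it is not the library's source; the function name and timing are assumptions), but it shows why the class is placed on `<html>`:

```ts
// Illustrative sketch only -- not the library's actual implementation.
// While <html> still carries the `jumble-init` class, the critical CSS rule
// `html.jumble-init * { visibility: hidden !important; }` keeps every element
// hidden. Once bot detection has finished (the ~100 ms window mentioned in the
// comment above), clearing the class reveals the page: real content for humans,
// scrambled content for bots.
function revealPage(): void {
  document.documentElement.classList.remove('jumble-init');
}
```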
This pattern works for React, Vue, Angular, Svelte, and static sites.
- Do not use `<jumble-head>` in any React/JSX/TSX/component file.
- Do not add custom element type declarations for React/TypeScript.
- Do not use `defineCustomElements()` in your app code unless you need advanced loader scenarios.
- Just add the loader script and `<jumble-head>` to your main HTML file as above.
Your app content works as usual. Jumble protects it automatically.
The `<jumble-head>` component accepts the following attributes for customization:
Attribute | Type | Default | Description |
---|---|---|---|
allow-seo | boolean | true | Controls SEO protection mode. When true, legitimate search engines see real content. When false, ALL bots see scrambled content. |
custom-allowlist | string[] | [] | JSON array of additional bot patterns to allow (added to the default search engines). Case-insensitive matching. |
custom-denylist | string[] | [] | JSON array of additional bot patterns to block (added to the default malicious patterns). Case-insensitive matching. |
<!-- ✅ Recommended: SEO-safe (default behavior) -->
<jumble-head></jumble-head>
<!-- 🔒 Maximum protection: Block ALL bots (including search engines) -->
<jumble-head allow-seo="false"></jumble-head>
<!-- 🎯 Custom allowlist: Allow your partner's bot -->
<jumble-head custom-allowlist='["mypartnerbot", "trustedapi"]'></jumble-head>
<!-- 🚫 Custom denylist: Block specific scrapers -->
<jumble-head custom-denylist='["badbot", "evilscraper", "harvester"]'></jumble-head>
<!-- ⚡ Full configuration: Combined settings -->
<jumble-head
allow-seo="true"
custom-allowlist='["partnerbot", "apiservice"]'
custom-denylist='["maliciousbot", "scraper123"]'>
</jumble-head>
`allow-seo`
- Default: `true` (SEO-safe mode)
- When `true`: legitimate search engines (Google, Bing, Facebook, etc.) see real content for indexing and social previews
- When `false`: ALL bots see scrambled content (maximum protection, but breaks SEO)
<!-- SEO-friendly (recommended for public websites) -->
<jumble-head allow-seo="true"></jumble-head>
<!-- Maximum security (recommended for private/internal content) -->
<jumble-head allow-seo="false"></jumble-head>
`custom-allowlist`
- Default: `[]` (empty array)
- Purpose: Add specific bot patterns that should be allowed to see real content
- Format: JSON array of strings (case-insensitive partial matching)
- Combined with: Default legitimate search engines (Google, Bing, Facebook, etc.)
<!-- Allow your monitoring service -->
<jumble-head custom-allowlist='["uptimerobot"]'></jumble-head>
<!-- Allow multiple partner services -->
<jumble-head custom-allowlist='["partnerapi", "trustedbot", "monitoring-service"]'></jumble-head>
`custom-denylist`
- Default: `[]` (empty array)
- Purpose: Add specific bot patterns that should be blocked and see scrambled content
- Format: JSON array of strings (case-insensitive partial matching)
- Combined with: Default malicious patterns (headless browsers, scrapers, AI training bots, etc.)
<!-- Block specific scraper -->
<jumble-head custom-denylist='["badbot"]'></jumble-head>
<!-- Block multiple malicious patterns -->
<jumble-head custom-denylist='["scraper", "harvester", "malicious-bot", "content-stealer"]'></jumble-head>
<!-- E-commerce site: Allow price monitoring, block scrapers -->
<jumble-head
allow-seo="true"
custom-allowlist='["shopify-monitor", "price-tracker"]'
custom-denylist='["competitor-scraper", "price-spy"]'>
</jumble-head>
<!-- News site: Allow social media, block AI training -->
<jumble-head
allow-seo="true"
custom-denylist='["content-scraper", "article-bot"]'>
</jumble-head>
<!-- Private dashboard: Maximum protection -->
<jumble-head
allow-seo="false"
custom-allowlist='["internal-monitor"]'>
</jumble-head>
<!-- Public blog: Default SEO-safe protection -->
<jumble-head></jumble-head>
Legitimate bots allowed by default:
- Search Engines: Google, Bing, Yahoo, DuckDuckGo, Yandex, Baidu
- Social Media: Facebook, Twitter, LinkedIn, WhatsApp, Telegram, Discord
- SEO Tools: SEMrush, Ahrefs, Majestic, Moz, Screaming Frog
- Monitoring: UptimeRobot, Pingdom, GTmetrix
Suspicious patterns blocked by default:
- Automation Tools: Selenium, Puppeteer, Playwright, PhantomJS
- Generic Scrapers: headless, scraper, crawler, harvester, extractor
- AI Training Bots: GPTBot, ClaudeBot, CCBot, meta-externalagent
- And 20+ more malicious patterns...
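As a rough illustration of how this kind of allowlist/denylist matching works, the sketch below shows case-insensitive substring matching against the user-agent string. It is not the library's implementation; the pattern lists are abbreviated examples, and custom-allowlist / custom-denylist entries would simply be appended before matching.

```ts
// Illustrative sketch -- not the library's actual code. Pattern lists are abbreviated.
const DEFAULT_ALLOWLIST = ['googlebot', 'bingbot', 'facebookexternalhit', 'linkedinbot'];
const DEFAULT_DENYLIST = ['headless', 'scraper', 'crawler', 'gptbot', 'ccbot', 'puppeteer'];

type Verdict = 'allowed-bot' | 'denied-bot' | 'human';

// Case-insensitive partial matching against the user-agent string.
function classify(
  userAgent: string,
  allowSeo = true,
  customAllow: string[] = [],
  customDeny: string[] = []
): Verdict {
  const ua = userAgent.toLowerCase();
  const allowed = [...DEFAULT_ALLOWLIST, ...customAllow].some((p) => ua.includes(p.toLowerCase()));
  const denied = [...DEFAULT_DENYLIST, ...customDeny].some((p) => ua.includes(p.toLowerCase()));

  if (allowed && allowSeo) return 'allowed-bot';             // sees real content (SEO, social previews)
  if (denied || (allowed && !allowSeo)) return 'denied-bot'; // sees scrambled content
  return 'human';                                            // regular visitors always see real content
}

// Example: an allowed search engine vs. a headless browser.
classify('Mozilla/5.0 (compatible; Googlebot/2.1)');              // 'allowed-bot'
classify('Mozilla/5.0 (X11) HeadlessChrome/120.0 Safari/537.36'); // 'denied-bot'
```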
Jumble intelligently handles different writing systems:
- Latin Characters (a-z, A-Z): Scrambled to random lowercase letters
- Arabic Script (0x0600-0x06FF): Scrambled to random Arabic characters
- CJK Ideographs (0x4E00-0x9FFF): Scrambled to random Chinese characters
- Numbers (0-9): Left unchanged for functionality
- Punctuation & Symbols: Preserved to maintain readability structure
- Emojis: Remain unchanged
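A minimal sketch of scrambling along these lines is shown below. It is illustrative only, not the library's implementation; in particular, the Arabic letter subrange used for replacements is an assumption.

```ts
// Illustrative sketch of per-script scrambling -- not the library's actual code.
function scramble(text: string): string {
  return Array.from(text) // iterate by code point so emoji survive intact
    .map((ch) => {
      const code = ch.codePointAt(0)!;
      if (/[a-zA-Z]/.test(ch)) {
        // Latin letters -> random lowercase letter
        return String.fromCharCode(0x61 + Math.floor(Math.random() * 26));
      }
      if (code >= 0x0600 && code <= 0x06ff) {
        // Arabic block -> random Arabic letter (0x0621-0x064A subrange, an assumption)
        return String.fromCharCode(0x0621 + Math.floor(Math.random() * (0x064a - 0x0621 + 1)));
      }
      if (code >= 0x4e00 && code <= 0x9fff) {
        // CJK Unified Ideographs -> random ideograph
        return String.fromCharCode(0x4e00 + Math.floor(Math.random() * (0x9fff - 0x4e00 + 1)));
      }
      // Digits, punctuation, whitespace, emoji, etc. are left unchanged
      return ch;
    })
    .join('');
}

// Example: letters are randomized, while the number and punctuation keep their places.
scramble('Order #42 ships today!'); // e.g. 'kqjme #42 ykzqx fwdge!'
```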
npm run build
npm test
npm run test:e2e
npm run test:coverage
npm start
- Live Demo - Interactive demonstration
- NPM Package - Package details
- GitHub Repository - Source code
MIT License - see LICENSE file for details.