Kerouac middleware that gives instructions to web crawlers using the Robots Exclusion Protocol.
```sh
$ npm install kerouac-robotstxt
```
Declare a `robots.txt` route, using this middleware.
```js
var robots = require('kerouac-robotstxt');

site.page('/robots.txt', robots());
```
And map a `robots.txt` file when generating the site.
```js
site.generate([
  robots.createMapper()
]);
```
The generated output will include a `/robots.txt` file. If your site contains any sitemaps, which can be generated using `kerouac-sitemap`, the locations of those sitemaps will be included so that search engines can automatically discover all pages of your site.
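As a rough sketch of the result, assuming the site is hosted at the hypothetical URL `http://www.example.com/` and contains a single sitemap at `/sitemap.xml` generated by `kerouac-sitemap`, the generated `robots.txt` might look something like:

```
User-Agent: *
Allow: /

Sitemap: http://www.example.com/sitemap.xml
```

The `Sitemap` directive is what lets crawlers discover the sitemap, and through it every page of the site, without any further configuration.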
Copyright (c) 2012-2022 Jared Hanson <https://www.jaredhanson.me/>