# metalsmith-robots
A metalsmith plugin for generating a robots.txt file
This plugin allows you to generate a robots.txt file. It accepts global options, and can be triggered from a file's frontmatter with the `public` and `private` keywords. It works well with metalsmith-mapsite, which also accepts setting a page to private from the frontmatter.
For support questions please use Stack Overflow or the Metalsmith Slack channel.
## Installation

```
$ npm install metalsmith-robots
```
## Example

Configuration in `metalsmith.json`:
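A configuration along these lines is an illustrative sketch; the option values are chosen to match the generated output shown below:

```json
{
  "plugins": {
    "metalsmith-robots": {
      "useragent": "googlebot",
      "allow": ["index.html", "about.html"],
      "disallow": ["404.html"],
      "sitemap": "https://www.site.com/sitemap.xml"
    }
  }
}
```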
Which will generate the following robots.txt:

```
User-agent: googlebot
Allow: index.html
Allow: about.html
Disallow: 404.html
Sitemap: https://www.site.com/sitemap.xml
```
## Options
You can pass options to `metalsmith-robots` with the JavaScript API or CLI (a JavaScript sketch follows the list below). The options are:
- `useragent`: the useragent - `String`, default: `*`
- `allow`: an array of the url(s) to allow - `Array of Strings`
- `disallow`: an array of the url(s) to disallow - `Array of Strings`
- `sitemap`: the sitemap url - `String`
- `urlMangle`: mangle paths in `allow` and `disallow` - `Function`
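As a minimal sketch of the JavaScript API (the build directories and option values here are assumptions for illustration, not part of the plugin):

```javascript
const Metalsmith = require('metalsmith');
const robots = require('metalsmith-robots');

Metalsmith(__dirname)
  .source('./src')          // assumed source directory
  .destination('./build')   // assumed output directory
  .use(robots({
    useragent: 'googlebot',
    allow: ['index.html', 'about.html'],
    disallow: ['404.html'],
    sitemap: 'https://www.site.com/sitemap.xml'
  }))
  .build((err) => {
    if (err) throw err;
  });
```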
Besides these options, setting `public: true` or `private: true` in a file's frontmatter will add that page to the `allow` or `disallow` option respectively. `metalsmith-robots` expects at least one of `allow`, `disallow`, or `sitemap`; without them it will not generate a robots.txt.
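For example, a page with frontmatter like this (a hypothetical `about.md`) would be added to `allow`:

```yaml
---
title: About
public: true
---
```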
### urlMangle

To make sure paths start with a `/`, you can mangle urls that are provided via `allow` and `disallow`.
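A sketch of one such function, assuming you simply want to prefix a missing leading slash (the parameter name and sitemap value are illustrative):

```javascript
const robots = require('metalsmith-robots');

// Prefix a leading slash on any allow/disallow path that lacks one
const plugin = robots({
  sitemap: 'https://www.site.com/sitemap.xml',
  urlMangle: (path) => (path.startsWith('/') ? path : `/${path}`)
});
```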
## License

MIT