A Metalsmith plugin for generating a robots.txt file
This plugin generates a robots.txt file. It accepts global options and can be triggered from a file's frontmatter with the `public` and `private` keywords. It works well with `metalsmith-mapsite`, which also accepts setting a page to private from the frontmatter.
For support questions, please use Stack Overflow or the Metalsmith Slack channel.
```bash
$ npm install metalsmith-robots
```
Configuration in `metalsmith.json`:
```json
{
  "plugins": {
    "metalsmith-robots": {
      "useragent": "googlebot",
      "allow": ["index.html", "about.html"],
      "disallow": ["404.html"],
      "sitemap": "https://www.site.com/sitemap.xml"
    }
  }
}
```
This will generate the following robots.txt:
```
User-agent: googlebot
Allow: index.html
Allow: about.html
Disallow: 404.html
Sitemap: https://www.site.com/sitemap.xml
```
You can pass options to `metalsmith-robots` with the JavaScript API or CLI. The options are:

- `useragent`: the user agent - `String`, default: `*`
- `allow`: an array of the url(s) to allow - `Array of Strings`
- `disallow`: an array of the url(s) to disallow - `Array of Strings`
- `sitemap`: the sitemap url - `String`
- `urlMangle`: mangle paths in `allow` and `disallow` - `Function`
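With the JavaScript API, the configuration shown above might look like this (a minimal sketch, assuming a standard Metalsmith build with the default `src`/`build` directories):

```js
const Metalsmith = require('metalsmith');
const robots = require('metalsmith-robots');

Metalsmith(__dirname)
  .use(robots({
    useragent: 'googlebot',
    allow: ['index.html', 'about.html'],
    disallow: ['404.html'],
    sitemap: 'https://www.site.com/sitemap.xml'
  }))
  // build the site; robots.txt is written alongside the other output files
  .build((err) => {
    if (err) throw err;
  });
```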
Besides these options, setting `public: true` or `private: true` in a file's frontmatter will add that page to the `allow` or `disallow` option, respectively. `metalsmith-robots` expects at least one of `allow`, `disallow`, or `sitemap`; without them it will not generate a robots.txt.
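For example, a page marked private in its frontmatter ends up in the `disallow` list (a sketch; the file's title and contents are made up):

```markdown
---
title: Drafts
private: true
---

This page will be added to the disallow option of metalsmith-robots.
```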
To make sure paths start with a `/`, you can mangle the urls provided via `allow` and `disallow`:
```js
.use(robots({
  // prepend a leading slash when a path does not already start with one
  urlMangle: (filepath) => {
    return (filepath.slice(0, 1) !== '/') ? `/${filepath}` : filepath;
  }
}))
```
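With the example options above, this mangle would yield `Allow: /index.html`, `Allow: /about.html`, and `Disallow: /404.html`, while the `Sitemap` line is left untouched, since `urlMangle` only applies to `allow` and `disallow`.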
License: MIT