Robots.txt Plugin
Introduction
Functionality
The Robots.txt plugin adds a special document type to Bloomreach Experience Manager, allowing webmasters to manage the contents of the robots.txt file retrieved by web crawlers. See Robots.txt Specifications for more information on the format and purpose of that file.
The plugin provides Beans and Components for retrieving the robots.txt-related data from the content repository, and a sample Freemarker template for rendering that data as a robots.txt file.
Below is a screenshot of the CMS document editor, where the content of the robots.txt file(s) can be entered or modified.

The screenshot above results in the following robots.txt in the site:
User-agent: *
Disallow: /skip/this/url
Disallow: /a/b

User-agent: googlebot
Disallow: /a/b
Disallow: /x/y

Sitemap: http://www.example.com/sitemap.xml
Sitemap: http://subsite.example.com/subsite-map.xml
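The effect of the rules above on crawlers can be verified with Python's standard-library urllib.robotparser, which interprets robots.txt the same way a well-behaved crawler would. This is only an illustration of the file format, not part of the plugin; the user-agent name "somebot" is a made-up example of a crawler without a dedicated group.

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content produced in the example above.
robots_txt = """\
User-agent: *
Disallow: /skip/this/url
Disallow: /a/b

User-agent: googlebot
Disallow: /a/b
Disallow: /x/y

Sitemap: http://www.example.com/sitemap.xml
Sitemap: http://subsite.example.com/subsite-map.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A crawler without its own group falls back to the wildcard group.
print(parser.can_fetch("somebot", "/skip/this/url"))   # False
print(parser.can_fetch("somebot", "/x/y"))             # True

# googlebot matches its dedicated group; the wildcard group no longer applies.
print(parser.can_fetch("googlebot", "/x/y"))           # False
print(parser.can_fetch("googlebot", "/skip/this/url")) # True
```

Note that per the robots.txt convention, a crawler obeys only the most specific matching group: googlebot may fetch /skip/this/url because that rule lives in the wildcard group, which it ignores.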
By default, the Robots.txt plugin disallows all URLs of a preview site, so that if such a site is exposed publicly, search engines will not index it.
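As an illustration (this is the conventional way to block all crawling, not necessarily the plugin's literal output), a robots.txt that disallows every URL for every crawler looks like this:

User-agent: *
Disallow: /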
Source Code
https://github.com/bloomreach/brxm/tree/brxm-14.7.3/robotstxt