
Cultiv DynamicRobots

The problem: You've built a multisite solution (either for multiple languages or simply for multiple sites in one Umbraco installation). You want to offer an XML sitemap per site, but a single static robots.txt can't point each site to its own sitemap URL.

The solution: DynamicRobots! This HttpHandler looks for the string {HTTP_HOST} in your robots.txt file and replaces it with the hostname of the current site, so that spiders get a complete and valid URL for your sitemap.

Example: 
Sitemap: http://{HTTP_HOST}/sitemap
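
To illustrate the idea, a minimal HTTP handler along these lines could read the physical robots.txt, swap in the hostname of the current request, and return the result. This is only a sketch; the class and member names below are assumptions, not the package's actual implementation:

using System.IO;
using System.Web;

// Illustrative handler: serves robots.txt dynamically and replaces
// the {HTTP_HOST} placeholder with the host of the current request.
public class DynamicRobotsHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Read the physical robots.txt from the site root.
        string path = context.Server.MapPath("~/robots.txt");
        string content = File.ReadAllText(path);

        // Replace the placeholder with the hostname of the current request.
        content = content.Replace("{HTTP_HOST}", context.Request.Url.Host);

        context.Response.ContentType = "text/plain";
        context.Response.Write(content);
    }
}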

Note: if .txt files are cached in IIS, the placeholder won't always be replaced with the correct hostname, so make sure to disable caching for those files.
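For reference, a handler like this is wired up in web.config. The sketch below (the handler name and type are assumptions, not the package's actual ones) maps robots.txt requests to the handler and disables IIS output caching for .txt responses:

<system.webServer>
  <handlers>
    <!-- Route requests for robots.txt through the dynamic handler
         (type name here is illustrative) -->
    <add name="DynamicRobots" path="robots.txt" verb="GET"
         type="Cultiv.DynamicRobots.DynamicRobotsHandler, Cultiv.DynamicRobots" />
  </handlers>
  <caching>
    <profiles>
      <!-- Prevent IIS from caching .txt responses, otherwise the
           replaced hostname can go stale -->
      <add extension=".txt" policy="DisableCache" kernelCachePolicy="DisableCache" />
    </profiles>
  </caching>
</system.webServer>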

You can still edit your robots.txt file with the excellent robots.txt editor package by Lee Kelleher: http://our.umbraco.org/projects/developer-tools/robotstxt-editor

---

Also check out the Cultiv SearchEngineSitemap package, which supports multisite solutions out of the box: http://our.umbraco.org/projects/website-utilities/cultiv-search-engine-sitemap



Package Compatibility

This package is compatible with the following versions, as reported by community members who have downloaded it:
  • Umbraco Cloud: untested or doesn't work
  • Version 8.18.x: untested


Previously reported to work on versions: 7.3.x, 7.0.x

Package Information

  • Package owner: Sebastiaan Janssen
  • Created: 20/07/2010
  • Current version: 1.0.0
  • .NET version: 4.5.2
  • License: MIT
  • Downloads on Our: 3.7K