Robots Generator

Robots are used by search engines to index the web.

The Robots Generator helps you create a robots.txt file to exclude certain robots or disallow access to specific folders of your website. This file must be copied to the root directory of your domain. You will find the Robots Generator in the Tools section.

How the robots.txt works:

With User-agent: Name of the Robot you specify which search engine robot the following rules apply to. By adding Disallow entries you can exclude folders from being indexed, for example because they are not yet finished. In addition, you can add comments for yourself; these start with a # and are ignored by robots.

To address all robots, use the * character.
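Put together, a robots.txt file might look like the following sketch (the agent name and folder paths are placeholders, not values from your site):

```
# Keep all robots out of the unfinished drafts folder
User-agent: *
Disallow: /drafts/

# Additionally keep ExampleBot out of the images folder
User-agent: ExampleBot
Disallow: /images/
```

Each User-agent line starts a new block, and the Disallow lines beneath it apply only to that robot.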

To create a completely new file, click the corresponding button in the toolbar; otherwise, click Open File.

Click New to add Robots Exclusions and the path to your sitemap. By clicking Edit you can make changes to the existing entries.
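The path to your sitemap is written as its own line in the robots.txt file; the URL below is a placeholder for your own domain:

```
Sitemap: https://www.example.com/sitemap.xml
```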

Tip: Use the Site Scanner to read the structure of your site and select directories to be blocked for selected agents.

Below the list of User Agents and Disallow paths you can see the source text of your robots.txt file. Click Refresh to apply changes made in the upper part to that text, or vice versa.

When finished you can Save your file or Export its content to a text file.