1. Select a user agent (robot) in the left column (the default is “ALL
User Agents”), then click the “ADD USER AGENT” button.
2. Type in the directories and/or web pages that you do not want indexed, then
click the “ADD” button. Make sure there is a forward slash (/)
between directories and pages, and at the end of each directory (indicated in red below). Repeat this step until you have added
all of the files you want under that “User Agent”. If you do not want a user
agent to index ANY of your site, LEAVE THE TEXT FIELD BLANK and click the “ADD”
button; a “/” will appear after “Disallow:” in the code.
3. Once step 2 is complete, repeat steps 1 and 2 until you have added all of
the “User Agents” you want, along with the files you want to “Disallow”
under each one.
4. When you are finished, click “COPY CODE” to copy the generated robots.txt code so you can save it to a file.
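As a rough illustration, a robots.txt file generated by the steps above might look like the following (the user agent and paths shown here are hypothetical examples, not output from the tool):

```
# Rules for one specific robot (selected in step 1)
User-agent: Googlebot
# Directories end with a trailing slash (step 2)
Disallow: /private/
Disallow: /drafts/old-page.html

# Rules for all other robots ("ALL User Agents")
User-agent: *
# Text field left blank in step 2, so the whole site is disallowed
Disallow: /
```

For crawlers to find the file, it is normally saved as robots.txt in the root of the site (for example, https://example.com/robots.txt).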