RapidBot is an incredibly easy tool designed to create a robots.txt file for your site directly from RapidWeaver.
The robots.txt file is retrieved by search engines, such as Google and Bing, and tells them which pages should be indexed and which must be ignored.
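For example, a minimal robots.txt (the kind of file RapidBot generates for you) might look like this; the /private/ path is just an illustration:

```
User-agent: *
Disallow: /private/
```

Here "User-agent: *" addresses all bots, and "Disallow" lists the paths they should skip.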
A robot (also known as a spider or web crawler) is a program that automatically traverses the web's hypertext structure by retrieving a document and then recursively retrieving all documents it references. Don't forget that instructing search engine bots helps add visibility to your site and lets people reach you more efficiently by excluding irrelevant content.
Only when search engines know what to do with your pages can they give you a good ranking. RapidBot is a must-have SEO tool!