A robots.txt file lets site owners tell search engines which parts of a site they should not crawl, and almost all search engines respect this protocol. Although you can write a robots.txt file by hand, “MN Robots.txt Generator” makes the job much easier.
MN Robots.txt Generator is a free web service that generates a robots.txt file for you. You start by setting a default option (either “Allow” or “Refuse”). Next, you type in your sitemap URL.
Below that you will find a list of robots (search engine crawlers) such as Google, Yahoo, and MSN Search, amongst others. For each one you can keep your default option or specifically “Allow” or “Refuse” its access. The list also includes some special search bots, such as Google Image and Yahoo Blogs, which you handle the same way.
Finally, enter the paths search engines should not crawl and click the “Create Robots.txt” button. Copy the generated text into a robots.txt file at the root of your site.
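To see what such a file does in practice, here is a hedged sketch: a robots.txt of the general shape this kind of generator produces (the domain, paths, and bot names below are illustrative, not output from the tool itself), checked with Python's standard-library `urllib.robotparser`.

```python
from urllib import robotparser

# Illustrative robots.txt: default "Allow" for all robots, one path
# refused for everyone, one robot refused entirely, plus a sitemap URL.
robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: BadBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Most crawlers may fetch the homepage, but not anything under /private/.
print(parser.can_fetch("*", "https://example.com/"))           # True
print(parser.can_fetch("*", "https://example.com/private/x"))  # False
# BadBot is refused everywhere.
print(parser.can_fetch("BadBot", "https://example.com/"))      # False
```

Well-behaved crawlers perform essentially this check before fetching a URL, which is why the protocol works despite being purely advisory.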
- Easily generate a robots.txt file for your site.
- Lets site owners restrict search engines' access to their site.
- You can specify exactly which site areas search engines should not crawl.
- A number of specific search robots and special bots are shown.
- Quickly generates text you can use as your robots.txt file.
- Similar tools: Robots-txt-Checker, RapidsSiteMap and Copyc.at.
- Also read related articles: 18 WordPress Security Plugins & Tips To Secure Your Blog, How To Submit Your Website To Bing & Google For More Traffic, and Two Useful Google Chrome Extensions for SEO Guys.
Check out MN Robots.txt Generator.