Controlling Website Crawling Using Search Engine Visibility
The Control Crawling option in Search Engine Visibility lets you control which pages of your site crawlers can access, and generates the required robots.txt file or meta tags for you.
To Control Crawling for Your Website
From the Optimize tab, select Control Crawling to get started.
Control Crawling presents one tab for each of the options below; each tab generates the file or tags its option requires. On each tab, you can do one of the following (sample robots.txt output for each option appears after this list):
- Allow All — Lets crawlers access all of your site's pages.
- Block specific web pages and search engines — Lets you select specific pages of your site to block, and then choose from a list of search engines that should not crawl those pages.
- Block All — Blocks all of your site's pages from crawlers. NOTE: This is not recommended.
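To make the options concrete, here is a rough sketch of the robots.txt content each choice might generate. These are illustrative only: the paths and crawler name below are placeholder examples rather than the tool's actual output, and each option produces its own separate file; they are shown together here only for comparison.

```
# Option 1 (Allow All): every crawler may visit every page
User-agent: *
Disallow:

# Option 2 (Block specific pages and search engines): for example,
# keep one crawler (here, Googlebot) away from two placeholder paths
User-agent: Googlebot
Disallow: /private.html
Disallow: /drafts/

# Option 3 (Block All): block every crawler from every page
# (not recommended)
User-agent: *
Disallow: /
```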
Once you have decided how you want to control crawling for your site, click Create robots.txt or Create Meta Tag on the tab you chose to generate the appropriate file or tags. For more information on robots.txt and meta tags, see What's the difference between robots.txt and Meta Tags?
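For comparison, a robots meta tag applies the same kind of rule to a single page rather than to the whole site. A generated tag that blocks crawlers from one page might look roughly like the following; it belongs in that page's <head> section:

```html
<!-- Tells all crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Or target a single crawler, e.g. Googlebot only -->
<meta name="googlebot" content="noindex">
```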