How to decrease the crawl rate of Bing’s search engine bot

Bingbot (formerly known as MSNBot) crawls your website to index its content so that your pages can appear in Bing search results.

Bing supports the directives of the Robots Exclusion Protocol (REP) which can be listed in a site’s robots.txt file.

The crawl rate of this bot can be controlled through the robots.txt file. To decrease the crawl rate, add the following directives to the file:

User-agent: bingbot
Crawl-delay: 1

The following two crawlers are also used for some crawling duties: MSNBot (general resources) and MSNBot-Media (images and video).

The corresponding user agents are:

msnbot
msnbot-media
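If you want to throttle these crawlers as well, the same Crawl-delay directive applies per user agent. A minimal sketch (the 5-second value here is chosen purely for illustration):

```
User-agent: msnbot
Crawl-delay: 5

User-agent: msnbot-media
Crawl-delay: 5
```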

Bing recommends using the lowest crawl-delay value possible. Here are the values you can set and how they affect the crawl rate:

  • No crawl delay set – Normal
  • 1 – Slow
  • 5 – Very Slow
  • 10 – Extremely Slow
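Before deploying a change, you can sanity-check that your robots.txt parses the way you expect using Python's standard-library urllib.robotparser. This is a local check only; the robots.txt content below is a sample matching the directives shown above:

```python
from urllib.robotparser import RobotFileParser

# Sample robots.txt content, mirroring the directives shown above
robots_txt = """\
User-agent: bingbot
Crawl-delay: 1
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Report the delay that Bingbot should honour
print(rp.crawl_delay("bingbot"))  # 1
```

If crawl_delay() returns None, the directive was not parsed for that user agent, which usually points to a typo in the User-agent or Crawl-delay line.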

An alternative way to manage Bing's crawl rate for your website is to register your site with Bing Webmaster Tools and adjust the crawl settings there.

Don’t forget – two SEO essentials for any website are its loading speed and its server’s geographical location. That is why it is important to choose the right host. Check out SiteGround’s web hosting services.