Description
Every day, bots, also known as robots or spiders, visit your website. Search engines like Google, Yahoo, and Bing send these bots to your site so your content can be crawled, indexed, and shown in search results. These bots are generally beneficial, but there are cases where you don't want a bot roaming your website, crawling and indexing everything. That's where the robots.txt file comes in.
Adding directives to a robots.txt file tells bots to crawl only the pages you want crawled. However, it's important to understand that not every bot will follow the rules you write in your robots.txt file. Google, for instance, ignores the Crawl-delay directive, so you cannot control Googlebot's crawling frequency from this file.
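As an illustration, here is a minimal robots.txt fragment that asks crawlers to slow down. Bing and some other crawlers honor this directive, but Googlebot ignores it:

```
# Ask all crawlers to wait 10 seconds between requests.
# Note: Googlebot does not support Crawl-delay and will ignore this line.
User-agent: *
Crawl-delay: 10
```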
Do you need a robots.txt file?
No, a robots.txt file is not required for a website. If a bot visits your website and doesn't find one, it will simply crawl and index your pages as it normally would. However, there are benefits to having one, including:
- Helping manage server overload
- Preventing crawl waste by keeping bots away from pages you don't want them to visit
- Keeping certain folders or subdomains out of search results
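Putting those benefits together, a minimal robots.txt might look like the sketch below. The domain and folder names are placeholders, not a recommendation for your site:

```
# Allow all crawlers, but keep them out of one folder.
User-agent: *
Disallow: /private/

# Point crawlers to the sitemap (replace with your own URL).
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of your domain (e.g. `https://www.example.com/robots.txt`); crawlers will not look for it anywhere else.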