When should you use a robots.txt file on your website?
- When you have multiple versions of a page to indicate the preferred version
- When your website receives a penalty from Google
- When you have pages that you don’t want search engines to crawl and index
- Whenever you feel like it
The correct answer is: When you have pages that you don’t want search engines to crawl and index
A robots.txt file contains directives that manage and control the behavior of search engine crawlers. It is essential when there are specific pages or sections of your site that you prefer to keep out of search results. By placing a robots.txt file in the root directory of your site, you can indicate which areas crawlers are allowed to access and which are off-limits, helping ensure that only the content you want to be found is crawled and indexed.
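For illustration, a minimal robots.txt might look like the sketch below. The domain and directory names are placeholder assumptions, not paths from the lesson.

```txt
# Hypothetical robots.txt served from the site root
# (e.g. https://www.example.com/robots.txt); paths are placeholders.

User-agent: *              # these rules apply to all crawlers
Disallow: /admin/          # do not crawl the admin area
Disallow: /internal-drafts/  # do not crawl unpublished drafts

Sitemap: https://www.example.com/sitemap.xml
```

Here, `User-agent: *` applies the rules to every crawler, and each `Disallow` line tells crawlers not to request URLs under that path; anything not disallowed remains crawlable by default.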
Source: HubSpot Lesson: On Page and Technical SEO