A file called robots.txt may be added to the root folder of your website to control how search engines crawl it. Search engines like Google use website crawlers, also called robots, to examine the content on your site. You might not want some areas of your website, such as the admin page, to be crawled and shown in user search results. By listing those pages in the file, you ask crawlers to skip them (note that well-behaved crawlers honor these rules, but robots.txt does not guarantee a page stays out of search results). robots.txt files follow the Robots Exclusion Protocol. You can quickly create the file using this website by entering the pages you want to exclude.
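For example, a minimal robots.txt might look like the following (the /admin/ and /private/ paths are just illustrations; substitute the pages you want to exclude):

```
User-agent: *
Disallow: /admin/
Disallow: /private/
```

The `User-agent: *` line applies the rules to all crawlers, and each `Disallow` line names a path that crawlers are asked not to visit.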