A robots.txt file contains directives for search engine crawlers, telling them which URLs they may not access. Most major search engines (including Google, Bing, and Yahoo) recognize and honor robots.txt rules. Its main purpose is to prevent crawlers from overloading your site with requests; it is not a mechanism for keeping a web page out of Google's index.
A robots.txt file is automatically generated for your store and lives at the root of your site. You can view it by appending /robots.txt to your store's URL (for example, www.yourstore.com/robots.txt). It is a plain text file that follows the Robots Exclusion Standard.
The automatically generated robots.txt file looks like this:
User-agent: *
Disallow: /admin
Disallow: /checkout
Disallow: /order
Disallow: /user
Disallow: /account
Disallow: /collections/*+*
Disallow: /cart
Sitemap: https://storehippo.com/sitemap.xml

User-agent: AdsBot-Google
Disallow: /admin
Disallow: /checkout
Disallow: /cart

User-agent: Nutch
Disallow: /
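If you want to verify how crawlers will interpret these rules, a quick sketch is possible with Python's standard-library robots.txt parser. Note this is an illustration, not part of the platform: the sample bot name "MyBot" is hypothetical, and the stdlib parser only understands plain path prefixes (it does not evaluate wildcard patterns such as /collections/*+*), so the sketch uses only the prefix rules from the file above.

```python
from urllib import robotparser

# A subset of the auto-generated rules shown above (prefix rules only,
# since urllib.robotparser does not evaluate * wildcards in paths).
rules = """\
User-agent: *
Disallow: /admin
Disallow: /checkout
Disallow: /cart

User-agent: Nutch
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# "MyBot" is a hypothetical crawler name; it falls under the "*" group.
print(rp.can_fetch("MyBot", "https://www.yourstore.com/admin"))     # blocked
print(rp.can_fetch("MyBot", "https://www.yourstore.com/products"))  # allowed
print(rp.can_fetch("Nutch", "https://www.yourstore.com/products"))  # blocked: Nutch is disallowed everywhere
```

In a real check you would point `RobotFileParser.set_url()` at your live www.yourstore.com/robots.txt and call `read()` instead of parsing an inline string.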
If you want to update or edit these rules, you can add a custom robots.txt file. To do so, follow the steps below:
Once added, the default robots.txt URL (www.yourstore.com/robots.txt) will serve the content of your custom robots.txt file.