After implementing the “noindex” tag, you’ll need to verify that none of your internal search URLs are still indexed. Perform a search for “site:example.com inurl:catalogsearch”. If you see the URLs in the index, we recommend waiting until Google removes most of them. If you don’t see the URLs in the index, you might consider blocking them in your robots.txt file.
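For reference, here is what the noindex tag itself looks like. This is a minimal sketch, assuming it is placed in the <head> of your internal search result template:

```html
<!-- In the <head> of internal search result pages (e.g., /catalogsearch/result/) -->
<!-- "noindex" keeps the page out of Google's index; "follow" still allows -->
<!-- crawlers to follow the links on the page -->
<meta name="robots" content="noindex, follow" />
```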
Robots.txt
In Magento, you can also configure a robots.txt file. You can use a robots.txt file to limit the number of pages on your Magento site that Google is eligible to crawl. This is especially important to configure if your site uses faceted navigation that allows users to filter by a variety of attributes.
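To see why this matters, consider how quickly faceted navigation multiplies crawlable URLs. The category and parameter names below are hypothetical; your store’s will differ:

```
/shoes                                 (canonical category page)
/shoes?color=red                       (one facet applied)
/shoes?color=red&size=10               (two facets)
/shoes?color=red&size=10&sort=price    (facets plus a sort option)
```

Each combination is a distinct URL to Google, so even a modest set of filters can generate thousands of near-duplicate pages that waste crawl budget.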
Fortunately, Magento does allow you to control your site’s robots.txt. To do this, you can follow these steps:
In the Admin sidebar, navigate to Content > Design > Configuration
Expand the “Search Engine Robots” section
Add your robots.txt directives in the “Edit custom instruction of robots.txt File” field
How you adjust your robots.txt will depend on your specific store; unfortunately, there is no one-size-fits-all configuration. The main goal is to block crawling of low-value pages you don’t want indexed while allowing crawling of your high-priority pages.
Here are some general things you might consider blocking in your robots.txt (an example set of directives follows this list):
Low-value pages created by faceted navigation and sorting options
Your site’s internal search pages
Login pages
Shopping cart pages
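As a starting point, here is a sketch of what those directives might look like for a store using Magento’s default URL structure. The filter and sort parameters are hypothetical examples; audit your own store’s URLs before blocking anything:

```
User-agent: *
# Internal site search (Magento's default search path)
Disallow: /catalogsearch/
# Customer login and account pages
Disallow: /customer/account/
# Shopping cart and checkout
Disallow: /checkout/
# Hypothetical faceted-navigation and sorting parameters --
# replace these with the ones your store actually uses
Disallow: /*?color=
Disallow: /*?price=
Disallow: /*?product_list_order=
```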
Sitemap.xml
The sitemap.xml file ensures that Google has a way to discover all of your website’s critical URLs. No matter how your site is structured, the sitemap gives Google a direct path to the pages that matter most.
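For reference, a minimal sitemap.xml following the sitemaps.org protocol looks like the sketch below. The URLs are placeholders, and in practice Magento can generate this file for you:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per important page; these URLs are placeholders -->
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/shoes</loc>
  </url>
</urlset>
```

You can also help crawlers find it by adding a Sitemap: https://www.example.com/sitemap.xml line to your robots.txt file.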