How do you effectively de-index pages on a Magento site?
-
We have thousands of Missing Description issues, but most of them are on account/login pages,
i.e. /customer/account/ etc.
We tried to de-index them through the Configuration using the instructions here - https://docs.magento.com/user-guide/marketing/search-engine-robots.html
But they're still appearing as issues in the Site Crawl.
Even without the site crawl issue, we don't really want these to appear in the SERPs.
Does anybody know how to properly de-index these login pages in Magento?
Thank you!
-
To effectively de-index pages on a Magento site, you can use the following approaches:
Robots meta tag: add a meta tag to the <head> of the pages you want kept out of the index to instruct search engines not to index them.
Robots.txt file: use the robots.txt file to disallow search engine crawlers from accessing certain pages on your site.
Noindex directive: within the HTML code of specific pages, the "noindex" directive prevents search engines from indexing them.
Canonical URL tag: specify the preferred version of a page so that duplicate or near-duplicate URLs are less likely to be indexed (a minimal example is sketched below).
It's important to note that de-indexing pages should be done carefully, as it can impact your site's visibility in search engine results. If you have specific pages or sections in mind that you'd like to de-index, please let me know so I can provide more detailed guidance.
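For reference, here is a minimal sketch of the canonical tag mentioned above, placed in the page's <head>; the URL is purely a hypothetical example:
<!-- hypothetical URL: point this at the preferred version of the page -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
Keep in mind that canonical is treated as a hint rather than a directive, so for pages like the login screen a noindex meta tag (covered in the other answers) is the more direct option.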
-
@LASClients Hey LASClients,
I feel your pain with those pesky login pages showing up in the Site Crawl. Have you considered using the Disallow directive in the robots.txt file to prevent search engines from crawling these pages? It's a quick fix, but as always, test it out in a staging environment first. Cheers!
Best,
-
Certainly! To effectively de-index login pages in Magento and address the Missing Description issues, follow these steps:
Robots Meta Tag:
Open the respective login page templates, such as /customer/account/, in your Magento admin.
Add the following meta tag to the <head> section of the HTML:
<meta name="robots" content="noindex, nofollow">
This tag instructs search engines not to index the page and not to follow any links on it.
Robots.txt File: Edit your robots.txt file in the root of your Magento installation.
Add the following lines to disallow crawling of login pages:
User-agent: *
Disallow: /customer/account/
Replace /customer/account/ with the relevant path for your login pages.
XML Sitemap: If you have an XML sitemap, ensure that the login pages are excluded from it.
Open your XML sitemap file and remove or comment out the entries related to login pages.
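For example, an entry like the one below would need to come out; this is a hypothetical URL with placeholder values, just following the standard sitemap protocol:
<url>
  <loc>https://www.example.com/customer/account/login/</loc>
  <priority>0.5</priority>
</url>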
Submit Updated Sitemap to Search Engines: After making these changes, resubmit your updated XML sitemap to search engines via Google Search Console or Bing Webmaster Tools.
Clear Cache and Reindex: Clear your Magento cache and reindex the website to ensure that the changes take effect.
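Assuming you have command-line access to the server, the standard Magento 2 commands for this step (run from the Magento root) are:
php bin/magento cache:flush
php bin/magento indexer:reindex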
By implementing these steps, you should effectively de-index the login pages in Magento. Keep in mind that changes may take some time to reflect in search engine results. If you encounter any challenges or need further assistance, consider consulting Magento support or your web development team.
-
@LASClients Create a file app/design/frontend/[Vendor]/[theme]/Magento_Customer/layout/customer_account_login.xml with the following content:
<?xml version="1.0"?> <page xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="urn:magento:framework:View/Layout/etc/page_configuration.xsd"> <head> <meta name="robots" content="noindex,nofollow" /> </head> </page>
Clear cache
php bin/magento cache:flush
And it should be fine.
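To confirm the tag is actually being output, one quick check (assuming shell access; swap in your own store's URL) is to fetch the rendered login page and look for the robots meta tag:
curl -s https://www.example.com/customer/account/login/ | grep -i 'name="robots"'
If the tag shows up in the output, the layout update is working and crawlers will see the noindex directive on their next visit.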
-
@LASClients you could try adding the meta tag below to the pages you want to noindex. Apparently this only works on the latest release of Magento.
<meta name="robots" content="NOINDEX,NOFOLLOW"/>
-
Related Questions
-
301 Multiple Sites to Main Site
Over the past couple of years I had 3 sites that sold basically the same products and content. I later realized this had no value to my customers or Google, so I 301 redirected Site 2 and Site 3 to my main site (Site 1). Of course this pushed a lot of PageRank over to Site 1 and the site has been ranking great. About a week ago I moved my main site to a new eCommerce platform, which required me to 301 redirect all the URLs to the new platform URLs, which I did for all the main site links (Site 1). During this time I decided it was probably better off if I DID NOT 301 redirect all the links from the other 2 sites as well. I just didn't see the need, as I figured Google realized at this point those sites were gone, and I started fearing Google would get me for PageRank manipulation for 301 redirecting 2 whole sites to my main site. Now I am getting over 1,000 404 crawl errors in GWT as Google can no longer find the URLs for Site 2 and Site 3. Plus my rankings have dropped substantially over the past week, part of which I know is from switching platforms. Question: did I make a mistake not 301 redirecting the URLs from the old sites (Site 2 and Site 3) to my new ecommerce URLs at Site 1?
Technical SEO | | SLINC0 -
How to create a site map for a large site (ecommerce type) that has 1,000s if not 100,000s of pages.
I know this is kind of a newbie question but I am having an amazing amount of trouble creating a sitemap for our site Bestride.com. We just did a complete redesign (look and feel, functionality, the works) and now I am trying to create a site map. Most of the generators I have used "break" after reaching some number of pages. I am at a loss as to how to create the sitemap. Any help would be greatly appreciated! Thanks
Technical SEO | | BestRide0 -
Should We Index These Category Pages?
Currently we have marked category pages like http://www.yournextshoes.com/celebrities/kim-kardashian/ as follow/noindex as they essentially do not include any original content. On the other hand, for someone searching for Kim Kardashian shoes, it's a highly relevant page as we provide links to all the Kim Kardashian shoe sightings that we have covered. Should we index the category pages or leave them unindexed?
Technical SEO | | Jantaro0 -
Site Map Problems or Are They?
According to Webmaster Tools my sitemap contains URLs which are blocked by robots.txt. Our sitemap is generically generated and encompasses all web pages, including ones I have excluded using the robots.txt file. As far as I am aware this has never been an issue until recently. Is this hurting my rankings, and how do I fix it? Secondly, Webmaster Tools says there are over 5,000 errors/warnings on my sitemap, but the sitemap only has 1,400 or so pages submitted. How do I see what is going on?
Technical SEO | | Professor0 -
Traffic has dropped from my site.
Hello, I never had amazing traffic, but during the last week my site seems to have almost dropped off search engines. Nothing drastic has changed during this time that I can see would have caused this. The site is http://www.comparebestodds.com Does anyone have any ideas that can help? Thanks
Technical SEO | | jwdesign0 -
Optimize flash site
Hello, How can we optimize a site like this - http://www.ziba.com.au/ ? The whole site is in Flash. What are the alternatives?
Technical SEO | | seoug_20050 -
What is the most effective way of indexing a localised website?
Hi all, I have a website, www.acrylicimage.com, which provides products in three different currencies: $, £ and Euro. Currently a user can click on a flag to indicate which region they are in, or if the user has not manually selected, the website looks at the user's Locale setting and sets the region for them. The website also has a very simple content management system which provides ever so slightly different content depending on which region the user is in. The difference in content might literally be a few words per page, like contact details, measurements i.e. imperial to metric. I don't believe that GoogleBot, or any other bot for that matter, sets a Locale, and therefore it will only ever be indexing the content on our default region - the UK. So, my question really is if I need to be able to index different versions of content on the same page, is the best route to provide alternate URLs, i.e.:
/en/about-us
/us/about-us
/eu/about-us
The only potential downside I see to this is there are currently a couple of pages that do have exactly the same content regardless of whether you have selected the UK or USA regions - could this be considered content duplication? Thanks for your help. Al
Technical SEO | | dotcentric0 -
Google.ca is showing our US site instead of our Canada Site
When our Canadian users search on google.ca for our brand (e.g. Travelocity, Travelocity hotels, etc.), the first few results are from our US site (travelocity.com) rather than our Canadian site (travelocity.ca). In Google Webmaster Tools, we've adjusted the geotargeting settings to focus on the appropriate locale, but the wrong country TLD is still coming up at the top via google.ca. What's the best way to ensure our Canadian site comes up instead of the US site on google.ca? Thanks, Tory Smith
Travelocity
Technical SEO | | travelocitysearch0