Tens of duplicate homepages indexed and blocked later: How to remove from Google cache?
-
Hi community,
Due to a WordPress plugin issue, many copies of our homepage were indexed in Google under strange URLs. We blocked them later, but they still appear in the SERPs. I wonder whether they are causing our website trouble, especially since they are exact duplicates of our homepage. How do we remove these pages from the Google cache, and is that the right approach?
Thanks
-
Hi Nigel,
Thanks for the suggestion. I'm going to use the "Remove URLs" tool in GSC. The URLs were created by a bug in the Yoast SEO plugin - very unfortunate, and we are paying for a mistake that wasn't ours.
Does removing them from the SERPs also remove them from Google's index? Or will Google still consider them and just stop showing them? My concern is: we have blocked them, but will they hurt our ranking efforts as long as they remain in the results and in the cache?
Thanks
-
Thanks!
I agree - I have just done a similar clean-up by:
1. Not letting them be created
2. Redirecting all previous versions

One site I just worked on had 8 versions of the home page! lol
http
https
/index.php
/index.php/

A mess!
We stopped them all being created and 301'd all versions just in case they were indexed anywhere or linked externally.
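For reference, consolidating those versions in Apache might look something like this - a minimal .htaccess sketch assuming mod_rewrite is available and that `https://example.com/` (non-www) is the preferred version; the host is a placeholder, so adjust scheme and domain to your own site:

```apache
# Minimal sketch for .htaccess (Apache + mod_rewrite).
# example.com is a placeholder -- use your own preferred host.
RewriteEngine On

# Force https and strip www with a single 301
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]

# Collapse /index.php and /index.php/ onto the root
RewriteRule ^index\.php/?$ https://example.com/ [R=301,L]
```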
Cheers
-
It is certainly true that, just as in many other fields (medicine, for one), in SEO prevention is better than cleanup. If your website doesn't take its medicine, you get problems like this one.
I think your advice here was really good.
-
Good solid advice
They can be created in any number of ways, but it's normally simple enough to specify the preferred URL on the server and then redirect any variations in .htaccess, such as those with www (if the non-www version is preferred), those with a trailing slash at the end, etc.
A self-referencing canonical on all pages will sort out any other duplicates.
As for getting rid of them - the Search Console way is the quickest. If the pages don't exist after that, they won't be reindexed unless they are linked from somewhere else. In that case, they will 301 via .htaccess, so it shouldn't be a problem.
If you 410 them, you will lose any benefit from links pointing to those pages, and it's a bad experience for visitors. Always 301, never 410, if it is a version of the home page.
410s are fine for old pages you never want to see in the index again, but not for a home page version.
Regards
Nigel
-
It's likely that you don't have access to edit the code behind these weird plugin URLs. As such, normal techniques like a meta noindex tag in the HTML may not be viable.
You could use the HTTP header (server-level stuff) to help you out. I'd advise adding two strong directives to the afflicted URLs through the HTTP header so that Google gets the message:
-
Use the X-Robots-Tag deployment of the noindex directive on the affected URLs, at the HTTP header (not the HTML) level. The linked page describes the normal HTML implementation, but also the X-Robots-Tag implementation, which is the one you need (scroll down a bit)
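If the site runs on Apache, one way to attach that header is with mod_headers - a hedged sketch assuming the junk URLs share an identifiable query pattern (the `?p=` pattern here is purely hypothetical; match whatever your plugin actually generated):

```apache
# Apache 2.4+ sketch (mod_headers). The ?p= query pattern is
# hypothetical -- substitute the pattern your plugin URLs share.
<If "%{QUERY_STRING} =~ /^p=/">
    Header set X-Robots-Tag "noindex"
</If>
```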
-
Serve status code 410 (gone) on the affected URLs
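On Apache, mod_rewrite can serve the 410 for a matching path - again a sketch with a placeholder pattern rather than your exact URLs:

```apache
# Return 410 Gone for a hypothetical /junk-pattern/ path -- replace
# with the path pattern your plugin URLs actually follow.
RewriteEngine On
RewriteRule ^junk-pattern/ - [G,L]
```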
That should prompt Google to de-index those pages. Once they are de-indexed, you can use robots.txt to block Google from crawling those URLs in the future (which will stop the problem from happening again!)
It's important to de-index the URLs before you do any robots.txt blocking. If Google can't crawl the affected URLs, it can't see the info (in the HTTP header) telling it to de-index those pages
Once Google is blocked from both indexing and crawling these pages, it should stop caching them too
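Before shipping the robots.txt rule, you can sanity-check it with Python's standard-library parser. A small sketch - the `?p=` pattern and `example.com` are placeholders for whatever your plugin URLs actually look like:

```python
from urllib.robotparser import RobotFileParser

# Feed the proposed robots.txt rules straight into the parser
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /?p=",  # hypothetical junk-URL pattern -- adjust to yours
])

# The real home page should stay crawlable; the junk variants should not
print(rp.can_fetch("*", "https://example.com/"))
print(rp.can_fetch("*", "https://example.com/?p=123"))
```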
Hope that helps
-
+1 for "Make sure that they are not created in the first place" haha
-
Hi again vtmoz!
1. Make sure that they are not created in the first place
2. Make sure that they are not in the sitemap
3. Go to Search Console and remove any you do not want - it will say temporary removal, but they will not come back if they are not in the structure or the sitemap.

More: https://support.google.com/webmasters/answer/1663419?hl=en
Note: Always self-canonicalize the home page to stop versions with UTM codes (created by Facebook, Twitter, etc.) appearing in SERPs
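That self-referencing canonical is one line in the page's head section (with `example.com` as a placeholder for your own home page URL):

```html
<!-- Parameterized versions (?utm_source=... etc.) then resolve to this URL -->
<link rel="canonical" href="https://example.com/" />
```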
Regards
Nigel