Should We Remove Content Through Google Webmaster Tools?
-
We recently collapsed an existing site in order to relaunch it as a much smaller, much higher quality site. In doing so, we're facing some indexation issues: a large number of our old URLs (301'd where appropriate) still show up in a site:domain search.
Some relevant notes:
- We transitioned the site from Sitecore to WordPress to allow for greater flexibility
- The WordPress CMS went live on 11/22 (same legacy content, but in the new CMS)
- The new content (and all required 301s) went live on 12/2
- The site's total number of URLs is currently 173 (confirmed by Screaming Frog)
- As of posting this question, a site:domain search shows 6,110 results
While it's a very large manual effort, is there any reason to believe that submitting removal requests through Google Webmaster Tools would be helpful?
We simply want all indexation of old pages and content to disappear - and for Google to treat the site as a new site on the same old domain.
-
As Donna pointed out, the delay between the timeline you expect and what Google can actually 'do' is often longer than anyone would wish.
-
I agree with Ray-pp. It can take some time - weeks to months - for Google to catch up with the changes made to the site. It sounds like something else might be going on that's causing you to have so many extra pages indexed. Can you explain the cause of having ~5,000 extra pages indexed? When did they first start to appear? Are you sure you've configured your WordPress implementation to minimize unnecessary duplicates?
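If you can export both lists, a quick set difference will show exactly which URLs make up that gap. Below is a minimal Python sketch, assuming the Screaming Frog crawl and the indexed URLs (e.g., from a site: sample or an Index Status export) are saved as plain-text files, one URL per line; the filenames are placeholders:

```python
# Compare the live crawl against what Google reports as indexed.
# crawl_urls.txt and indexed_urls.txt are assumed exports, one URL per line.

def load_urls(path):
    """Read a file of URLs into a set, normalizing trailing slashes."""
    with open(path) as f:
        return {line.strip().rstrip("/") for line in f if line.strip()}

crawl = load_urls("crawl_urls.txt")      # the ~173 URLs Screaming Frog found
indexed = load_urls("indexed_urls.txt")  # what Google is showing

extras = indexed - crawl  # indexed, but not part of the live site
print(f"{len(extras)} indexed URLs are not in the current crawl:")
for url in sorted(extras):
    print(url)
```

Patterns in the output - querystring variants, tag/date archives, old Sitecore paths - usually point straight at the cause.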
-
If you have implemented 301 redirects properly, then the old URLs (the ones redirecting to the new site) will naturally drop from the search engines as Google deems appropriate. There are a number of factors that influence when a page gets deindexed, such as the crawl rate for a website and how many links it may have.
If you really want the pages removed, then, as you've suggested, you can request their removal through GWT. However, there is no harm in allowing them to stay indexed and waiting for Google to adjust appropriately.
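For what it's worth, before committing to that large manual effort, it's easy to spot-check that the old URLs really return a 301 (and not a 302, which doesn't signal a permanent move). A rough Python sketch - the URLs listed are placeholders:

```python
# Spot-check redirect status codes on a sample of old URLs.
import requests

old_urls = [
    "https://www.example.com/old-page-1",  # placeholder URLs
    "https://www.example.com/old-page-2",
]

for url in old_urls:
    # allow_redirects=False so we see the first response, not the final page;
    # some servers mishandle HEAD, so switch to requests.get if results look odd
    resp = requests.head(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "-")
    flag = "OK" if resp.status_code == 301 else "CHECK"
    print(f"{flag}  {resp.status_code}  {url} -> {target}")
```

Anything flagged CHECK (a 302, a 200, or an error) is worth fixing before deciding whether removal requests are necessary.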
Related Questions
-
Is my content being fully read by Google?
Hi mozzers, I wanted to ask you a quick question regarding Google's crawlability of webpages. We just launched a series of content pieces but I believe there's an issue. Based on what I am seeing when I inspect the URL, it looks like Google is only able to see a few titles and internal links. For instance, when I inspect one of the URLs in GSC, this is the screenshot I am seeing: [screenshot]. When I perform a "cache:" search I barely see any content [screenshot], versus one of our blog posts [screenshot]. Would you agree with me that there's a problem here? Is this related to the heavy use of JS? If so, why wasn't I able to detect this with any of the crawling tools? Thanks!
Intermediate & Advanced SEO | TyEl
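One crude way to test this without a crawling tool is to fetch the raw HTML - roughly what a non-rendering crawler sees, with no JavaScript executed - and measure how much visible text and how many links it contains. A minimal Python sketch, assuming requests and BeautifulSoup are installed; the URL is a placeholder:

```python
# Fetch the raw (unrendered) HTML and count visible words and links.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/content-piece"  # placeholder
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for tag in soup(["script", "style", "noscript"]):
    tag.decompose()  # drop elements that never render as visible text

text = " ".join(soup.get_text(" ").split())
links = len(soup.find_all("a", href=True))
print(f"{len(text.split())} visible words, {links} links in the raw HTML")
# A near-zero word count on a page that looks full in the browser suggests
# the content only exists after JavaScript runs.
```
-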
Best way to remove full demo (staging server) website from Google index
I've recently taken over an in-house role at a property auction company; they have a main site on the top-level domain (TLD) and 400+ agency subdomains! company.com agency1.company.com agency2.company.com... I recently found that the web development team have a demo domain per site, found on a subdomain of the original domain and mirroring the site. The problem is that they have all been found and indexed by Google: demo.company.com demo.agency1.company.com demo.agency2.company.com... Obviously this is a problem as it is duplicate content and so on, so my question is: what is the best way to remove the demo domains/subdomains from Google's index? We are taking action to add a noindex tag into the header (of all pages) on the individual domains, but this isn't going to get them removed any time soon! Or is it? I was also going to add a robots.txt file into the root of each domain, just as a precaution; within this file I had intended to disallow all. The final course of action (which I'm holding off on in the hope someone comes up with a better solution) is to add each demo domain/subdomain into Google Webmaster Tools and remove the URLs individually. Or would it be better to go down the canonical route?
Intermediate & Advanced SEO | iam-sold
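One caveat worth noting: a noindex tag only helps if crawlers can actually fetch the pages, so a blanket robots.txt disallow can stop Google from ever seeing it. Whichever signal is used, with 400+ subdomains it's worth auditing that it's really live everywhere. A rough Python sketch - the host list is illustrative and could be read from a file in practice:

```python
# Audit demo subdomains for a noindex signal (header or meta tag).
import requests

demo_hosts = [
    "demo.company.com",
    "demo.agency1.company.com",  # ...extend with the full list
]

for host in demo_hosts:
    resp = requests.get(f"https://{host}/", timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    # crude meta-tag check; assumes double-quoted attributes
    meta = 'name="robots"' in resp.text and "noindex" in resp.text.lower()
    status = "OK" if ("noindex" in header.lower() or meta) else "MISSING"
    print(f"{status:7s} {host}")
```
-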
Duplicate content for hotel websites - the usual nightmare? is there any solution other than producing unique content?
Hiya Mozzers, I often work for hotels. A common scenario is that the hotel/resort has worked with their Property Management System to distribute their booking availability around the web to third-party booking sites, and with the inventory go duplicate page descriptions sent to these "partner" websites. I was just checking duplication on a room description: 20 loads of duplicate descriptions for that page alone. There are 200 rooms, so I'm probably looking at 4,000 loads of duplicate content that need rewriting to prevent duplicate content penalties, which will cost a huge amount of money. Is there any other solution? Perhaps ask booking sites to block the relevant pages from search engines?
Intermediate & Advanced SEO | McTaggart
-
How does Google Keywords Tool compile search volume data from auto-suggest terms?
Hi everyone. This question has been nagging at my mind today ever since I had a colleague say "no one ever searches for the term 'presonus 16.4.2'." My argument is "Yes they do," based on the fact that when you type in 'presonus 16', Google's auto-suggest lists several options, of which 'presonus 16.4.2' is one. That being said, does Google's Keyword Tool base traffic estimates ONLY on keywords actually typed in by the user, in this case "presonus 16", or does it also compile data for searchers who opt for the suggested term "presonus 16.4.2"? To clarify, does anyone have any insight as to whether Google is compiling data strictly on the term typed in by a user, giving precedence to a term the user selected from auto-suggest, or counting them twice? Very curious to know everyone's take on this! Thanks!
Intermediate & Advanced SEO | danatanseo
-
We are ignored by Google - what should we do?
Hi, We believe that our website - https://en.greatfire.org - is being all but ignored by Google Search. The following two examples illustrate our case. 1. Searching for “China listening in on Skype - Microsoft assumes you approve”. This is the title of a blog post that we wrote which received some 50,000 visits. On Yahoo and Bing search, we rank first for this search. On Google, however, we rank 7th. Each of the six pages ranking higher than us are quoting and linking to our story. 2. Searching for “Online Censorship In China”. This is the title of our front page. Yahoo and Bing both rank us third for this search. On Google, however, we are not even among the first 300 results. Two of the pages among the first 10 results link to us. Our website has an average of around 1000 visits per day. We are quoted in and linked from virtually all Western mainstream media (see https://en.greatfire.org/press). Yet to this day we are receiving almost no traffic from Google Search. Our mission is to bring transparency to online censorship in China. If people could find us in Google, it would greatly help to spread awareness of the extent of Internet restrictions here. If you could indicate to us what the cause of our poor rankings could be, we would be very grateful. Thank you for your time and consideration.
Intermediate & Advanced SEO | GreatFire.org
-
Showing Duplicate Content in Webmaster Tools.
About 6 weeks ago we completely redid our entire site. The developer put in 302 redirects. We were showing thousands of duplicate meta descriptions and titles. I had the redirects changed to 301. For a few weeks the duplicates slowly went down and now they are right back to where they started. Isn't the point of 301 redirects to show Google that content has permanently been moved? Why is it not picking this up? I knew it would take some time but I am right where I started after a month.
Intermediate & Advanced SEO | EcommerceSite
-
Websites with same content
Hi, Both my .co.uk and .ie websites have the exact same content, which consists of hundreds of pages. Is this going to cause an issue? I have hreflang on both websites, plus Google Webmaster Tools is picking up that both websites are targeting different countries. Thanks
Intermediate & Advanced SEO | Paul78
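For reference, hreflang only resolves the duplication if the annotations are reciprocal: each page must reference both itself and its alternate, and the two must point at each other. A minimal Python sketch that prints the required link tags for paired pages - domains and paths are placeholders:

```python
# Print reciprocal hreflang annotations for .co.uk / .ie page pairs.
pairs = [
    ("https://www.example.co.uk/services/",  # placeholder pair
     "https://www.example.ie/services/"),
]

for uk_url, ie_url in pairs:
    # Both pages carry the same two tags (self-reference + alternate).
    for page in (uk_url, ie_url):
        print(f"<!-- in the <head> of {page} -->")
        print(f'<link rel="alternate" hreflang="en-GB" href="{uk_url}" />')
        print(f'<link rel="alternate" hreflang="en-IE" href="{ie_url}" />')
```
-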
Is 404'ing a page enough to remove it from Google's index?
We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404's). Is there anything else I need to do to remove these?
Intermediate & Advanced SEO | nicole.healthline