Should We Remove Content Through Google Webmaster Tools?
-
We recently collapsed an existing site in order to relaunch it as a much smaller, much higher quality site. In doing so, we're facing some indexation issues, in that a large number of our old URLs (301'd where appropriate) still show up in a site:domain search.
Some relevant notes:
- We transitioned the site from Sitecore to WordPress to allow for greater flexibility
- The WordPress CMS went live on 11/22 (same legacy content, but in the new CMS)
- The new content (and all required 301s) went live on 12/2
- The site's total number of URLs is currently 173 (confirmed by Screaming Frog)
- As of posting this question, a site:domain search shows 6,110 results
While it's a very large manual effort, is there any reason to believe that submitting removal requests through Google Webmaster Tools would be helpful?
We simply want all indexation of old pages and content to disappear - and for Google to treat the site as a new site on the same old domain.
-
As Donna pointed out, the delay between what you expect timeline-wise and what Google can actually do is often longer than anyone would wish.
-
I agree with Ray-pp. It can take some time (weeks to months) for Google to catch up with the changes made to the site. It sounds like something else might be going on that's causing you to have so many extra pages indexed. Can you explain the cause of having ~5,000 extra pages indexed? When did they first start to appear? Are you sure you've configured your WordPress implementation to minimize unnecessary duplicates?
-
If you have implemented 301 redirects properly, then the old URLs (the ones redirecting to the new site) will naturally drop from the search engines as Google deems appropriate. There are a number of factors that influence when a page gets deindexed, such as the crawl rate for a website and how many links it may have.
If you really want the pages removed, then, as you've suggested, you can request their removal through GWT. However, there is no harm in allowing them to stay indexed and waiting for Google to adjust appropriately.
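If you want to sanity-check the redirects while you wait, below is a minimal sketch (assuming Python with the requests library, and a hypothetical old_to_new.csv file that maps each legacy URL to its new target; both names are placeholders, not anything from your setup). It simply flags any old URL that doesn't answer with a single-hop 301 pointing at the expected page.

```python
import csv
import requests

# Hypothetical mapping file: one "old_url,new_url" pair per row
MAPPING_FILE = "old_to_new.csv"

def check_redirect(old_url, expected_url):
    """Fetch the old URL without following redirects and inspect the first hop."""
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    is_301 = resp.status_code == 301
    location = resp.headers.get("Location", "")
    points_to_target = location.rstrip("/") == expected_url.rstrip("/")
    return is_301, location, points_to_target

with open(MAPPING_FILE, newline="") as f:
    for old_url, new_url in csv.reader(f):
        is_301, location, matches = check_redirect(old_url, new_url)
        if not (is_301 and matches):
            # Anything listed here is worth fixing before expecting deindexing
            print(f"{old_url} -> {location or 'no redirect'} (301: {is_301})")
```

Old URLs that answer with a 200, a 302, or a redirect chain tend to hang around in the index longer, so anything the script prints is worth fixing before spending time on manual removal requests.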
Related Questions
-
URL Errors in webmaster tools to pages that don't exist?
Hello, for some time now we have had URLs showing up in Google Webmaster Tools flagged as 404 errors, but they don't exist on our website and never have. Here's an example: cosmetic-dentistry/28yearold-southport-dentist-wins-best-young-dentist-award/801530293 (the root being goo.gl/vi4N4F). Really confused about this. We have recently moved our website to WordPress. Thanks, Ade
Intermediate & Advanced SEO | popcreativeltd
-
Google Search Results...
I'm trying to download every Google search result for my company (site:company.com). The limit I can get is 100. I tried using SEOquake but I can only get to 100. The reason for this? I would like to see which pages are indexed. The www pages and subdomain pages should only make up 7,000, but the search results show 23,000. I would like to see what the others are in the 23,000. Any advice on how to go about this? I can individually check subdomains (site:www.company.com and site:static.company.com), but I don't know all the subdomains. Anyone cracked this? I tried using a scraper tool but it was only able to retrieve 200.
Intermediate & Advanced SEO | Bio-RadAbs
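Not a full answer, but one way to at least script the paging is sketched below, assuming Python with the requests library and Google's Custom Search JSON API (API_KEY and CX are placeholders you would create in the Google API console). Note the API has roughly the same ~100-result ceiling per query as the web interface, so the idea is to re-run the loop against narrower queries (per subdomain or per path) and stitch the lists together.

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: Google API key
CX = "YOUR_CSE_ID"        # placeholder: Custom Search Engine ID
ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def indexed_urls(query, max_results=100):
    """Page through Custom Search results ten at a time (the API maximum per request)."""
    urls = []
    for start in range(1, max_results + 1, 10):
        params = {"key": API_KEY, "cx": CX, "q": query, "start": start, "num": 10}
        data = requests.get(ENDPOINT, params=params, timeout=10).json()
        items = data.get("items", [])
        if not items:
            break
        urls.extend(item["link"] for item in items)
    return urls

# Example: segment the index by path instead of querying the whole domain at once
for url in indexed_urls("site:company.com inurl:blog"):
    print(url)
```
-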
Merge content pages together to get one deep, high-quality content page - good or not?
Hi, I manage the SEO of a brand poker website that provides ongoing, very good content around specific poker tournaments, but all this content is split into dozens of pages in different sections of the website (blog section, news section, tournament section, promotion section). It seems like today, having one deep piece of content on one page has a better chance of getting mentions / social signals / links, and therefore higher authority / rankings / traffic, than if this content were split into dozens of pages. But the poker website I work for, and also many other websites, naturally generate good content targeting long-tail keywords around a specific topic in different sections of the website on an ongoing basis. Do we need, once in a while, to merge those content pages into one page? If yes, what technical implementation would you advise? (Copy and readjust/restructure all the content into one page + 301 the old URLs into it.) Thanks Jeremy
Intermediate & Advanced SEO | Tit
-
How long for Google Webmaster tools to update/reflect link changes
Hi all, does anyone know, or have experience of, how long GWMT takes to update its data? We did some work on our link profile back in October/November but are still seeing old (removed) links showing in GWMT. Thanks in advance,
Intermediate & Advanced SEO | righty
-
Why do pages with a 404 error drop out of webmaster tools only to reappear again?
I have noticed that a lot of pages which had been 404'ing and had fallen out of the Webmaster Tools crawl error log are reappearing again. Any suggestions as to why this might be the case? How can I make sure they don't reappear again?
Intermediate & Advanced SEO | Towelsrus
-
Websites with same content
Hi, both my .co.uk and .ie websites have the exact same content, which consists of hundreds of pages. Is this going to cause an issue? I have hreflang on both websites, plus Google Webmaster Tools is picking up that both websites are targeting different countries. Thanks
Intermediate & Advanced SEO | Paul78
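As a quick check on the hreflang side, here is a small sketch (assuming Python with the requests library; the example.co.uk / example.ie URLs are placeholders standing in for one of your page pairs). It fetches the same page from both domains and lists the hreflang values each one declares in its HTML; it will not see hreflang declared via sitemaps or HTTP headers. Both versions should reference each other (e.g. en-GB and en-IE) for the annotations to do their job.

```python
import re
import requests

# Placeholder URLs standing in for the same page on each country site
PAGES = [
    "https://www.example.co.uk/some-page/",
    "https://www.example.ie/some-page/",
]

LINK_TAG_RE = re.compile(r"<link[^>]+>", re.IGNORECASE)
HREFLANG_RE = re.compile(r'hreflang=["\']([^"\']+)["\']', re.IGNORECASE)

for url in PAGES:
    html = requests.get(url, timeout=10).text
    hreflangs = [value
                 for tag in LINK_TAG_RE.findall(html)
                 for value in HREFLANG_RE.findall(tag)]
    print(url, "->", hreflangs or "no hreflang tags found")
```
-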
Google Webmasters not Accurate
I recently updated all the meta titles, descriptions, and keywords on my website because in the past most were duplicated and/or written in the incorrect language. According to Webmaster Tools they have indexed our site post-update, but we still have the same number of HTML issues. When I click to investigate the issues further, it is clear they are reflecting the old meta data, not the new data we just added. Should this fix itself the next time Google crawls my site, or is there something else I should be doing about the issue? Thanks!
Intermediate & Advanced SEO | theLotter
-
Google.ca vs Google.com Ranking
I have a site I would like to rank highly for particular keywords in Google.ca searches, and I don't particularly care about Google.com searches (it's a Canadian service). I have logged into Google Webmaster Tools and targeted Canada. Currently my site is ranking on the third page for my desired keywords on Google.com, but is on the 20th page for Google.ca. Previously this change happened quite quickly (within 4 weeks), but it doesn't seem to be taking here (12 weeks out and counting). My optimization seems to be fine, since I'm ranking well on Google.com; I'm not sure why it's not translating to Google.ca. Any help or thoughts would be appreciated.
Intermediate & Advanced SEO | seorm