Should We Remove Content Through Google Webmaster Tools?
-
We recently collapsed an existing site in order to relaunch it as a much smaller, much higher-quality site. In doing so, we're facing some indexation issues: a large number of our old URLs (301'd where appropriate) still show up in a site:domain search.
Some relevant notes:
- We transitioned the site from Sitecore to WordPress to allow for greater flexibility
- The WordPress CMS went live on 11/22 (same legacy content, but in the new CMS)
- The new content (and all required 301s) went live on 12/2
- The site's total number of URLs is currently 173 (confirmed by Screaming Frog)
- As of posting this question, a site:domain search shows 6,110 results
While it's a very large manual effort, is there any reason to believe that submitting removal requests through Google Webmaster Tools would be helpful?
We simply want all indexation of old pages and content to disappear - and for Google to treat the site as a new site on the same old domain.
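Before investing in manual removal requests, it's worth confirming that every old URL really does return a clean 301 to its new home, since a 302 or a 200 will keep old pages in the index far longer. Below is a minimal sketch of a redirect audit; the `classify_redirect` helper, the sample URLs, and the old-to-new mapping are all hypothetical, and in practice you'd fetch each old URL yourself (e.g. with `urllib.request`, redirects disabled) to obtain the status code and Location header:

```python
# Sketch of a 301-redirect audit for a relaunched site.
# Feed each old URL's observed status code and Location header
# into classify_redirect() along with the expected new URL.

def classify_redirect(status_code, location, expected_target):
    """Classify the crawl result for one old URL."""
    if status_code == 301:
        return "ok" if location == expected_target else "wrong target"
    if status_code in (302, 307):
        return "temporary redirect (should be 301)"
    if status_code == 200:
        return "not redirected at all"
    return "unexpected status %d" % status_code

# Hypothetical crawl results: (old URL, status code, Location header).
observed = [
    ("http://example.com/old-page", 301, "http://example.com/new-page"),
    ("http://example.com/old-news", 302, "http://example.com/news"),
    ("http://example.com/orphan", 200, None),
]
# Hypothetical old-to-new URL mapping from the relaunch plan.
expected = {
    "http://example.com/old-page": "http://example.com/new-page",
    "http://example.com/old-news": "http://example.com/news",
}

for url, status, location in observed:
    print(url, "->", classify_redirect(status, location, expected.get(url)))
```

Any URL flagged as "not redirected at all" or "temporary redirect" is a candidate for why old pages are lingering in the index.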
-
As Donna pointed out, the delay between the timeline you expect and what Google can actually do is often longer than anyone would wish.
-
I agree with Ray-pp. It can take some time - weeks to months - for Google to catch up with the changes made to the site. It sounds like something else is going on that's causing you to have so many extra pages indexed. Can you explain the cause of having ~5,000 extra pages indexed? When did they first start to appear? Are you sure you've configured your WordPress implementation to minimize unnecessary duplicates?
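One way to track down those extra indexed pages is to export the indexed URLs (e.g. from a site: search or the GWT index report) and bucket them by the WordPress URL patterns that most commonly generate duplicates - tag/category/author/date archives, feeds, and `?replytocom` comment links. A rough sketch, with an illustrative (not exhaustive) pattern list and made-up example URLs:

```python
import re
from collections import Counter

# Common WordPress sources of duplicate or thin indexed URLs.
# Extend this list to match your own permalink structure.
PATTERNS = [
    ("comment reply", r"\?replytocom="),
    ("feed", r"/feed/?$"),
    ("tag archive", r"/tag/"),
    ("category archive", r"/category/"),
    ("author archive", r"/author/"),
    ("date archive", r"/\d{4}/\d{2}/?$"),
]

def classify_url(url):
    """Return the first duplicate-generating pattern a URL matches."""
    for label, pattern in PATTERNS:
        if re.search(pattern, url):
            return label
    return "content"

# Hypothetical export of indexed URLs from a site: search.
indexed = [
    "http://example.com/tag/seo/",
    "http://example.com/2013/11/",
    "http://example.com/a-post/?replytocom=42",
    "http://example.com/about/",
]
print(Counter(classify_url(u) for u in indexed))
```

If most of the ~5,000 extras fall into one or two buckets, that points directly at an archive or comment setting to noindex or disable.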
-
If you have implemented 301 redirects properly, then the old URLs (the ones redirecting to the new site) will naturally drop from the search engines as Google deems appropriate. There are a number of factors that influence when a page gets deindexed, such as the crawl rate for a website and how many links it may have.
If you really want the pages removed, then as you've suggested you can request their removal through GWT. However, there is no harm in allowing them to stay indexed and waiting for Google to adjust appropriately.
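For reference, "implemented properly" here means a permanent (301) redirect per old URL, not a 302 or a meta refresh. A minimal Apache .htaccess sketch, with placeholder paths (adjust for your server and URL structure):

```apache
# Map each legacy URL to its new home with a permanent redirect.
# Paths below are placeholders, not the actual site's URLs.
RewriteEngine On
RewriteRule ^old-section/old-page$ /new-page [R=301,L]

# Or, without mod_rewrite:
Redirect 301 /old-section/old-page http://example.com/new-page
```

Once Google recrawls a URL and sees the 301, it will consolidate signals to the target and drop the old URL from the index on its own schedule.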