Panda Recovery - What is the best way to shrink your index and make Google aware?
-
We have been hit significantly by Panda, and we assume the reason is our large index, with some pages holding thin/duplicate content.
We have reduced our index size by 95% and have done significant content development on the remaining 5% pages.
For the old, removed pages, we have installed 410 responses (the page no longer exists) and made sure they are removed from the sitemap submitted to Google. However, after over a month we still see Google's spider returning to the same pages, and Webmaster Tools shows no indication that Google is shrinking our index size.
Are there more effective and automated ways to make Google aware of a smaller index size, in the hope of a Panda recovery? Potentially using the robots.txt file, the GWT URL removal tool, etc.?
Thanks /sp80
-
Hi. I would be curious to know if anyone else has experienced something similar and recovered from Panda. How long did it take you? Did you manually remove the pages, set up 410s or 404s, or create 301s?
I've been working on a site for some time now which has lost a great deal of traffic since July 2013. Over the past 2 months, we have been manually removing URLs from the index. The index has been cut in half, but it's still not at what it was pre-penalty. There are about 20,000 more pages to evaluate for removal before it reaches the level it was at before the massive traffic drop.
Any recovery or insight would be helpful.
-
Hi Sp80 (and group),
It's been about six months since you posted your Panda recovery question. I'm curious if you implemented Kerry22's suggestions, and what results you've seen. I hope it's worked out for you.
We're also dealing with removing thousands of pages of thin content (through 410s, keeping links up and sitemaps, as per Kerry's suggestion). This was a very helpful discussion to read.
Thanks,
Tom
-
Hi kerry,
Your post gives me some hope. I was hit by Panda in Feb. 2011 and lost 85% of my Google traffic. Made many changes to my site -- page deletions, redirects, added content, etc. Got a bump of 25% in September 2011 but lost that and more afterward.
We have an e-commerce gift site with 6000 pages. Is your site an e-commerce site?
I have not found a recovery story from any site like mine that was hit with that large a drop.
I hope your recovery approach will apply to my situation.
-
Did Google process the 301s? In other words, are the old pages still in the index or not? If they processed the 301s eventually, you generally should be ok. If the old URLs seem stranded, then you might be best setting up the XML sitemap with those old URLs to just kick Google a little. I don't think I'd switch signals and move from a 301 to 404, unless the old pages are low quality, had bad links, etc.
Unfortunately, these things are very situational, so it can be hard to speak in generalities.
-
Hi Dr. Pete,
I know this is a late entry into this thread, but what if we did all our content cutting in the wrong ways over the past year - is there something we could/should do now to correct for this? Our site was hit by Panda back in March 2012, and since then we've cut content several times. But we didn't use the good process you advocate - here's what we did when we cut pages:
1. We set up permanent 301 redirects for all of them immediately
2. Simultaneously, we always removed all links pointing to cut pages (we wanted to make sure users didn't get redirected all the time).
This is a far cry from what you recommend and what Kerry22 did to recover successfully. If you have some advice on the following questions, I'd definitely appreciate it:
- Is it possible Google still thinks we have this content on our site or intend to bring it back, and as a result we continue to suffer?
- If that is a possibility, then what can we do now (if anything) to correct the damage we did?
We're thinking about removing all of those 301s now, letting all cut content return 404s and making a separate sitemap of cut content to submit it to Google. Do you think it's too late or otherwise inadvisable for us to do this kind of thing?
Thanks in advance,
Eric
-
It might be worth exploring NOINDEX'ing the useful pages and 410'ing the non-useful ones, if only because sometimes a mix of signals is more palatable to Google. Any time you remove a swath of content with one method, it can trigger alarm bells. I'll be honest, though - these situations are almost always tricky, and you almost always have to measure and adjust. I've never found a method that's right for all situations.
-
Thanks Pete,
I appreciate your input. In addition to the separate sitemap with the known Google-indexed URLs we want deindexed, we have also reopened some crawl paths to these pages to see if that speeds things up.
This is an undertaking carried out across 30 international properties so we will be able to experiment with measures for certain domains and see how it affects de-indexing speed as we are tracking the numbers reported by Google daily.
I agree about the bad user experience of 410s as a dead end. We are mostly deindexing as a means of recovery from Panda, but the pages we are trying to deindex are actually still useful to users - just thin and partially duplicative in content. We have decided to still display the content when such a page is reached, but return a status code of 410. Alternatively, it seems we could just set the robots tag to noindex, but my feeling is the 410 approach will lead to faster deindexing - would you agree?
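The 410-with-content idea can be sketched as a tiny WSGI app - this is only an illustration of the approach, and the URL set and markup here are hypothetical placeholders:

```python
# Pages we still want to display to users but want out of Google's index.
# This set and the page body are hypothetical placeholders.
DEINDEXED_PATHS = {"/old-guide-1", "/old-guide-2"}

PAGE_BODY = b"<html><body>Still-useful but thin content</body></html>"

def app(environ, start_response):
    """Plain WSGI app: serve the normal page body, but send 410 Gone
    for paths being deindexed and 200 OK for everything else."""
    path = environ.get("PATH_INFO", "/")
    status = "410 Gone" if path in DEINDEXED_PATHS else "200 OK"
    start_response(status, [("Content-Type", "text/html"),
                            ("Content-Length", str(len(PAGE_BODY)))])
    return [PAGE_BODY]
```

The point is that the status code and the response body are independent: users still see the content, while crawlers get the "Gone" signal.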
Also, if you have any expertise to share on how to compile a more comprehensive list of the URLs Google has indexed for a particular domain - other than scraping the web interface with the site:domain.com query approach, which only returns a small subset of the stated total number of indexed pages - please let me know.
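Since we have explicit logging configured, one rough way to approximate that list from our side is to tally which URLs Googlebot actually requests. A sketch, assuming Apache/nginx combined log format (the sample data and regex details would need adjusting to the real logs):

```python
import re
from collections import Counter

# Assumes combined log format; adjust the regex if your logging differs.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_crawled(log_lines):
    """Tally (path, status) pairs requested by Googlebot - a cheap way to
    enumerate URLs Google still knows about, beyond site: queries."""
    tally = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and "Googlebot" in m.group("agent"):
            tally[(m.group("path"), m.group("status"))] += 1
    return tally
```

Any URL Googlebot keeps fetching is almost certainly still in (or associated with) the index, so this complements the scraped site: results.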
Thanks again /Thomas
-
If you want to completely remove these pages, I think Kerry22 is spot on. A 410 is about the fastest method we know of, and her points about leaving the crawl paths open are very important. I completely agree with leaving them in a stand-alone sitemap - that's good advice.
Saw your other answer, so I assume you don't want to 301 or canonical these pages. The only caveat I'd add is user value. Even if the pages have no links, make sure people aren't trying to visit them.
This can take time, especially at large scale, and a massive removal can look odd to Google. This doesn't generally result in a penalty or major problems, but it can cause short-term issues as Google re-evaluates the site.
One option to speed it up: if the pages share a consistent URL parameter or folder structure, you may be able to do a mass removal in Google Webmaster Tools. This can be faster, but it's constrained to similar-looking URLs - in other words, there has to be a pattern. The benefit is that you can make the GWT request on top of the 410s, so that can sometimes help. Any massive change takes time, though, and often requires some course correction, I find.
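To spot such folder patterns in a large list of removed URLs, you could simply count URLs per leading path segment - a rough sketch (the depth parameter is a hypothetical knob):

```python
from collections import Counter
from urllib.parse import urlparse

def folder_counts(urls, depth=1):
    """Count URLs per leading path folder. A folder holding most of the
    removed URLs is a candidate for one directory-based removal request
    in Webmaster Tools instead of thousands of individual ones."""
    counts = Counter()
    for url in urls:
        segments = urlparse(url).path.lstrip("/").split("/")
        counts["/" + "/".join(segments[:depth]) + "/"] += 1
    return counts
```

If one folder dominates the counts, a single directory-level removal request covers all of it.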
-
Think second sitemap will be fine. Wouldn't add a page with just links as that is the type of page Panda doesn't like.
Regarding sets of pages - we started by going into the search results and found a lot of content that shouldn't have been indexed.
We then looked manually at the content on subsets of pages and found pages that were thin and very similar to others (at the product level) and either made them more unique or removed them. Tools like this also help identify similar pages across products/categories http://www.copyscape.com/compare.php
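Alongside a tool like Copyscape, a rough way to flag near-duplicate pages programmatically is a pairwise text comparison - a sketch using Python's difflib (the 0.85 threshold is an arbitrary placeholder; pairwise comparison is O(n^2), so run it on subsets, e.g. one product category at a time):

```python
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicate_pairs(pages, threshold=0.85):
    """Flag page pairs whose body text is highly similar.
    pages: {url: text}. Quadratic in the number of pages, so
    apply per category/subset rather than site-wide."""
    flagged = []
    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, text_a, text_b).ratio()
        if ratio >= threshold:
            flagged.append((url_a, url_b, round(ratio, 2)))
    return flagged
```

Pairs that score high are candidates to either make more unique or remove.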
It's only been 2 weeks, but it looks like we are pretty much 80% recovered and still improving - we're still watching the numbers, and over Christmas and New Year traffic is obviously quiet. I think 100% recovery depends on too many variables, like whether you continue link building while fixing the site, losing links by removing pages, adding more pages, competitors gaining authority/rankings, etc.
-
Hey Kerry,
Additional pages were added in April, which is also when our sites started seeing a decrease in rankings - so the timing adds up.
The drops starting in June have no clear root cause for us - we started our deindexation process at the beginning of December.
We are thinking of driving deindexation exclusively through a second Google sitemap, as anything else would require a very artificial landing page with a high number of links at this point. Would you be concerned about relying exclusively on a sitemap rather than keeping the unwanted pages linked from the site structure?
Further, I am interested in how you determined the set of pages you knew were in Google's index and needed to be delisted. It appears the best way is to scrape the Google search results returned for the domain and build up a list that way.
Did you recover completely to pre-Panda levels?
Best /Thomas
-
Hi
No problem, I am happy to help!
Yes, the graph declined sloooowly - but only once we started removing pages. This is half the problem: you have to wait for Google to find the changes. The waiting is frustrating, as you don't know if what you have done is right, but the stuff I listed will help speed it up. We literally had to wait until none of the pages could be found in the index.
I see a big increase in your indexation from April to May 2012. When did you get hit, and what happened over that month - did you add a lot of new pages/products? Are those drops in indexation from June to Dec 2012 from you removing pages, or did the drop just start to 'happen' before you got hit?
-
Kerry,
Thank you for your amazing response to my deindexing question. It was incredibly well written and very easy to follow. Very happy to hear you were able to recover.
You make a really good point about allowing Google to still be able to reach the pages. When we started reviewing our site structure, we also changed our linking structure - so while all the pages we no longer want in the index return a 410, they certainly aren't all discoverable. Our assumption was that Google would revisit them sooner or later, given that they are part of the index, but I can definitely imagine that things would be sped up by compiling a dedicated sitemap.
A big question I have for you: how did the Index Status graph in GWT adjust for you over time? We started our restructuring at the beginning of January and can't see a difference yet: http://imgur.com/eKBJ0
Did your graph decline step by step?
Thanks again
-
Hi
We just recovered from Panda - it took us 6 months. The best way to do this is to 410 or 404 your pages, but don't remove the links to them. If you remove the links to those pages, Google won't be able to find them and learn that you have removed them.
Here are the steps you need to follow to get the changes indexed:
1. Remove the pages but leave the links to them on your site (we left these discretely at the bottom of the pages they were on, so users wouldn't find them easily, but Google would). You will see Google slowly start to pick up the number of 404s/410s in Webmaster Tools - don't worry about so many 410s being picked up; it won't hurt you. Don't nofollow the links, remove the links, or block the pages with robots.txt. You want Google to find your changes.
2. Revise your sitemaps - take the 410 pages out of the original sitemap and add them to a new separate sitemap and submit this in Webmaster Tools. Then you can see the true indexation rates of your current pages (gives you a good idea of how many are indexed vs not and if you still have issues). You can then also track the deindexation of your 410s separately - see how fast they are being deindexed - be patient, it takes time. We only recovered once they were all deindexed.
3. Our decision to use sitemaps as well as internal links was due to the fact that some deep pages are only crawled periodically, and we wanted Google to find the changes quickly. This is useful: http://www.seomoz.org/blog/logic-meet-google-crawling-to-deindex
4. Then wait. If all your pages are removed and you are still affected by Panda, start looking for more duplicate content, and look with an objective eye at the pages that still exist. You may be surprised by what you find. The process took us 6 months because we had to wait for Google to pick up our changes, and then revise, tweak, look for more to do, etc.
I will write a case study soon, but in the meantime hope this helps you! I know how frustrating it is.
PS. If you are losing link value from the 410s: 410 first, recover from Panda, and then 301 the select pages that have links, to get the link juice back. It will be faster that way.
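The sitemap split in step 2 can be sketched in a few lines of Python - a rough illustration only, using the standard sitemap namespace; real sitemaps may carry extra fields (lastmod, priority) that would be carried along unchanged:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def split_sitemap(sitemap_xml, removed_urls):
    """Move removed (410'd) URLs out of the main sitemap into a separate
    one, so their deindexation can be tracked on its own in Webmaster
    Tools. Returns (live_xml, removed_xml) as strings."""
    root = ET.fromstring(sitemap_xml)
    live = ET.Element("urlset", {"xmlns": SITEMAP_NS})
    gone = ET.Element("urlset", {"xmlns": SITEMAP_NS})
    for url_el in root.findall("{%s}url" % SITEMAP_NS):
        loc = url_el.find("{%s}loc" % SITEMAP_NS).text.strip()
        (gone if loc in removed_urls else live).append(url_el)
    return (ET.tostring(live, encoding="unicode"),
            ET.tostring(gone, encoding="unicode"))
```

Submit both files in Webmaster Tools; the second one's indexed count should trend toward zero as the 410s are processed.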
-
Google has already been recrawling those pages for the last few months, and it keeps returning to the pages that return a 410. We have very explicit logging configured.
The Google URL removal tool is not an option due to the manual nature of the submission process.
-
I think you need to wait for Google to recrawl these pages... however, you can use the Google URL removal tool in Webmaster Tools.
-
Thanks,
To be clear - my question isn't looking for recovery proposals, but for implementation advice around shrinking the Google index size. We are talking about a scale of tens of thousands of pages. /Thomas
-
What about this approach? I am assuming that you know the exact date when the rankings fell.
You need to compare the Google traffic for each page. Find the pages that suffered the most. Either remove them [exactly what you are doing] or completely rewrite them, adding nice images, videos, etc. - in short, make them more engaging.
Now locate the pages that were not affected as much. Make slight changes to them; do not remove these pages.
Now locate the pages that were not affected at all. If those pages are content-heavy, produce more pages with similarly well-written content.
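That triage could be sketched roughly like this (the 0.5/0.1 drop thresholds and bucket names are arbitrary placeholders, not part of any official method):

```python
def triage_pages(before, after, heavy=0.5, light=0.1):
    """Bucket pages by relative Google-traffic loss around the hit date.
    before/after: {page: visits}. Thresholds are placeholders - tune them.
    'rework': remove or rewrite heavily; 'tweak': make slight changes;
    'keep': unaffected pages worth building more content around."""
    buckets = {"rework": [], "tweak": [], "keep": []}
    for page, visits_before in before.items():
        visits_after = after.get(page, 0)
        drop = ((visits_before - visits_after) / visits_before
                if visits_before else 0.0)
        if drop >= heavy:
            buckets["rework"].append(page)
        elif drop >= light:
            buckets["tweak"].append(page)
        else:
            buckets["keep"].append(page)
    return buckets
```

Feeding it per-page analytics exports for equal windows before and after the drop date gives a first-pass worklist.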
Hope that helps.
-
Correct, it is intentional. The removed links have no link juice. The hope, though, is that an explicit 410 is a clearer signal for Google to remove the pages from the index.
I have been reading warnings around implementing a significant volume of 301s as it could be considered unnatural.
-
Just curious - is there any reason you did a 410 instead of a 301? I think most webmasters would set up 301 redirects to the most relevant remaining page for each of the pages you removed. With a 410, you're effectively dropping any backlinks that might have existed to those pages.