De-indexing thin content & Panda--any advantage to immediate de-indexing?
-
We added the noindex, follow tag to several hundred URLs on our site about a week ago, and they are still in Google's index. I know de-indexing takes time, but I am wondering whether having those URLs in the index will continue to "pandalize" the site. Would it be better to use the URL removal request, or should we just wait for the noindex tags to remove the URLs from the index?
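For reference, this is the standard form of the tag in question, placed in the <head> of each thin page:

<!-- Drop the page from the index, but keep following its links. -->
<meta name="robots" content="noindex, follow">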
-
Whenever Matt Cutts discusses this subject in the Webmaster Tools videos and elsewhere, there is always a caveat along the lines of "while Google mostly takes notice of noindex and robots.txt, this may not always be acted upon". The primary reason given for this seems to be content that is indexed via a link from another site, or that exists in Google's cache; in these cases it seems logical that it may continue to appear in the index.
Your question reminded me of Dr. Pete's Catastrophic Canonicalization Experiment - it seems his method proved quite effective.
-
Hey
I don't think it would make a great deal of difference, as you are going to need to wait for a full crawl of your site anyhow before you see any benefits.
Out of interest, how are you identifying the low-quality pages? One way to have a go at this is to use your analytics to identify all pages with a 100% bounce rate and noindex all of them; see the sketch below. If there are lots (it sounds like there are), you can do them in chunks and see what happens.
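A minimal version of that filtering step in Python, assuming you have exported landing-page data to CSV; the filename and column names here are hypothetical, so adjust them to match your own analytics export:

import csv

# Hypothetical analytics export: one row per landing page, with a "Page"
# column and a "Bounce Rate" column formatted like "100.00%".
with open("landing_pages.csv", newline="") as f:
    candidates = [
        row["Page"]
        for row in csv.DictReader(f)
        if float(row["Bounce Rate"].rstrip("%")) >= 100
    ]

# Review the list by hand before adding noindex -- a 100% bounce rate
# alone doesn't prove a page is thin.
for url in sorted(candidates):
    print(url)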
Don't get rid of pages that are getting good search traffic or have a low bounce rate UNLESS you know they are really poor pages, as sooner or later they will be picked up.
Ultimately, it sounds like a big site, so you are going to have to be patient here and make incremental changes based on analytics and crawl data until you get the results you are looking for.
I have pulled a site back from the depths (a rather unfairly punished site, in my opinion, that just had its content copied by several other sites), but the same rules applied. We updated pages, moved blocks of template content onto their own pages, and just kept on watching, and like magic, it came back stronger than before a week or so after we made all the changes.
Hope this helps!
Marcus -
You want to be a bit more patient. Depending on how popular and deep these pages are within your site, I would expect it to take several weeks to see most of them disappear. There is a good chance that, if you check, you will find a percentage of those pages disappearing each day.
The URL removal tool is for removing content which you consider harmful to your business. Of course, any damage to your SEO rankings could be considered harmful, but that is clearly not what Google means. If you use the tool, they clearly explain it is for pages which "urgently" need to be removed due to legal reasons, copyright issues, etc.
Related Questions
-
Index bloating issue
Hello, In the last month I noticed a huge spike in the number of pages indexed on my site, which I think is impacting my SEO quality score. While I only have about 90 pages in my sitemap, the number of pages indexed jumped to 446, with about 536 pages being blocked by robots.txt. At first we thought this might be due to duplicate product pages showing up in different categories on my site, so we added a rule to our robots.txt file to keep those pages out of the index, but the number has not gone down. I've tried to consult with our hosting vendor, but no one seems to be concerned or to have any idea why there was such a big jump in the last month. Any insights or pointers would be greatly appreciated, so that I can fix/improve my SEO as quickly as possible! Thanks!
Technical SEO | Saison
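A note on this one: a robots.txt Disallow only stops future crawling; it does not remove URLs that are already in the index, and Google can no longer see a noindex tag on a page it is blocked from crawling. A sketch of the distinction, using a hypothetical path for the duplicate category pages:

# robots.txt -- blocks crawling only; URLs already indexed stay indexed.
# To de-index them, lift the block, let the pages serve noindex until
# they drop out, and only then reinstate the Disallow if still wanted.
# "/products/dupes/" is a hypothetical path.
User-agent: *
Disallow: /products/dupes/

-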
Why my website does not index?
I made some changes to my website, and after that I tried the Fetch as Google tool in Webmaster Tools, but this is the 2nd day and my new pages are still not indexed: www.astrologersktantrik.com
Technical SEO | ramansaab
-
Duplicate Content Issues
We have a "?src=" parameter in some URLs which is treated as duplicate content in the crawl diagnostics errors. For example, xyz.com?src=abc and xyz.com?src=def are considered to be duplicate content URLs. My objective is to make my campaign free of these crawl errors. First of all, I would like to know why these URLs are considered to have duplicate content, and what's the best solution to get rid of this?
Technical SEO | RodrigoVaca
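Why these get flagged: xyz.com?src=abc and xyz.com?src=def serve the same content at two different addresses, so crawlers treat them as duplicates. One common fix (a sketch, and not the only option) is a rel="canonical" tag so every parameterized variant points back at the clean URL:

<!-- Served identically at xyz.com, xyz.com?src=abc, and xyz.com?src=def;
     each variant declares the same canonical URL. -->
<link rel="canonical" href="http://xyz.com/">

-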
Website is not indexed in Google
Hi Guys, I have a problem with a website from a customer. His website is not indexed in Google (except for the homepage). I could not find anything that could possibly be the cause. I already checked the robots.txt, the sitemap, and the plugins on the website. In the HTML code I also couldn't find anything which makes indexing harder than usual. This is the website I am talking about: http://www.xxxx.nl/ (Dutch) The only thing I am guessing now is the Google sandbox, but even that is quite unlikely. I hope you guys discover something I could not find! Thanks in advance 🙂
Technical SEO | B.Great
-
Duplicate page content
Hello, The Pro dashboard crawler bot that you get here reports mydomain.com and mydomain.com/index.htm as duplicate pages. Is this a problem? If so, how do I fix it? Thanks, Ian
Technical SEO | jwdl
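It is worth fixing, since inbound links can split between the two addresses. A common remedy is a 301 redirect from /index.htm to the root; a hedged sketch, assuming an Apache server with mod_rewrite enabled:

# .htaccess sketch: 301-redirect explicit requests for /index.htm to /.
# Matching THE_REQUEST avoids a redirect loop when the server internally
# maps "/" to index.htm via DirectoryIndex.
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/index\.htm[\s?] [NC]
RewriteRule ^index\.htm$ / [R=301,L]

-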
Best way to handle indexed pages you don't want indexed
We've had a lot of pages indexed by Google which we didn't want indexed. They relate to an ajax category filter module that works fine for front-end customers, but under the bonnet Google has been following all of the links. I've put a rule in the robots.txt file to stop Google from crawling any dynamic pages (with a ?) and also any ajax pages, but the pages are still indexed on Google. At the moment there are over 5000 indexed pages which I don't want on there, and I'm worried this is causing issues with my rankings. Would a redirect rule work, or could someone offer any advice? https://www.google.co.uk/search?q=site:outdoormegastore.co.uk+inurl:default&num=100&hl=en&safe=off&prmd=imvnsl&filter=0&biw=1600&bih=809#hl=en&safe=off&sclient=psy-ab&q=site:outdoormegastore.co.uk+inurl%3Aajax&oq=site:outdoormegastore.co.uk+inurl%3Aajax&gs_l=serp.3...194108.194626.0.194891.4.4.0.0.0.0.100.305.3j1.4.0.les%3B..0.0...1c.1.SDhuslImrLY&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.&fp=ff301ef4d48490c5&biw=1920&bih=860
Technical SEO | gavinhoman
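Same pattern as the noindex discussion above: robots.txt only prevents future crawling, so URLs indexed before the rule was added tend to stay listed, and Googlebot can no longer see any noindex tag on a blocked page. One hedged alternative, assuming an Apache 2.4 server: lift the robots.txt block and serve a noindex header on the dynamic URLs until they drop out of the index.

# .htaccess sketch (assumes Apache 2.4 + mod_headers): send a noindex
# header on any URL that has a query string. robots.txt must NOT block
# these URLs while this runs, or Googlebot never sees the header.
<If "%{QUERY_STRING} != ''">
    Header set X-Robots-Tag "noindex, follow"
</If>

-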
Bing and Yahoo Indexing
I have a young site (6 months) that is almost completely indexed by Google, but Bing and Yahoo will only index a few pages. Does anyone have any tips for getting more pages indexed in Bing and Yahoo? The site is registered with Bing Webmaster Tools and has a valid XML sitemap.
Technical SEO | waynekolenchuk
-
Is this dangerous (a content question)
Hi, I am building a new shop with unique products, but I also want to offer tips and articles on the same topic as the products (fishing). I think if I was to add the articles and advice one piece at a time, it would look very empty and give little reason to come back very often. The plan, therefore, is to launch the site pulling articles from a number of article websites - with the sites' permission. Obviously this would be 100% duplicate content, but it would make the user experience much better and offer added value to my site, as people are likely to keep returning even when not in the mood to purchase anything; it also offers the potential for people to email links to friends, etc. Note: over time we will be adding more unique content and slowly turning off the pulled articles. Anyway, from an SEO point of view I know the duplicate content would harm the site, but if I was to tell Google not to index the directory and block it from even crawling the directory, would it still know there is duplicate content on the site and apply the penalty to the non-duplicate pages? I'm guessing no, but always worth a second opinion. Thanks Carl
Technical SEO | Grumpy_Carl