Removing duplicated content at large scale (80% of the website) using only NOINDEX
-
Hi everyone,
I am taking care of a large "news" website (500k pages) which took a massive hit from Panda because of duplicated content (70% of it was syndicated). I recommended that all syndicated content be removed and that the website focus on original, high-quality content.
However, this was only partially implemented. All syndicated content is now set to NOINDEX (they think it is good for users to see standard news alongside the original, high-quality content). Of course it didn't help at all; there has been no change after months. If I were Google, I would definitely penalize a website that has 80% of its content set to NOINDEX because it is duplicated. I would consider such a site "cheating" and not worthwhile for the user.
What do you think about this "theory"? What would you do?
Thank you for your help!
-
-
It has been almost a year now since the massive hit. After that, there were also some smaller hits.
-
We are putting effort into improvements. That is quite frustrating for me, because I believe our effort is being demolished by the old duplicated content (which makes up 80% of the website :-)).
Yeah, we will need to take care of the link mess...
Thank you!
-
Yeah, this strategy will definitely be part of the guidelines for the editors.
One last question: do you know of some good resources I can use as inspiration?
Thank you so much!
-
We deleted thousands of pages every few months.
Before deleting anything, we identified valuable pages that continued to receive traffic from other websites or from search. These were often updated and kept on the site. Everything else was 301 redirected to the "news homepage" of the site. This was not a news site; it was a very active news section on an industry portal site.
"You set 410 for those pages and removed all internal links to them, and Google was OK with that?"
Our goal was to avoid internal links to pages that were going to be deleted. Our internal "story recommendation" widgets would stop showing links to pages after a certain length of time. Our periodic purges were done after that length of time.
We never used hard-coded links in stories to pages that were subject to being abandoned. Instead, we simply linked to category pages where something relevant would always be found.
Develop a strategy for internal linking that will reduce site maintenance and focus all internal links to pages that are permanently maintained.
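To make that kind of periodic purge less painful, the redirect step can be generated rather than maintained by hand. Below is a minimal sketch, assuming a plain-text purge list and a simple path convention for choosing each page's category as the 301 target; the file names, URL structure, and fallback homepage are hypothetical, not something described above.

```python
"""Rough sketch: turn a purge list into an nginx redirect map.

Assumptions (hypothetical, adjust to your stack):
- purge_list.txt contains one absolute URL per line (pages to drop).
- URLs look like /news/<category>/<story>, so the category page
  /news/<category>/ is the redirect target; anything else falls
  back to the news homepage.
"""
from urllib.parse import urlparse

FALLBACK = "/news/"  # hypothetical news homepage

def redirect_target(url: str) -> str:
    """Pick the most relevant permanent page for a purged URL."""
    parts = [p for p in urlparse(url).path.split("/") if p]
    if len(parts) >= 2 and parts[0] == "news":
        return f"/news/{parts[1]}/"
    return FALLBACK

def build_nginx_map(purge_file: str, out_file: str) -> None:
    """Write an nginx map from old paths to their 301 targets."""
    with open(purge_file) as src, open(out_file, "w") as out:
        out.write("map $request_uri $purged_redirect {\n    default \"\";\n")
        for line in src:
            url = line.strip()
            if not url:
                continue
            out.write(f"    {urlparse(url).path} {redirect_target(url)};\n")
        out.write("}\n")

if __name__ == "__main__":
    build_nginx_map("purge_list.txt", "purged_redirects.map")
```

The generated map could then back a single 301 rule in the server config, so each periodic purge only swaps one file instead of touching dozens of redirect rules.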
-
Yikes! Will you guys still pay for it if it's removed? If so, then combining the comments below with my thoughts, I'd delete it, since it's old and no longer time-relevant.
-
Yeah, paying ... we actually pay for this content (earlier management decisions :-))
-
EGOL, your insights are very much appreciated :-)!
I agree with you. Makes total sense.
So you didn't experience any problems removing outdated content (or "content with no traffic value") from your website? You set 410 for those pages, removed all internal links to them, and Google was OK with that?
Redirecting useless content - you mean setting a 301 to the most relevant page that is bringing traffic?
Thank you, sir!
-
"But I still miss the point of paying for content that is not accessible from search engines."
- "Paying"?
"Is my understanding right that if I set a canonical for these duplicates, Google has no reason to show these pages in the SERPs?"
- Correct.
-
Hi Dimitrii,
Thank you very much for your opinion. The idea of canonical links is very interesting; we may try that in the "first" phase. But I still miss the point of paying for content that is not accessible from search engines.
Is my understanding right that if I set a canonical for these duplicates, Google has no reason to show these pages in the SERPs?
-
Just seeing the other responses now. I agree with what EGOL mentions. A content audit would be even better, to see whether there was any value at all on those pages (GA traffic, links, etc.). Odds are, though, that there was not any, and you already killed all of it with the noindex tag in place.
-
A couple of things here.
-
If a second Panda update has not occurred since the changes were made, then you may not yet get credit for the noindexed content. I don't think this is "cheating": with the noindex, you simply told Google to take 350K of the site's pages out of the index. Noindex is one of the best ways to get your content out of Google's index.
-
If you have not spent time improving the non-syndicated content, then you are missing the more important part, which is to improve the quality of the content that you have.
A side point to consider here is your crawl budget. I am assuming that the site still internally links to these 350K pages, so users and bots will keep visiting and processing them, which is mostly a waste of time. As all of these pages are out of Google's index thanks to the noindex tag, why not take out all internal links to those pages (i.e. from sitemaps, paginated index pages, menus, and internal content) so that users and Google can focus on the quality content that is left over? I would then also 404/410 all those low-quality pages, since they are now out of Google's index and not linked internally. Why maintain the content?
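If it helps, here is a rough sketch of how you could check which hub pages still link to the noindexed set before flipping those URLs to 410. The file name, starting URLs, and libraries (requests + BeautifulSoup) are my assumptions, not part of the advice above.

```python
"""Rough sketch: find internal links that still point at noindexed pages.

Assumptions (hypothetical): noindexed_urls.txt lists the URLs slated for
removal, and start_urls lists a few hub pages (home, category, paginated
index pages) to scan. Requires: pip install requests beautifulsoup4
"""
from urllib.parse import urljoin, urldefrag

import requests
from bs4 import BeautifulSoup

def load_noindexed(path: str) -> set[str]:
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def links_on_page(page_url: str) -> set[str]:
    """Collect absolute link targets on one page, without #fragments."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return {urldefrag(urljoin(page_url, a["href"]))[0]
            for a in soup.find_all("a", href=True)}

def report_offenders(start_urls: list[str], noindexed: set[str]) -> None:
    for page in start_urls:
        for target in sorted(links_on_page(page) & noindexed):
            print(f"{page} still links to {target}")

if __name__ == "__main__":
    noindexed = load_noindexed("noindexed_urls.txt")
    report_offenders(["https://example.com/",
                      "https://example.com/news/"], noindexed)
```

Once nothing internal points at them any more, switching those URLs to 404/410 becomes a much safer step.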
-
-
Good point! News gotta be new.
-
If there are 500,000 pages of "news" then a lot of that content is "history" instead of "news". Visitors are probably not consuming it. People are probably not searching for it. And actively visited pages on the site are probably not linking to it.
So, I would use analytics to determine whether these "history" pages are being viewed, are pulling in much traffic, or have many links, and I would delete and redirect them if they are no longer important to the site. This decision is best made at the page level.
For "unique content" pages that appear only on my site, I would assess them at regular intervals to determine which ones are pulling in traffic and which ones are not. Some sites place news in folders according to their publication dates and that facilitates inspecting old content for its continued value. These pages can then be abandoned and redirected once their content is stale and not being consumed. Again, this can best be done at the page level.
I used to manage a news section and every few months we would assess, delete and redirect, to keep the weight of the site as low as possible for maximum competitiveness.
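As a rough illustration of that page-level decision, here is a minimal sketch that reads a combined analytics/link export and buckets each URL; the CSV columns and thresholds are placeholders I made up, not EGOL's actual criteria.

```python
"""Rough sketch of a page-level content audit.

Assumes audit.csv with hypothetical columns:
url, pageviews_12m, search_entrances_12m, external_links
Thresholds are placeholders; tune them for your own site.
"""
import csv

MIN_PAGEVIEWS = 50
MIN_SEARCH_ENTRANCES = 10
MIN_EXTERNAL_LINKS = 1

def decide(row: dict) -> str:
    """Bucket one page: keep it, handle it carefully, or purge it."""
    views = int(row["pageviews_12m"])
    entrances = int(row["search_entrances_12m"])
    links = int(row["external_links"])
    if views >= MIN_PAGEVIEWS or entrances >= MIN_SEARCH_ENTRANCES:
        return "keep-and-update"             # still being consumed: refresh it
    if links >= MIN_EXTERNAL_LINKS:
        return "keep-or-redirect-carefully"  # external links worth preserving
    return "delete-and-301"                  # stale, unlinked, unread

if __name__ == "__main__":
    with open("audit.csv", newline="") as f:
        for row in csv.DictReader(f):
            print(decide(row), row["url"])
```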
-
Hi there.
NOINDEX !== no crawling, and it surely doesn't equal NOFOLLOW. What you probably should be looking at is canonical links.
My understanding is (and I can be completely wrong) that when you get hit by Panda for duplicate content and then try to recover, Google checks your website for the same duplicate content - it's still crawlable, all the links are still "followable", it's still scraped content, and you aren't telling crawlers that you took it from somewhere else (by canonicalizing); it's just not displayed in the SERPs. And yes, 80% of the content being noindexed probably doesn't help either.
So, I think what you need to do is either remove that duplicate content altogether, use canonical links pointing to the originals, or (a bad idea, but it would work) block all those URLs in robots.txt (at least that way the pages become uncrawlable altogether). All of these are still disreputable techniques though, kinda like polishing the dirt.
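To see where each syndicated page currently stands before choosing between those options, here is a rough sketch that reports whether a URL carries a meta robots noindex, a rel=canonical, or neither; the input file name is hypothetical and the libraries (requests + BeautifulSoup) are my assumption.

```python
"""Rough sketch: report noindex vs. canonical status for a list of URLs.

Assumes syndicated_urls.txt (one URL per line, hypothetical name).
Requires: pip install requests beautifulsoup4
"""
import requests
from bs4 import BeautifulSoup

def page_status(url: str) -> str:
    """Classify one page by its meta robots and rel=canonical tags."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    robots = soup.find("meta", attrs={"name": "robots"})
    noindex = bool(robots and "noindex" in robots.get("content", "").lower())

    canonical = None
    for link in soup.find_all("link"):
        if "canonical" in (link.get("rel") or []):
            canonical = link.get("href")
            break

    if canonical and canonical != url:
        return f"canonicalized to {canonical}"
    if noindex:
        return "noindex only (still crawlable, still duplicate)"
    return "indexable duplicate"

if __name__ == "__main__":
    with open("syndicated_urls.txt") as f:
        for line in f:
            url = line.strip()
            if url:
                print(page_status(url), "->", url)
```

For syndicated copies, the canonical would point at the original publisher's URL (a cross-domain canonical), which Google treats as a strong hint rather than a strict directive.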
Hope this makes sense.