How to noindex lots of content properly: bluntly or progressively?
-
Hello Mozers !
I'm quite in doubt, so I thought: why not ask for help?
Here's my problem: I need to improve the SEO of a website that consists of a huge number (1 million+) of poor-content pages. Basically it's like a catalog, where you select a brand, then a product series, then the product, and finally fill out a form to make a request (sorry for the cryptic description, I can't be more precise).
Besides the classic SEO work, part of what (I think) I need to do is noindex some useless pages and rewrite the important ones with great content, but I'm quite hesitant about how to handle the noindexing.
There are around 200,000 pages with no visits in over a year, so I'd guess they're pretty much useless junk that would be better off noindexed. But the webmaster is afraid that noindexing that many pages will hurt the site's long tail (in case of future visits), so he wants to check the SERP position of every one of them and eliminate only those in the top 3 (for those, he thinks, there's no hope of improvement). I think that would waste a lot of time and resources for nothing, and I'd advise noindexing them regardless of position.
The problem is I lack the experience to be sure of it, or of how to do it: is it wise to bluntly noindex 200,000 pages all at once (isn't that a bad signal for Google?), or should we do it progressively over a few months?
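(To give an idea of the triage I have in mind: nothing fancier than filtering an analytics export for zero-visit URLs. A rough sketch in Python, where the column names and URLs are made up for illustration:)

```python
import csv
from io import StringIO

# Hypothetical analytics export: one row per URL with its visit count
# over the last 12 months. Column names are invented for illustration.
analytics_csv = """url,visits_last_12_months
/brands/acme/series-100/product-1,0
/brands/acme/series-100/product-2,37
/brands/foo/series-9/product-3,0
"""

def noindex_candidates(csv_text):
    """Return the URLs with zero visits in the last year (the 'useless junk')."""
    reader = csv.DictReader(StringIO(csv_text))
    return [row["url"] for row in reader
            if int(row["visits_last_12_months"]) == 0]

candidates = noindex_candidates(analytics_csv)
print(candidates)  # the pages that would get the noindex treatment
```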
Thanks a lot for your help!
Johann.
-
Sorry you're stuck in that spot. I really would be worried that this "fix" would make life worse for everyone, but it's tough to come up with solutions that don't seem like band-aids. Best you may be able to do is get more aggressive about the de-indexation, focus on improving some core content, and maybe re-work the internal linking to focus more on key pages (and spread internal PR a bit less thinly).
-
Yeah, I get what you're saying and I totally agree: a radical overhaul is what I recommended from the start, but I only got a no-can-do response... until now. But their "yes" is more like:
-
Ok, rebuild our website entirely, just don't touch our website.
-
Errr what ?
Anyway, so a similar domain name and brand was in fact a bad idea.
Thanks a lot for your input (and your awesome Moz posts!)
Cheers,
Johann.
-
-
Given their history, two domains with overlapping content and a similar name seems like a terrible idea to me, to be blunt. If this really is a Panda issue, then you're potentially going to aggravate the situation and send out even more low quality signals.
It's hard to speculate, but I've seen a few situations where what seemed like Panda turned out to be something deeper. Directory clients have been hit hard, for example, as Google just seems to be devaluing the entire space (along with price comparison sites, many types of affiliates, etc.). I'm not talking about spammy sites, even, but the ones that provide some original value. It's just that Google doesn't see them as the end-supplier, and so they're getting discounted.
An end-run to a new domain isn't going to fix this. I strongly suspect that you've got something deeper going on that may take a radical overhaul of the main site and even the business/brand. I think it's better to accept that now than continue a gradual decline over the next couple of years.
-
Hi everyone,
Some news on this story that may (or may not) be of interest to some of you (even if I can't give the domain name), and a new question (I may also start another discussion for that one):
-
The website has lost a significant amount of traffic over the past year, even with the massive noindexing of 200,000 pages (I finally convinced him to do it, but it clearly wasn't enough): about a 40% gradual loss, with drops coinciding nicely with Panda update dates.
-
We've worked hard to offer a new section of interesting content (not quite a blog, but nearly) presenting original statistics on the niche with visual presentations, plus a bunch of related content: about a hundred pages total. It's a drop in the ocean, but it gained a bit of popularity, some nice links and good branding. I think it's probably the reason the website is still standing; it even earned a few top positions on new important keywords.
-
Last but not least, we've improved the user experience and bumped up our conversion rates, so the loss in traffic is partly compensated by the gains in conversion (not completely, though).
The site still drags nearly a million pages of thin content, and still takes a little hit with every Panda roll-out... So no recovery, but a controlled descent: it's still alive.
Now I've got the green light for a complete do-over: a rebuild with a completely new (lighter) structure and a new design. We're pumped full of ideas for great content and user experience, so it's going to be a fresh new start. BUT (there's always a but), the webmaster wants to keep the old website running while it's still alive, and I wonder if we can take a similar domain name to capitalize on the brand's popularity, like www.brand-domain.com instead of www.branddomain.com (in case it's not clear: the same domain name with a dash in it, so the brand stays recognizable). Is it going to look manipulative to Google to have two websites with nearly the same domain name, the exact same brand, and the same service (so the same keywords targeted)? Any other caveats?
(I know they're going to compete with each other, but they'll have different content, and it would be temporary: as soon as the new one reaches the old one's popularity, we'll prepare a proper redirect. It could be a month, it could be a year later.) Thanks for any input! I'll wait before starting a new discussion to avoid clutter ^^
Johann
-
-
Thanks a lot for your insight, Dr. Pete.
I'll sell him on the larger cut sooner or later. It's either that or I use a time machine to show him his future stats when Google releases the next Panda tweaks ^^
Option 1 is easier after all!
-
I wish I could convince people that more DOES NOT EQUAL better when it comes to index size. You'd think Panda would've been the nail in that coffin, but too many webmasters are still operating in 2005.
-
I've never seen an issue where a large-scale META NOINDEX caused Google to get suspicious. It's possible to NOINDEX the wrong pages and lose traffic, but Google generally doesn't get jumpy about it the way it would about a large-scale 301 redirect (where you might be PR-sculpting).
If these are really duplicates, canonical tags might be a better bet. Honestly, while I agree with Stephen 99.9%, if there's no glaring current issue, you could ease into it. Start with the worst culprits - obvious, 100% duplicates. That should be an easier sell, too. If you can't sell the larger cut, it's not going to matter.
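For reference, a canonical tag is a single line in the `<head>` of each duplicate variant, pointing at the preferred URL (the URLs below are invented for illustration):

```html
<!-- In the <head> of a near-duplicate variant page -->
<!-- Consolidates ranking signals onto the preferred version instead of
     removing the page from the index outright -->
<link rel="canonical" href="https://www.example.com/brands/acme/series-100/product-1" />
```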
-
Damn. Even after explaining that pages which generate no traffic now won't generate much more in the future, and giving an educated estimate of 0.05% potential future gains from keeping them versus the boatload of progress that noindexing them could mean for the website, I couldn't convince the webmaster to cut them out of the index...
Anyway, thanks for your help everyone!
-
Noindex ASAP.
Thumbs up for this.
It's not going to suddenly appear out of nowhere.
Ha ha... for sure!
-
Can you change the structure of the site, and perhaps see this as an opportunity...
(granted, lots of work required)
Adding another level of subcategories to separate the content further and allow better indexing?
-
If you block pages with robots.txt, search engines will never be able to read the follow tag. What I was suggesting is: don't use robots.txt, but use the meta tag "noindex,follow" to allow link juice to flow even though the pages aren't indexed.
Search engines can still follow links on pages that aren't indexed, but robots.txt tells them they're not allowed to crawl the page at all.
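To make the distinction concrete, the tag being discussed is a single line in each thin page's `<head>`. A minimal sketch:

```html
<!-- Meta robots tag, placed in the <head> of each thin page. -->
<!-- The page stays crawlable, so its links can still be followed and pass
     link juice, but the page itself is dropped from the index. -->
<meta name="robots" content="noindex, follow">
```

A robots.txt Disallow, by contrast, blocks crawling outright, so a tag like this would never even be read.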
-
Thanks for your replies.
Well, I'm not asking whether I should noindex those pages, I'm pretty sure I have to.
It's just that noindexing one fifth of a website in one go seems like it could look suspect to the search engines... So I wonder whether I should very carefully choose which ones to noindex and which to keep indexed even among unvisited pages, as the webmaster suggests, or do it slowly over a long period of time.
It's a big decision, and I'm appealing to your professional experience to keep me from making a potential mistake.
@AWCthreads: In the case of an e-commerce website your suggestion would seem reasonable, because a robots.txt block won't keep pages out of the index if there are links to them, but it would reduce the amount of duplicate content getting crawled. In my case, though, it wouldn't be enough, so the noindex meta tag seems to be my only option.
@Stephen: You're right, traffic can't appear out of thin air for these pages. Even if some of them do start getting visits, they'd still add up to a negligible share, I believe. But I don't have the experience to support that, or the numbers to prove it.
@Alan Mosley: I'll be sure to keep the follow directive on these pages even once they're no longer indexed; it'll still be valuable. And I guess it might keep things from looking too suspicious to the engines, wouldn't it?
-
First, remember that all pages in the index have PageRank, and you should use that link juice to your advantage.
http://perthseocompany.com.au/seo/tutorials/a-simple-explanation-of-pagerank
Blocking in robots.txt is clumsy: you'll have links pointing to pages that aren't in the index, pouring link juice into nowhere. You can instead add a meta "noindex, follow" tag, which allows link juice to flow in and out of those pages. If the pages are duplicates, I would remove them and fix the broken links that causes.
-
Remove them from the sitemap and noindex ASAP. He has no long tail from those pages; it's not going to suddenly appear out of nowhere.
-
Hi Johann. Excellent question, and a source of dispute for some people. I've not done it myself, but many people who want to noindex a large volume of pages will create a directory, move those files into it, and then block that directory in robots.txt.
Some people would ask why you'd want to hide a bunch of pages (product pages on an e-commerce site, say), since they won't be seen/shared/sold etc. My response would be: to prevent juice dilution on pages of little SEO value and help keep the juice directed at the 20-30% of the products that are making you the most money.
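For what it's worth, the directory approach described above boils down to a couple of lines in robots.txt (the directory name here is invented for illustration):

```text
# robots.txt at the site root
User-agent: *
# Block crawling of everything under the (hypothetical) /low-value/ directory.
# Caveat raised elsewhere in the thread: this blocks crawling, not indexing,
# so pages with inbound links can still show up as URL-only results.
Disallow: /low-value/
```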
I'm curious what others have to say about this and hope people weigh in on it.
-
Yeah, they're mostly duplicates (only about 10% of the text differs between variations)...
But nearly 80% of the pages are indexed, probably because the website has strong authority and a lot of visits: these pages are useful to people, just not useful to read ^^. That's why I'm so hesitant to noindex that much content, even though the website HAS to improve its quality-content ratio if it wants to stay around for the long run.
Maybe I'll start by testing your sitemap idea. Thanks for the suggestion.
-
Are the pages mostly duplicate content? Do you know how many have been indexed?
If it's a lot, then yes, noindexing them will make it look like your site has dropped a ton of content. But if it's duplicate content, I'd go for it anyway, as it will probably help things.
Alternatively, how about removing them from the sitemap instead? They may still get found, but at least you're giving Google a clue that those pages don't matter to you.
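To spell out the sitemap suggestion: leave the thin pages out of sitemap.xml, so only the pages you actually care about are listed. A minimal sketch with invented URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the pages you want search engines to prioritize -->
  <url>
    <loc>https://www.example.com/brands/acme/series-100</loc>
  </url>
  <!-- The zero-traffic product pages are simply omitted -->
</urlset>
```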