How to noindex lots of content properly: bluntly or progressively?
-
Hello Mozers!
I'm quite in doubt, so I thought: why not ask for help?
Here's my problem: I need to improve the SEO of a website made up of a huge number (1 million+) of poor-content pages. Basically it's like a catalog, where you select a brand, then a product series, then the product, and finally fill out a form to make a request (sorry for the cryptic description, I can't be more precise).
Besides the classic SEO work, part of what (I think) I need to do is noindex some useless pages and rewrite the important ones with great content, but I'm quite hesitant about how to handle the noindexing.
There are around 200,000 pages with no visits in the past year, so I figure they're pretty much useless junk that would be better off noindexed. But the webmaster is afraid that noindexing that many pages will hurt his long tail (in case of future visits), so he wants to check the SERP position of every single one of them and only eliminate those already in the top 3 (he thinks there's no hope of improvement for those). I think that would waste a lot of time and resources for nothing, and I'd advise noindexing them regardless of their position.
The problem is I lack the experience to be sure of it, or of how to do it: is it wise to bluntly noindex 200,000 pages all at once (isn't that a bad signal for Google?), or should we do it progressively over a few months?
Thanks a lot for your help!
Johann.
-
Sorry you're stuck in that spot. I really would be worried that this "fix" would make life worse for everyone, but it's tough to come up with solutions that don't seem like band-aids. The best you may be able to do is get more aggressive about the de-indexation, focus on improving some core content, and maybe rework the internal linking to focus more on key pages (and spread internal PR a bit less thinly).
-
Yeah, I get what you're saying and totally agree, since a radical overhaul is what I recommended from the start, but I only got a no-can-do response... until now. And their "yes" is more like:
-
Ok, rebuild our website entirely, just don't touch our website.
-
Errr, what?
Anyway, so a similar domain name and brand was in fact a bad idea.
Thanks a lot for your input (and your awesome Moz posts)!
Cheers,
Johann.
-
-
Given their history, two domains with overlapping content and a similar name seems like a terrible idea to me, to be blunt. If this really is a Panda issue, then you're potentially going to aggravate the situation and send out even more low quality signals.
It's hard to speculate, but I've seen a few situations where what seemed like Panda turned out to be something deeper. Directory clients have been hit hard, for example, as Google just seems to be devaluing the entire space (along with price comparison sites, many types of affiliates, etc.). I'm not talking about spammy sites, even, but the ones that provide some original value. It's just that Google doesn't see them as the end-supplier, and so they're getting discounted.
An end-run to a new domain isn't going to fix this. I strongly suspect that you've got something deeper going on that may take a radical overhaul of the main site and even the business/brand. I think it's better to accept that now than continue a gradual decline over the next couple of years.
-
Hi everyone,
Some news on this story that may (or may not) be of interest to some (even if I can't give the domain name), and a new question (I may also start another discussion for that one):
-
The website has lost a significant amount of traffic over the past year, even with the massive noindexing of 200,000 pages (I finally convinced him to do it, but it clearly wasn't enough): about a 40% loss, gradual, with the drops coinciding nicely with Panda updates.
-
We've worked hard to offer a new section of interesting content (not quite a blog, but close) that presents original statistics on the niche with visual presentations, plus a bunch of related content, about a hundred pages total. It's a drop in the ocean, but it gained a bit of popularity, some nice links, and good branding. I think it's probably the reason the website is still standing; it even reached a few top positions on important new keywords.
-
Last but not least, we've improved the user experience and bumped up our conversion rates, so the loss in traffic is partly compensated by the gains in conversion (not completely, though).
The site still drags along nearly a million pages of thin content and still takes a little hit with every Panda roll-out... So no recovery, but a controlled descent; it's still alive.
Now I've got the green light for a complete do-over: a rebuild with a completely new (lighter) structure and a new design. We're pumped full of ideas for great content and user experience, so it's going to be a fresh start. BUT (there's always a but), the webmaster wants to keep the old website alive as long as it lasts, and I wonder if we can take a similar domain name to capitalize on the brand's popularity, like www.brand-domain.com instead of www.branddomain.com (in case it's not clear: the same domain name with a dash in it, so the brand stays recognizable). Is it going to look manipulative to Google to have two websites with nearly the same domain name, the exact same brand, and the same service (so the same keywords targeted)? Any other caveats?
(I know they're going to compete with each other, but they'll have different content, and it would be temporary: as soon as the new site reaches the old one's popularity, we'll prepare a proper redirect. Could be a month, could be a year later.) Thanks for any input! I'll wait before starting a new discussion to avoid any clutter ^^
Johann
-
-
Thanks a lot for your insight, Dr. Pete!
I'll sell him on the larger cut sooner or later. It's either that or I use a time machine to show him his future stats when Google releases its next Panda tweaks ^^
Option 1 is easier after all!
-
I wish I could convince people that more DOES NOT EQUAL better when it comes to index size. You'd think Panda would've been the nail in that coffin, but too many webmasters are still operating in 2005.
-
I've never seen a case where a large-scale META NOINDEX made Google suspicious. It's possible to NOINDEX the wrong pages and lose traffic, but Google generally doesn't get jumpy about it the way they would about a large-scale 301 redirect (where you might be PR-sculpting).
If these are really duplicates, canonical tags might be a better bet. Honestly, while I agree with Stephen 99.9%, if there's no glaring current issue, you could ease into it. Start with the worst culprits - obvious, 100% duplicates. That should be an easier sell, too. If you can't sell the larger cut, it's not going to matter.
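To make the two options concrete, here's a minimal sketch of the tags in question (the URL is a made-up placeholder):

```html
<!-- Option 1: canonical tag — point a near-duplicate at its
     main version (the href here is hypothetical) -->
<link rel="canonical" href="https://www.example.com/brand/series/product" />

<!-- Option 2: meta robots tag — drop the page from the index
     while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```

Canonical consolidates duplicates into one indexed URL; noindex removes the page from results outright, so pick based on whether the page has a "main version" worth keeping.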
-
Damn. Even after pointing out that pages generating no traffic now won't generate much more in the future, and giving an educated estimate of 0.05% potential future gains from keeping them versus the boatload of progress noindexing them could mean for the website, I couldn't convince the webmaster to cut them out of the index...
Anyway thanks for your help everyone !
-
noindex asap
thumbs up for this
it's not going to suddenly appear out of nowhere
ha ha... for sure!
-
Can you change the structure of the site and perhaps see this as an opportunity...
(granted, lots of work required)
Adding another level of subcategories to separate the content further and allow better indexing?
-
If you block the pages with robots.txt, search engines will never read the follow tag. What I was suggesting is: don't use robots.txt, but use the meta tag "noindex, follow" to let link juice flow even though the pages aren't indexed.
Search engines can still follow the links on pages that aren't indexed, but robots.txt tells them they aren't allowed to crawl the page at all.
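As a side note: at this scale, the same signal can be sent without editing 200,000 templates, since the `X-Robots-Tag` HTTP response header is equivalent to the meta tag. A sketch for an Apache server config (the path pattern is hypothetical, and mod_headers must be enabled):

```apache
# Apache server/vhost config (LocationMatch is not allowed in
# .htaccess). Sends "noindex, follow" for every URL under a
# hypothetical /old-catalog/ section of the site.
<LocationMatch "^/old-catalog/">
    Header set X-Robots-Tag "noindex, follow"
</LocationMatch>
```

Same effect as the meta tag, but applied in one place instead of per page.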
-
Thanks for your replies.
Well, I'm not asking whether I should noindex those pages; I'm pretty sure I have to.
It's just that bluntly noindexing one fifth of a website in one go seems potentially suspect to the search engines... So I wonder whether I should very carefully choose which ones to noindex and which ones to keep indexed, even among unvisited pages, as the webmaster suggests, or do it slowly over a long period of time.
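(In case we do go progressive, I imagine it's mostly a matter of splitting the URL list into scheduled batches. A trivial sketch; the names and numbers are made up:)

```python
def weekly_batches(urls, weeks):
    """Split a list of URLs into roughly equal batches, one per week."""
    size = -(-len(urls) // weeks)  # ceiling division
    return [urls[i:i + size] for i in range(0, len(urls), size)]

# 200,000 junk pages rolled out over 8 weeks -> 8 batches of 25,000,
# each batch getting the noindex tag one week after the previous one
batches = weekly_batches([f"/product/{i}" for i in range(200_000)], weeks=8)
print(len(batches), len(batches[0]))  # 8 25000
```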
It's a big decision, and I'm appealing to your professional experience to keep me from making a potential mistake.
@AWCthreads: For an e-commerce website, your suggestion would seem reasonable, because a robots.txt won't keep the pages out of the index if there are links to them, but it would reduce the quantity of duplicate content. In my case, though, it wouldn't be enough, so the noindex meta tag seems to be my only option.
@Stephen: you're right, traffic can't appear out of thin air for these pages. Even if some of them did start getting visits, they would still add up to a negligible share, I believe. But I don't have the experience to support it or the numbers to prove it.
@Alan Mosley: I'll be sure to add the follow tag on these pages even if they're no longer indexed; it'll still be valuable. And I guess it might keep things from looking too suspicious to the engines, wouldn't it?
-
First, remember that all pages in the index have PageRank, and you should use that link juice to your advantage.
http://perthseocompany.com.au/seo/tutorials/a-simple-explanation-of-pagerank
Blocking in robots.txt is clumsy: you'll have links pointing to pages that aren't in the index, pouring link juice into nowhere. You can instead add a meta "noindex, follow" tag, which allows link juice to flow in and out of the pages. If the pages are duplicates, then I would remove them and fix the broken links that causes.
-
Remove them from the sitemap and noindex ASAP. He has no long tail from those pages; it's not going to suddenly appear out of nowhere.
-
Hi Johann. Excellent question, and a source of dispute for some people. I've not done it myself, but many people who want to noindex a large volume of pages will move those files into a dedicated directory and then disallow that directory in robots.txt.
Some people would ask why you'd want to hide a bunch of pages (product pages on an e-commerce site, say), since they will not be seen/shared/sold, etc. Well, my response would be: to prevent juice dilution on pages of little SEO value and help keep the juice directed at the 20-30% of the products that are making you the most money.
I'm curious what others have to say about this and hope people weigh in on it.
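To illustrate the setup being described (the directory name is invented; and note, as other replies in this thread point out, that this blocks crawling rather than just indexing):

```
# robots.txt at the site root -- disallow a hypothetical
# directory holding the pages to keep away from crawlers
User-agent: *
Disallow: /retired-products/
```

One file, one rule, instead of touching every page, which is why people reach for it despite its drawbacks.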
-
Yeah, they are mostly duplicates (only about 10% of the text differs between the variations)...
But nearly 80% of the pages are indexed, probably because the website has strong authority and a lot of visits: these pages are useful to people, just not useful to read ^^. That's why I'm so hesitant to noindex that much content, even though the website HAS to improve its ratio of quality content if it wants to last for the long run.
Maybe I'll start by testing your sitemap idea. Thanks for the suggestion.
-
Are the pages mostly duplicate content? Do you know how many have been indexed?
If it's a lot, then yes, noindexing them will make it look like your site has dropped a ton of content. But if it's duplicate content, I'd go for it anyway, as it will probably help things.
Alternatively, how about removing them from the sitemap instead? They may still get found, but at least you're giving Google a clue that those pages don't matter to you.
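A sitemap that does this is just one that omits the junk URLs. A minimal, entirely hypothetical example:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- list only the pages that matter; the unvisited
       junk pages are simply left out -->
  <url>
    <loc>https://www.example.com/brand/flagship-product</loc>
  </url>
</urlset>
```

It's a hint rather than a directive, but it costs almost nothing to try.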