Can too many "noindex" pages compared to "index" pages be a problem?
-
Hello,
I have a question for you: our website virtualsheetmusic.com includes thousands of product pages, and due to Panda penalties in the past, we have noindexed most of the product pages hoping for some sort of recovery (not seen yet, though!). So, currently we have about 4,000 "index" pages compared to about 80,000 "noindex" pages.
Now, we plan to add an additional 100,000 new product pages from a new publisher to offer our customers more music choices, and these new pages will also be marked as "noindex, follow".
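(For reference, by "noindex, follow" I mean the standard robots meta tag that each of these pages carries in its <head>, something like:

    <meta name="robots" content="noindex, follow">

so engines can still follow the links on the page but drop the page itself from the index.)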
At the end of the integration process, we will end up having something like 180,000 "noindex, follow" pages compared to about 4,000 "index, follow" pages.
Here is my question: can this huge discrepancy between 180,000 "noindex" pages and 4,000 "index" pages be a problem? Can this kind of scenario have any negative effect on our current natural search engine profile, or is this something that doesn't actually matter?
Any thoughts on this issue are very welcome.
Thank you!
Fabrizio
-
Julian, we sell digital sheet music, and the additional 100,000 are products from the Alfred Music publishing company. Of course they will not be "high quality pages", but they are product pages, each one offering a piece of music. We are an e-commerce website; how can we avoid having product pages?! But of course, as Wesley said above, we can improve the quality of each product page's content by adding more custom information for each product, increasing user reviews, etc.
Other suggestions?
-
Thank you Wesley, yes, I think you are right. Our business is really suffering too much without the traffic that used to come from the now-"noindex" pages, and after many months we still don't see a recovery. The best approach would probably be to keep the pages in the index and differentiate them as much as we can.
Thank you!
-
Panda is probably the worst penalty to have. Very few sites ever recover, even though site owners have spent a lot of time, effort, and money trying to solve it. E.g. http://searchengineland.com/google-panda-two-years-later-losers-still-losing-one-real-recovery-149491
In this video, at about 12:43, Matt Cutts is clear: if you think it's low quality, 404 it; in other words, delete it.
May I ask why you want to keep these 180,000 pages live? And why are you planning to add another 100,000 pages? Surely they can't be high quality pages?
-
Fabrizio, as far as I know Google Panda is now part of the standard Google algorithm and won't be a periodic event anymore. Penguin still is, though.
If your product pages count as duplicate content according to Google, see if you can do something about that instead of noindexing them. Is there no way you can update the products so they display a more prominent description? I understand that doing it manually isn't a possibility, because there are way too many products for that to be an option.
I did notice that a lot of your product pages share a standard text:

"This item includes: PDF (digital sheet music to print), Scorch files (for online playing, transposition and printing), Videos, MIDI and Mp3 audio files (including Mp3 music accompaniment files)*
Genre: classical
Skill Level: medium"

Since this is basically the only text on a lot of pages, I think it's a big part of the problem. Maybe you can change this text so it looks different for every product?
Try tools like http://www.plagspotter.com/ to find the duplicate content and see which solution is best for your specific problem.
I hope I helped; if you need more help, let me know.
-
I understand what you mean and I agree with you in general, but specifically for our own website, I have no idea who put that link on that page, which is, by the way, a "nofollow" link. We never built links; all our incoming links are either natural or from our own affiliates. I don't see much of "that stuff" in our backlink profile... am I in error?
Anyhow, yes, we are aware the situation is quite complex. Thank you again.
-
I actually looked at the competitors ranking #3 and #4 for the phrase "download sheet music", since you're ranking 5th. Either way, it's not a matter of too much or too little. It's how much of the link profile is authentic vs. how much is made up of stuff like this:
http://www.dionneco.com/2011/02/love-is-a-parallax/
That's what I meant by fake links.
I think what you may be missing is how complex the situation really is. There's a lot more to be considered than a number in Open Site Explorer, which actually covers only a portion of what's really out there.
You may also want to look at changes you can make on-site. I'm a firm believer that proper HTML, accessibility, UX and all that really matter.
-
Thank you Takeshi, I think you got the problem right. The "crawling" side of the issue is something I was thinking about too!
We are actually working on every aspect of our website to improve its content, because we have suffered a lot from Panda over the past two years. Here is the strategy we have been following since March:
1. "noindexing" most of our thin or almost-duplicate content to get it removed from the index
2. Improve our best content and differentiate it as much as we can with compelling content (this takes a long time!)
3. Consolidating similar pages with the use of canonical tags.
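For point 3, as an illustration, this is roughly the kind of tag we are adding to the <head> of each near-duplicate page, pointing at the page we want the signals consolidated on (the URL is just a made-up placeholder, not a real page):

    <link rel="canonical" href="https://www.virtualsheetmusic.com/violin/some-piece/">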
To tackle the "slower crawling" problem you have highlighted here, do you think it would be better for us to stop engines from crawling those pages altogether via robots.txt once they have been removed from the index? Would that solve the crawl issue? I could do that at least with the 100,000 new product pages we plan to add!
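Something like this in robots.txt is roughly what I have in mind (the /new-catalog/ path is a hypothetical placeholder for wherever the new product pages would live):

    User-agent: *
    Disallow: /new-catalog/

One caveat I'm aware of: a page blocked by robots.txt can't be crawled at all, so engines would never see its "noindex" tag. That's why I would only block the pages after they have already been dropped from the index.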
Thank you!
-
Wesley, that's because of being penalized by Panda several times in the past... so we are trying the "clean-up" strategy with the hope of being "de-penalized" by Panda at the next related algorithm update. It looks like we had too many "thin" or "almost duplicate" pages... that's why we removed so many pages from the index! But if we don't see improvements in the coming 1-2 months, I guess we'll put the product pages back in the index, because our business is suffering a great deal!
-
Colin, what do you mean by "fake links" exactly? Our link profile actually looks to be in better shape than our main competitors':
virtualsheetmusic.com (our site): links: 614,013 root domains: 2,233
sheetmusicplus.com (competitor): links: 5,322,596 root domains: 6,149 (worse than our profile!)
musicnotes.com (competitor): links: 6,527,429 root domains: 2,914 (much worse than our profile!)
Am I missing anything?
-
The discrepancy between noindexed and indexed pages is not in itself a problem. However, having all those pages will present a challenge to Google in terms of crawling. Even though the pages won't be indexed, Google will still need to spend some of your limited crawl budget crawling them.
Also, to recover from Panda it's necessary not only to noindex duplicate content, but also to improve your indexed content. That means things like consolidating similar pages into one page, writing unique content for your pages, and getting unique user-generated content such as reviews.
-
Why would you want to noindex your product pages? They seem like exactly the kind of pages you want to get found on.
There shouldn't be a problem with the ratio of "index" vs. "noindex" pages, except that you won't get found on the noindex ones, and product pages tend to be the kind of pages you REALLY want to be found on.
I think you should rethink your strategy to recover from the penalties.
Try to find out where exactly the penalties came from and fix the errors in that area of your website.
-
Can't say I've been in that situation, but search engines seem to interpret that tag as an on/off switch. And I think you probably know that your problems aren't related to, or able to be solved by, robots meta tags.
You need fewer fake links. OSE finds well over half a million links from 3K root domains to your site. Look at your competitors: a few thousand links from a handful of domains.
It's a shame, because it seems like the internet wanted to make you the authority naturally. You've got a handful of really solid links coming in. If you could shed the spam somehow, you'd be doing a lot better.
So yeah, stating the obvious, I know. Best of luck to you, and I hope the site recovers!