To noindex or not to noindex
-
Our website lets users test whether any given URL or keyword is censored in China. For each URL and keyword that a user looks up, a page is created, such as https://en.greatfire.org/facebook.com and https://zh.greatfire.org/keyword/freenet. From a search engine's perspective, all of these pages look very similar. For this reason we have implemented a noindex rule: only pages for highly ranked websites are allowed to be indexed, and all other URL pages are tagged as noindex (for example https://en.greatfire.org/www.imdb.com) - a rough sketch of this rule is included below the options. However, we are not sure this is a good strategy, so we are asking: what should a website with a lot of similar content do?
- Don't noindex anything - let Google decide what's worth indexing and not.
- Noindex most content, but allow some popular pages to be indexed. This is our current approach. If you recommend this one, we would like to know what we can do to improve it.
- Noindex all the similar content. In our case, only let overview pages, blog posts, and other pages with unique content be indexed.
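For clarity, here is a minimal sketch of the kind of rule we currently apply (the rank data, the threshold, and the exact "noindex, follow" directive below are illustrative assumptions rather than our actual implementation):

```python
# Illustrative only: a looked-up domain gets an indexable page only if it is
# "highly ranked" by some popularity measure; everything else is noindexed.

POPULARITY_RANK = {            # hypothetical rank data (e.g. from a traffic-rank list)
    "facebook.com": 2,
    "www.imdb.com": 550,
    "some-small-site.net": 250000,
}
INDEX_THRESHOLD = 100          # hypothetical cutoff for "highly ranked"

def robots_meta_for(domain):
    """Return the robots meta tag to emit in the <head> of the URL test page."""
    rank = POPULARITY_RANK.get(domain)
    if rank is not None and rank <= INDEX_THRESHOLD:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'

print(robots_meta_for("facebook.com"))   # popular site: indexed
print(robots_meta_for("www.imdb.com"))   # below the cutoff: noindexed, as in the example above
```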
Another factor in our case is that our website is multilingual. All pages are available (and equally indexed) in Chinese and English. Should that affect our strategy?
References:
https://zh.greatfire.org
https://en.greatfire.org
https://www.google.com/search?q=site%3Agreatfire.org
-
1. Yes - if you noindex all but 20 pages, those 20 pages would get a boost in rankings. You would end up losing the long-tail searches from those other thousands of pages, so you'll need to do some cost/benefit analysis on that.
2. You'll need to do a cost/benefit analysis on this one too. Are most of the visitors to your site searching in Chinese or English? Are your search terms mainly in Chinese or mainly in English? Are your Chinese-speaking visitors more likely to want to visit the zh. subdomain?
You could publish 20 to 50 pages on each subdomain, and then focus on doing some link building. If you have strong rankings across those 40 to 100 pages, then you could start adding more pages slowly over time.
-
Nope, no need to include the noindex tag. Adding the canonical is an indication to Google of which original pages the search engine should index, so all pages other than the category pages will still be crawled automatically.
-
Hi Moosa. Thanks very much for your reply and great suggestions. If I add a canonical tag on each URL page referencing the category page it belongs to, should I also add a noindex tag to it? Should all URL pages then have noindex tags, so that only category pages are allowed to be indexed?
-
Thanks for the suggestions. I have some follow-up questions. I would really like to know what you think about the following:
- You wrote that the "page rank will get shared to all of the pages that you have across your site". In general, does this mean that if I add noindex tags to all but a few pages, those pages will rank much higher? Currently thousands of pages are indexed. Is it correct to say that if only, say, 20 pages were indexed, that would greatly improve their ranking?
- The zh and en versions of the website have different templates, and most of the text content is also translated (with the main exception of old blog posts). We could add noindex to all of the zh website, or to everything except the main pages. Would you recommend that?
-
OK, I might sound completely stupid here as I have never come across this case before, but here is my hypothesis…
When someone searches for a keyword or URL, you add another field (maybe a checkbox) that represents the category of the search.
So, once the new URL is generated, it will come under that specific category automatically.
Customize the category pages so that they look different from each other.
Index the category pages and add a canonical tag to any newly generated URL within a category (see the rough sketch below). For example, if a new page is generated like www.yourwebsite.com/movies/ice-age-3/, that page should have a canonical tag pointing to http://www.yourwebsite.com/movies/
Why?
Creating category pages will allow more unique pages to get indexed in the SERPs without the duplicate content issue. Adding a canonical tag to all other URLs will tell Google that the category pages are the real pages it should consider.
This might give you more chances to earn search traffic from Google.
This is just my assumption of what I think should work!
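To make that concrete, here is a rough sketch of the tag I am describing, using the example URLs above (the helper function and site name are just illustrative):

```python
from urllib.parse import urlparse

SITE = "http://www.yourwebsite.com"

def canonical_tag_for(page_url):
    """Point a generated detail page at its parent category page."""
    path = urlparse(page_url).path.strip("/")     # e.g. "movies/ice-age-3"
    category = path.split("/")[0]                 # e.g. "movies"
    return '<link rel="canonical" href="{}/{}/">'.format(SITE, category)

print(canonical_tag_for("http://www.yourwebsite.com/movies/ice-age-3/"))
# -> <link rel="canonical" href="http://www.yourwebsite.com/movies/">
```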
-
Creating a page every time someone performs a search could probably spiral out of control pretty quickly. If you have a certain amount of 'page rank', based on all of the backlinks you have, that page rank will get shared across all of the pages on your site.
One way you could more naturally control what gets indexed is by what you link to from your homepage. For instance, if you track the most blocked big sites, as well as the most blocked keywords, and have those pages one link away from your homepage, you could expect those to get indexed naturally when your site is spidered (see the sketch below).
As you get more links from other sites, and your trust from the search engines and page rank grows, you should be able to support more pages getting indexed across your site.
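Here is a hypothetical sketch of what that could look like, assuming you track the most looked-up items somewhere (the data source and names below are made up for illustration):

```python
# Keep the most looked-up / most blocked pages one click from the homepage
# so they get crawled and indexed naturally.

top_blocked = [
    ("facebook.com", "Facebook"),
    ("twitter.com", "Twitter"),
    ("keyword/freenet", "Freenet (keyword)"),
]

def homepage_links(entries, limit=20):
    """Render a simple 'most blocked' link list for the homepage."""
    items = [
        '<li><a href="https://en.greatfire.org/{}">{}</a></li>'.format(slug, label)
        for slug, label in entries[:limit]
    ]
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

print(homepage_links(top_blocked))
```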
There is also the issue of your site's content potentially being regarded as 'thin content', since many of the pages appear nearly identical to one another.
One question I had - I saw your site hosts both Chinese-language and English-language words, and checks whether those words are being filtered. Perhaps it would make more sense to only show the words in Chinese characters on the zh. subdomain, and the English words on the en. subdomain? Just a thought. Is there any difference between the zh and en subdomains, aside from the language of the template?
Really interesting website.