To noindex or not to noindex
-
Our website lets users test whether any given URL or keyword is censored in China. For each URL and keyword that a user looks up, a page is created, such as https://en.greatfire.org/facebook.com and https://zh.greatfire.org/keyword/freenet. From a search engine's perspective, all these pages look very similar. For this reason we have implemented a noindex function based on certain rules: only highly ranked websites are allowed to be indexed, and all other URLs are tagged as noindex (for example https://en.greatfire.org/www.imdb.com). However, we are not sure that this is a good strategy, so we are asking: what should a website with a lot of similar content do?
- Don't noindex anything - let Google decide what is and isn't worth indexing.
- Noindex most content, but allow some popular pages to be indexed. This is our current approach (a simplified sketch of the rule follows this list). If you recommend this one, we would like to know what we can do to improve it.
- Noindex all the similar content. In our case, only let overview pages, blog posts, and other pages with unique content be indexed.
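To make our current rule (option 2) concrete, each URL page's head ends up with markup along these lines (a simplified sketch; the exact popularity threshold and tag attributes here are illustrative, not our production code):
<!-- URL pages for highly ranked sites, e.g. https://en.greatfire.org/facebook.com: no robots meta tag, left indexable -->
<!-- all other URL pages, e.g. https://en.greatfire.org/www.imdb.com, get: -->
<meta name="robots" content="noindex" />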
Another factor in our case is that our website is multilingual. All pages are available (and equally indexed) in Chinese and English. Should that affect our strategy?
References:
https://zh.greatfire.org
https://en.greatfire.org
https://www.google.com/search?q=site%3Agreatfire.org
-
1. Yes - if you noindex all but 20 pages, those 20 pages would get a boost in rankings. You would end up losing the long-tail searches from those other thousands of pages - so you'll need to do some cost/benefit analysis on that.
2. You'll need to do a cost/benefit analysis on this one too. Are most of the visitors to your site searching in Chinese or English? Are your search terms mainly in Chinese or mainly in English? Are your Chinese-speaking visitors more likely to want to visit the zh. subdomain?
You could publish 20 to 50 pages on each subdomain, and then focus on doing some link building. If you have strong rankings across those 40 to 100 pages, then you could start adding more pages slowly over time.
-
Nope, no need to include the noindex tag. Adding a canonical is an indication to Google of which original pages the search engine needs to index and crawl, so all pages other than the category pages will still be crawled automatically.
-
Hi Moosa. Thanks very much for your reply and the great suggestions. If I add a canonical tag to each URL page, referencing the category page it belongs to, should I also add a noindex tag to it? Should all URL pages then have noindex tags, so that only category pages are allowed to be indexed?
-
Thanks for the suggestions. I have some follow-up questions. I would really like to know what you think about the following:
- The "page rank will get shared to all of the pages that you have across your site". In general, does this mean that if I add noindex tags to all but a few pages, they will be ranked much higher? Currently thousands of pages are indexed. Is it correct to say that if only say 20 pages were indexed that would greatly improve their ranking?
- The zh and en versions of the website have different templates, and most of the text content is also translated (the main exception being old blog posts). We could add noindex to all of the zh website, or to all of it except the main pages. Would you recommend that?
-
OK, I might sound completely stupid here as I have never come across this case before, but here is my hypothesis…
When a user searches for a keyword or URL, give them another field (maybe a checkbox) that represents the category of the search.
So, once the new URL is generated, it will come under that specific category automatically.
Customize the category pages so that they look different from each other.
Index the category pages and add a canonical tag to any newly generated URL within a category. For example, if a new page is generated at www.yourwebsite.com/movies/ice-age-3/, that page should have a canonical tag pointing to http://www.yourwebsite.com/movies/.
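For illustration, that tag would look something like this (hypothetical markup based on the example URLs above, not taken from any real site):
<!-- in the <head> of www.yourwebsite.com/movies/ice-age-3/ -->
<link rel="canonical" href="http://www.yourwebsite.com/movies/" />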
Why?
Creating category pages will allow more unique pages to get indexed in the SERPs without the duplicate content issue. Adding a canonical tag to all of the other URLs tells Google that the category pages are the real pages it should consider.
This might give you more chances to earn search traffic from Google.
**This is my assumption of what I think should work!
-
Creating a page every time someone performs a search could probably spiral out of control pretty quickly. If you have a certain amount of 'page rank', based on all of the backlinks you have, that page rank gets shared across all of the pages on your site.
One way you could more naturally control what gets indexed is by what you link to from your home page. For instance, if you track the most blocked big sites, as well as the most blocked keywords, and have those pages one link away from your homepage, you could expect them to get indexed naturally when your site is spidered.
As you get more links from other sites, and your trust with the search engines and your page rank grow, you should be able to support more pages getting indexed across your site.
There is also the issue of your site's content potentially being regarded as 'thin content', since many of the pages look much the same from page to page.
One question I had - I saw your site hosts both Chinese language words and English language words, and checks whether those words are being filtered. Perhaps it would make more sense to only show the words in Chinese characters on the zh. subdomain, and the English words on the en. subdomain? Just a thought. Is there any difference between the zh and en subdomains, aside from the language of the template?
Really interesting website.