Should I be deindexing pages with thin or weak content?
-
If I have pages that list product categories in alphabetical order, should I deindex those pages, keeping in mind that the pages have no content apart from product titles?
For example:
If I deindexed these pages, would I lose any authority passed through internal linking?
-
Cheers guys, thanks for clearing that up!
-
Hi - If you have too many thin pages on the site, the best approach, as Chris also suggested, is 'noindex, follow'.
There is no negative impact; it actually helps search engines understand the site hierarchy better, since they can still crawl everything but only index the pages that are full of content. The 'follow' directive will pass link authority on through the internal links; only the page itself will be deindexed from search engines.
It's also good in the sense that no user will land on pages with very little or no content, which helps avoid single-page bounces too.
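Concretely, 'noindex, follow' is just a robots meta tag placed in the head of each thin page (the comment describes the intent; the directive itself is the standard robots meta syntax):

```html
<!-- In the <head> of each thin alphabetical-listing page -->
<!-- "Don't index this page, but do follow (and pass authority through) its links" -->
<meta name="robots" content="noindex, follow">
```

If editing the page templates is awkward, the same directive can alternatively be sent as an `X-Robots-Tag: noindex, follow` HTTP response header.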
-
Reducing the indexed size of the site by removing those pages won't have any negative effect on your site.
-
Hi Chris, that's great!
So if I keep them followed, the link juice will still pass on. Do you think it will have a negative impact on the site as a whole by decreasing the number of pages indexed by Google, i.e. reducing the site size?
Thanks for the articles as well, very useful!
-
Jonathan,
If you 'noindex, follow' them, link juice will pass from upstream links through to the downstream links; but if you nofollow them, it won't.
This thread goes into some detail on the same topic: http://moz.com/community/q/how-google-treat-internal-links-with-rel-nofollow
Rand also wrote a pretty thorough guide on the fundamentals of PageRank sculpting that you might want to check out: http://moz.com/blog/google-says-yes-you-can-still-sculpt-pagerank-no-you-cant-do-it-with-nofollow
-
Hi Ruchi,
If you look at this website for an example:
http://www.campusexplorer.com/colleges/alphabet/j/
Now obviously, Google doesn't react well to pages with thin or weak content, so what I am asking is: would the value of deindexing these pages outweigh the benefit they currently receive from internal link authority?
-
Well, your query is a bit unclear. If you don't have any content on these pages, then why do you need them on your website?
And if these are categories, then each category should have a proper name.
And if you have pagination on your website, like album A, album B, album C, then you should use a canonical tag for the paginated results.
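As a sketch of that last suggestion (the URLs here are hypothetical placeholders), each paginated listing page would declare a canonical pointing at the preferred version of the series, such as a view-all page, so the paginated variants consolidate rather than compete:

```html
<!-- In the <head> of http://www.example.com/albums?page=2 (hypothetical URL) -->
<!-- Tells search engines the view-all page is the preferred version to index -->
<link rel="canonical" href="http://www.example.com/albums/view-all">
```

Whether to canonicalize to a view-all page or keep each paginated page self-canonical depends on your site; the tag syntax itself is the same either way.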