Medium-sized forum with thousands of thin-content gallery pages: Disallow or noindex?
-
I have a forum at http://www.onedirection.net/forums/ which contains a gallery with thousands of very thin-content pages. We currently have these photo pages disallowed for the main Googlebot via robots.txt, but we do allow the Google Images crawler access.
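Roughly speaking, the rules look something like this (the paths here are illustrative rather than our exact ones):

```
# Keep the thin gallery HTML pages out of the main crawl
User-agent: Googlebot
Disallow: /forums/gallery/

# Give the image crawler full access (an empty Disallow allows everything)
User-agent: Googlebot-Image
Disallow:
```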
Now I've been reading that we shouldn't really use disallow, and instead should add a noindex tag on the page itself.
It's a little awkward to edit the source of the gallery pages (and to keep any amends in place the next time the forum software gets updated).
What's the best way of handling this?
Chris.
-
Hey Chris,
I agree that your current implementation, while not ideal, is perfectly adequate for ensuring you don't have duplicate content or cannibalisation problems, while still allowing Google to index the UGC images.
You're also preventing Googlebot from seeing the user profile pages, which is a good idea, since many of them are very thin and mostly duplicate.
So, from a pure SEO perspective, I think you've done a good job.
However... I think you should also consider the ethical implications of potentially blocking the image googlebot as well. By preventing Google from indexing all those images of young girls fawning over the vacuous runners up of a televised talent show, you would undoubtedly be doing the world a great service.
-
Hi Chris, I second Jarno's opinion in this regard. If it is going to be a huge overhead to add the page-level blocking, you can rely on your current robots.txt setup. There is a small catch here, though. Even if you block content using the robots.txt file, if Google finds a reference to the blocked URLs elsewhere on the web, it can still index those URLs (it just can't crawl their content). In situations like this, page-level blocking is the way forward. So to fully prevent Googlebot from indexing your content, you should ideally be using the page-level robots meta tag or the X-Robots-Tag header.
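To give a rough idea, the page-level tag would look something like this in the <head> of each gallery page (a sketch rather than exact markup for your forum software):

```html
<!-- Page-level directive: keep this URL out of the index but still follow its links -->
<meta name="robots" content="noindex, follow">
```

The same directive can also be sent as an X-Robots-Tag: noindex HTTP header if editing templates is awkward. Either way, keep in mind that Googlebot can only see a page-level noindex if it is allowed to crawl the page, so the current robots.txt Disallow would need to be lifted for those URLs for the tag to take effect.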
There is more on this here: https://support.google.com/webmasters/answer/156449?hl=en
Hope it helps.
Best,
Devanur Rafi.
-
Chris,
If the noindex meta update is too complicated for you to add due to software issues etc., then I feel that your current method is the right way to go. Normally you would be absolutely right to prefer the page-level tag, for the simple reason that a directive on the page itself is a stronger signal than a robots.txt Disallow (provided Google can still crawl the page to see it). But if a software update overwrites the rules placed in your code, then you have to add them back manually after each and every update, and I'm not sure you want to do that.
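One way around that, if the server happens to be Apache and the gallery URLs follow a predictable pattern (both assumptions on my part), is to send the noindex as an HTTP header from the server config rather than the page templates, so a forum upgrade can't overwrite it. A rough sketch:

```apache
# Server config sketch (requires mod_headers); the URL pattern is hypothetical
<LocationMatch "^/forums/gallery/">
    Header set X-Robots-Tag "noindex"
</LocationMatch>
```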
regards
Jarno
Related Questions
-
Duplicate Page Content Issue
Hello, I recently solved the www / non-www duplicate issue for my website, but now I am in trouble with duplicate content again. This time something is happening that I cannot understand: in the Crawl Issues Report, I received Duplicate Page Content for http://yourappliancerepairla.com (DA 19) and http://yourappliancerepairla.com/index.html (DA 1). Could you please help me figure out what is happening here? By default, index.html is being loaded, and this is the only index.html I have in the folder. Yet it looks like the crawler sees two different pages with different DA... What should I do to handle this issue?
Technical SEO | kirupa
-
Duplicate Page Content for www and non-www. Help!
Hi guys, having a bit of a tough time here... Moz is reporting duplicate content for 21 pages on eagleplumbing.co.nz, however the reported duplicate is the www version of the page. For example: http://eagleplumbing.co.nz and http://www.eagleplumbing.co.nz are considered duplicates (see screenshot attached). Currently in Search Console I have just set the non-www version as the preferred version (I changed this back and forth twice today because I am confused!!!). Does anyone know what the correct course of action should be in this case? Things I have considered doing include: changing the preferred version to the www version in Webmaster Tools, or setting up 301 redirects using a WordPress plugin called Eggplant 301 Redirects. I have been doing some really awesome content creation and have created some good-quality citations, so I think this is the only thing affecting my rank. Any help would be greatly appreciated.
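For what it's worth, the kind of server-level rule I understand would do this (assuming Apache with mod_rewrite, and assuming non-www stays the preferred version; a sketch I haven't tested) is roughly:

```apache
# .htaccess sketch: 301 everything on the www host to the non-www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.eagleplumbing\.co\.nz$ [NC]
RewriteRule ^(.*)$ http://eagleplumbing.co.nz/$1 [R=301,L]
```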
Technical SEO | QRate
-
Pages with Duplicate Page Content Crawl Diagnostics
I have pages with Duplicate Page Content in my Crawl Diagnostics. Can you tell me how to solve this, or suggest some helpful tools? Thanks
Technical SEO | nomyhot
-
Ok to internally link to pages with NOINDEX?
I manage a directory site with hundreds of thousands of indexed pages. I want to remove a significant number of these pages from the index using NOINDEX and have 2 questions about this:
1. Is NOINDEX the most effective way to remove large numbers of pages from Google's index?
2. The IA of our site means that we will have thousands of internal links pointing to these noindexed pages if we make this change. Is it a problem to link to pages with a noindex directive on them?
Thanks in advance for all responses.
Technical SEO | OMGPyrmont
-
Search result pages - noindex but auto follow?
Hi guys, I don't index my search pages, and currently my pages are tagged with <meta name="robots" content="noindex">. Do I need to specify follow or will it automatically be done? Thanks Cyto
Technical SEO | Bio-RadAbs
-
NoIndex user generated pages?
Hi, I have a site, downorisitjustme (dot) com. It has over 30,000 pages in Google which have been generated by people searching to check whether a specific site is working or not, and then presumably posting a deep link to the results page on a message board or similar, which is why the pages have been picked up. Am I best to noindex the res.php page, where all the auto-generated content shows up, and leave only the main static pages available to be indexed?
Technical SEO | Wardy
-
Duplicate Content on SEO Pages
I'm trying to create a bunch of content pages, and I want to know if the shortcut I took is going to penalize me for duplicate content. Some background: we are an airport ground transportation search engine (www.mozio.com), and we constructed several airport transportation pages listing the providers in a particular area. However, the problem is that in a given region multiple of the same providers often serve the same places. For instance, NYAS serves both JFK and LGA, and obviously SuperShuttle serves ~200 airports. So this means every airport's page has the SuperShuttle box. All the provider info is stored in a database with tags for the airports they serve, and then we dynamically create the page. Good examples are:
http://www.mozio.com/lga_airport_transportation/
http://www.mozio.com/jfk_airport_transportation/
http://www.mozio.com/ewr_airport_transportation/
All 3 of those pages have a lot in common. Now, I'm not sure, but they started out working decently, and as I added more and more pages their efficacy went down on the whole. Does what I've done qualify as "duplicate content", and would I be better off getting rid of some of the pages or somehow consolidating the info into a master page? Thanks!
Technical SEO | moziodavid
-
What's the best URL format for an e-commerce product page?
We have over 2,000 non-branded experiences and activities sold through our website. The website is having a facelift, with a new look and a stronger focus on SEO. As part of this, I am keen to establish what best practice is for product-based URLs. I've researched the market and come up with a few alternatives that are used:
domain/category/subcategory/activity_name
domain/activity_name/category/subcategory/activity_reference
domain/generic_term/activity_reference/activity_name
domain/category/activity_location/activity_name
Activities are location-based, but the location can change (say once every 2 years). Activity names, category, subcategory and activity_reference rarely change. Are there any thoughts/research on the best method (if there is one)? Many thanks in advance for your insights.
Technical SEO | philwill