Can duplicate content issues be solved with a noindex robots meta tag?
-
Hi all
I have a number of duplicate content issues flagged in a recent crawl diagnostics report.
Would adding a robots meta tag (like the one below) to the pages I don't mind being left out of the index be an effective way to solve the problem?
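The example tag referred to above didn't survive in the post; a standard robots noindex meta tag, placed in the page's <head>, looks like this:

```html
<!-- Keeps this page out of the index; links on the page are still followed,
     since "follow" is the default when not specified -->
<meta name="robots" content="noindex">
```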
Thanks for any / all replies
-
This is an old question... and the answer is yes. In fact, a page blocked in robots.txt can still be indexed if that same page is linked from an external site. Check this old Webmaster Help thread: http://www.google.com/support/forum/p/Webmasters/thread?tid=3747447eb512f886&hl=en That is why it is always better to use the meta robots noindex if you really want to be sure a page is not indexed.
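To make the distinction concrete: a robots.txt rule like the sketch below (the path is hypothetical) only stops crawling, not indexing, so blocked URLs can still show up in results via external links - which is exactly why the meta noindex is the safer choice.

```text
# robots.txt at the site root: stops crawlers fetching /beta/,
# but does NOT stop those URLs being indexed via external links
User-agent: *
Disallow: /beta/
```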
-
Yes it would, but I would rather use the canonical tag: all pages have PageRank, and even weak pages help your site rank better. Google once released their PageRank formula; they have changed it many times since, but from testing we know that the main idea still holds true. Pages not in the index cannot add to your site's PageRank. Take a look at this page, it explains it very well: http://www.webworkshop.net/pagerank.html Use the calculator, it is very intuitive.
-
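For reference, the canonical-tag approach mentioned above is implemented with a link element in the duplicate page's <head> (the URL below is hypothetical):

```html
<!-- Points search engines at the preferred version of this content,
     consolidating ranking signals onto that URL -->
<link rel="canonical" href="http://www.example.com/preferred-page/">
```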
Using a noindex meta tag is one way to resolve duplicate content issues. If you take this approach, you most likely want the "noindex" directive alone, without "nofollow": you don't want to prevent Google from following the links on the page, only to stop the content from being indexed as a duplicate.
If you wish to include "follow" explicitly you can, but it is unnecessary since it is the default setting.
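Spelling that default out, the two tags below are treated the same way by search engines:

```html
<!-- "follow" is implied, so these two tags are equivalent -->
<meta name="robots" content="noindex">
<meta name="robots" content="noindex, follow">
```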
Related Questions
-
How to fix a thin content issue?
Hello! I've checked my website via Moz and received a "thin content" issue: "Your page is considered to have 'thin content' if it has less than 50 words." But I know for certain that we have 5 text blocks of unique content, and each block consists of more than 50 words. Do you have any ideas what may cause this issue? Thanks in advance, Yana
On-Page Optimization | yanamazault
-
Is there some reputed company/person which can fix the SEO issues for me?
One of my websites, http://forum32.com/, was ruling the roost shortly after it launched. We cover BTS (behind the scenes) etc. of the TV shows that my production company makes. But then it all went downhill. I do understand that my readership fluctuates with the number of shows I am doing and therefore covering, but for a show like "Qubool Hai", which has been running for the past 3 years and which we do almost two posts about, we rank nowhere. Could posting about this "too much" be a problem? Anyway, I have been breaking my head over this for the past six months and spent a lot of time trying to set this right, but: 1. I am running out of time, as I am about to start directing another show. 2. I am not even sure if what I am doing is right! Can any well-trusted person or company help me out of this situation? Here is a screenshot of my latest report from MOZ: https://infinit.io/_/jPcUTJP
On-Page Optimization | gmaxstudios
-
Can Robots.txt on Root Domain override a Robots.txt on a Sub Domain?
We currently have beta sites on sub-domains of our own domain. We have had issues where people forget to change the Robots.txt, and these non-relevant beta sites get indexed by search engines (nightmare). We are going to move all of these beta sites to a new domain whose root Robots.txt disallows everything. If we put fully configured Robots.txt files on these sub-domains (which are ready to go live and open for crawling by the search engines), is there a way for the Robots.txt in the root domain to override the Robots.txt in these sub-domains? Apologies if this is unclear. I know we can handle this relatively easily by changing the Robots.txt in the sub-domain on going live, but due to a few instances where people have forgotten, I want to reduce the chance of human error! Cheers, Dave.
On-Page Optimization | davelane.verve
-
Duplicate Content when Using "visibility classes" in responsive design layouts - an SEO problem?
I have text in the right column of my responsive layout which will show up below the principal content on small devices. To do this I use visibility classes on DIVs. So I have a DIV with unique text that is visible only at large screen sizes. I copied the same text into another DIV which shows up only on small devices, while the first DIV is hidden at that moment. Technically I have the same text twice on my page, so might this be detected as duplicate content, i.e. as spam? I'm concerned because hidden text on a page (via expandable/collapsible text blocks) will be read by bots, and in my case they will detect it twice. Does anybody have experience with this issue? Best, Holger
On-Page Optimization | inlinear
-
Noindex tags in WordPress
I'm receiving 4 duplicate content warnings from Roger. They are all "www.mysite/tag/tag-name". I'm using WordPress and have Yoast WordPress SEO installed. Should I set the tags to "noindex" in the plugin settings?
On-Page Optimization | brandzz
-
Is my blog simply duplicate content of my authors' profiles?
www.example.com/blog is the full list of blog posts by various writers. The list contains the title of each article and the first paragraph from the article. In addition to /blog being indexed, each author's contribution list is being indexed separately. It's not a profile, really, just a list of articles in the same title-and-paragraph format as the /blog page. So if /blog is a list of 10 articles written by two writers, I have three pages:
/blog/author1 is a list of 4 articles
/blog/author2 is a list of 6 different articles
/blog is a list of 10 articles (the 4+6 from the two writers)
Is this going to be considered duplicate content?
On-Page Optimization | Brocberry
-
Duplicate content on area-specific sites
I have created some websites for my company Dor-2-Dor. There is a main website where all of the information across the board lives (www.dor2dor.com), but I also have area-specific sites for our franchisees who run certain areas around the country (www.swansea.dor2dor.com or www.oxford.dor2dor.com). The problem is that the content on a lot of the pages is the same on all of them, for instance our FAQs page, special offers, etc. What is the best way to get these pages to rank well without the duplicate content issues getting them ranked down by search engines? Any help will be greatly received.
On-Page Optimization | D2DWeb
-
Site architecture to avoid duplicate content
Hi, I have lots of duplicate content and I need a better site architecture. http://www.furnacefilterscanada.com/ We are selling furnace filters. All furnace filters are sold in 50 different sizes, and each size comes in 3 different qualities: Bronze, Silver and Gold. Total: 150 products. Right now I have created many categories and subcategories for the furnace filter sizes. When the client picks a size, he ends up on a product page with 3 different options: Bronze, Silver and Gold. He can then compare the filters and select the one he wants to purchase. The problem is, it is not possible to provide different content for each filter: Gold has a description, Silver has another one, and so does Bronze. The only text that changes between the descriptions is the filter size. This makes for duplicate text descriptions, which is not good when you want your site indexed. The positive thing about 150 different products is the page titles, for example "16x25x4 furnace filters"; those exact terms get searched in Google. A new site architecture with 3 categories (Gold, Silver and Bronze) and 50 variants per product (filter sizes) might not be the best option, because no filter size would be indexed. Can you please help me find the best architecture from an SEO point of view? Also, what about the top navigation bar menu - what is the best way to use it? Right now it is used for Legal, Contact and Policy, and I feel it is a waste; those pages get less than 1% of clicks. It might be better to use it for categories, for example - what are your recommendations from an SEO point of view? Can I create an information page in the left navigation menu and include all the standard pages, like Policy, Legal...? If I do, will I get penalized by Google? Thank you for your help. We put lots of money into AdWords before, but now the next step is to grow organic traffic. I'm using SEOmoz tools, I read their new book, and I want to increase traffic. I just need your help. Thank you, BigBlaze
On-Page Optimization | BigBlaze2050