Can duplicate content issues be solved with a noindex robot metatag?
-
Hi all
I have a number of duplicate content issues arising from a recent crawl diagnostics report.
Would using a noindex robots meta tag on the pages I don't mind being left out of the index be an effective way to solve the problem?
Thanks for any / all replies
-
This is an old question... and the answer is yes. In fact, a page blocked in robots.txt can still end up indexed if that same page is linked from an external site. Check this old Webmaster Help thread: http://www.google.com/support/forum/p/Webmasters/thread?tid=3747447eb512f886&hl=en That is why it is always better to use the meta robots noindex tag when you want to be really sure a page is not indexed.
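The distinction above matters because robots.txt only blocks crawling, not indexing. A minimal sketch (the /blocked-page/ path is hypothetical):

```
# robots.txt: compliant bots won't crawl this URL,
# but it can still appear in the index if external sites link to it
User-agent: *
Disallow: /blocked-page/
```

A meta robots noindex tag, by contrast, is an instruction applied when the page is crawled, so the page must remain crawlable for it to work.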
-
Yes it would, but I would rather use the canonical tag. All pages have PageRank, and even weak pages help your site rank better. Google once released details of PageRank; since then they have changed it many times, but from testing we know that the main idea still holds true. Pages not in the index cannot add to your site's PageRank. Take a look at this page, it explains it very well: http://www.webworkshop.net/pagerank.html Use the calculator, it is very intuitive.
-
Using a noindex meta tag is one way to resolve duplicate content issues. If you take this approach, you most likely want only "noindex" and not "nofollow": you don't want to prevent Google from following the links on the page, just stop the content from being indexed as a duplicate.
You can include "follow" explicitly if you wish, but it is unnecessary since it is the default setting.
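The asker's original tag example did not survive, so here is an illustrative sketch of the meta robots tags described above, placed in the page's head:

```html
<!-- keep the page out of the index; links are still followed by default -->
<meta name="robots" content="noindex">
<!-- equivalent, with the default "follow" made explicit -->
<meta name="robots" content="noindex, follow">
```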
Related Questions
-
Can bots crawl this homepage's content?
The website is https://ashleydouglas.com.au/ I tried using http://www.seo-browser.com/ to see if bots could see the content on the site, but the tool was unable to retrieve the page. The mobile-friendly test rendered only some menu links, with no content or images. I also used Fetch and Render in Search Console; the results for 'how Google sees the page' and 'how a visitor sees the page' are the same, showing only the main header image. Anything below it isn't shown. Does this mean that bots can't actually read the content on the page past the header image? I'm not well versed in what's going on with the code. Why aren't the elements below the header rendering? Is it the theme? Plugins? Thank you.
-
Duplicate Page content | What to do?
Hello guys, I have some duplicate pages detected by Moz. Most of the URLs come from a registration process for users, so the URLs all look like this: www.exemple.com/user/login?destination=node/125%23comment-form What should I do? Should I add this to robots.txt? If so, how? What's the command to add in Google Webmaster Tools? Thanks in advance! Pedro Pereira
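For reference, the kind of robots.txt directive being asked about would look something like this (a hedged sketch based on the URL pattern in the question, not a recommendation):

```
# block crawling of the user login/registration URLs mentioned above
User-agent: *
Disallow: /user/login
```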
-
Duplicate content when using "visibility classes" in responsive design layouts - an SEO problem?
I have text in the right column of my responsive layout which shows up below the principal content on small devices. To do this I use visibility classes for DIVs: one DIV with unique text that is visible only on large screen sizes, and a copy of the same text in another DIV that shows up only on small devices, while the first DIV is hidden. Technically I have the same text twice on my page. Might this be detected as duplicate content or spam? I'm concerned because hidden text (as in expandable/collapsible text blocks) is read by bots, and in my case they will detect it twice. Does anybody have experience with this issue? Best, Holger
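The setup described above can be sketched with simple media-query visibility classes (the class names and breakpoint are illustrative, not taken from the original site):

```html
<style>
  /* hidden on small screens */
  @media (max-width: 767px) { .show-desktop { display: none; } }
  /* hidden on large screens */
  @media (min-width: 768px) { .show-mobile { display: none; } }
</style>
<div class="show-desktop">Sidebar text, visible on large screens</div>
<div class="show-mobile">Same text, repeated for small screens</div>
```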
-
Can we list URLs blocked by robots.txt on a website's sitemap page?
Hi, I need your help here. I have a website, and a few pages are created as country-specific versions (e.g. www.example.com/uk). I have blocked many of the country-specific pages in the robots.txt file. Is it advisable to list those URLs (blocked by robots.txt) on my website's sitemap (HTML sitemap page)? I really appreciate your help. Thanks, Nilay
-
Meta descriptions: better empty or with duplicate content?
I am working with a Yahoo store. Somehow all of the meta description fields were filled in with random content from elsewhere in the store. For example, a black cabinet knob product page might have the specifications for a drawer slide in its description field. I don't know how this happened; we had a programmer auto-populate certain fields to get them ready for product feeds, and it's possible something was screwed up then, but that was a long time ago. My question, regardless of how it happened: is it better to have these fields wiped entirely clean, or to have them populated with a duplicate of the text from the body? The site has about 6,500 pages, so I have made (and will make) custom descriptions for the more important pages, but the workload to do them all is too much. So: nothing, or duplicate content, for the pages that likely won't receive personal attention?
-
How Should I Fix Duplicate Content in Wordpress Pages
In GWMT I see Google found 41 duplicate content issues in my WordPress blog. I am using the Yoast SEO plugin to avoid this type of duplicate, but the problem persists. You can check the screenshot here: http://prntscr.com/dxfjq Please help.
-
Duplicate content issues with products page 1,2,3 and so on
Hi, we have this products page, for example a landing page:
http://www.redwrappings.com.au/australian-made/gift-ideas
and then we have the links to page 2, 3, 4 and so on:
http://www.redwrappings.com.au/products.php?c=australian-made&p=2
http://www.redwrappings.com.au/products.php?c=australian-made&p=3
In SEOmoz, these are recognized as duplicate page content. What would be the best way to solve this problem? One easy way I can think of is to nominate the first landing page as the 'master' page (http://www.redwrappings.com.au/australian-made/gift-ideas) and add canonical meta links on pages 2, 3 and so on. Any other suggestions? Thanks 🙂
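Under the approach proposed in the question, each paginated URL would carry a canonical link pointing at the nominated master page. A sketch using the URLs from the question (note this reflects the asker's own proposal, not a verdict on whether it is the best fix):

```html
<!-- placed in the head of products.php?c=australian-made&p=2 (and p=3, etc.) -->
<link rel="canonical" href="http://www.redwrappings.com.au/australian-made/gift-ideas">
```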