Duplicate Tag Content Mystery
-
Hello Moz Community!
I am also getting duplicate tag content errors, on pages like:
http://www.earnmoneywithgoogleadsense.com/tag/blog-post/
http://www.earnmoneywithgoogleadsense.com/tag/effective-blog-post/
The pages are the same. I have 100+ of these errors on the website, so how can I remove them? Do you have any tutorial on this? Can I change the canonical URL for all of them at once, or do I need to set it one by one? If you have any video on it, I would appreciate it.
-
In the end, with not too good a result, I have simply deindexed the tag archives.
-
Dear,
First of all, thanks for your reply. I want Google to index these tag archives, but when someone clicks on a tag archive like:
http://www.earnmoneywithgoogleadsense.com/tag/blog-post/
http://www.earnmoneywithgoogleadsense.com/tag/effective-blog-post/
it must lead to:
http://www.earnmoneywithgoogleadsense.com/7-latest-tips-to-make-effective-blog-post/
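In other words, clicking those tag URLs should behave something like this rough, untested sketch for the theme's functions.php (the tag slugs and the target URL are just taken from the links above):
// Rough sketch only: 301-redirect these two tag archives to the post itself.
add_action( 'template_redirect', function () {
    if ( is_tag( 'blog-post' ) || is_tag( 'effective-blog-post' ) ) {
        wp_redirect( 'http://www.earnmoneywithgoogleadsense.com/7-latest-tips-to-make-effective-blog-post/', 301 );
        exit;
    }
} );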
-
Of course, he could use robots.txt.
Yoast may be the easiest way (though exactly that setting sometimes doesn't work, depending on the installed plugins).
-
WordPress is good at creating duplicate content. I would check the official site to see if there is a way to disable the tag archive feature, as it appears to be redundant.
Andreas' recommendation would work as well: simply check whether you're on a tag archive page, and if so, output a noindex meta tag.
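For example, in header.php that check could look something like this (an untested sketch, assuming a standard WordPress theme where the meta tags are printed in the head section):
<?php if ( is_tag() ) : // tag archive pages only ?>
    <meta name="robots" content="noindex, follow" />
<?php endif; ?>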
You could also do this inside robots.txt:
User-agent: *
Disallow: /tag/*
This would effectively stop any pages under the "tag" directory from being crawled.
Hope that helps,
Don
-
I would simply add noindex to the taxonomy (tag) archives. That's easily done with Yoast; otherwise, edit header.php, check if it is a tag page, and if so output a noindex meta tag. I guess it is WordPress.
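If the Yoast taxonomy setting itself doesn't stick (as noted above, other plugins can interfere), Yoast also exposes a wpseo_robots filter, if I remember correctly, which could be hooked from functions.php. An untested sketch (verify the filter name against the Yoast documentation):
// Force noindex, follow on tag archives, overriding whatever Yoast would output.
add_filter( 'wpseo_robots', function ( $robots ) {
    if ( is_tag() ) {
        return 'noindex, follow';
    }
    return $robots;
} );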
Related Questions
-
Decline in traffic and duplicate content in different domains
Hi, 6 months ago my customer purchased their US supplier and moved the supplier's website to their e-commerce platform. When moving to the new platform, they copied the product descriptions from their site to the supplier's site, so now both sites have the same content on the product pages. Since then they have experienced a decrease in traffic of about 80%. They didn't implement canonical tags or hreflang. My customer's domain format is https://www.xxx.biz and the supplier's domain is https://www.zzz.com. The latter targets the US, and when someone from outside the US wants to purchase a product, they get a message that they need to move to the first website, www.xxx.biz. Both sites are in English. The old version of www.zzz.com, before the shift to the new platform, contained different product descriptions, and BTW, that old version is still live and indexed under a subdomain of www.zzz.com. My question is: what's the best thing to do in this case so that the rankings come back to higher positions and they get their traffic back? Thanks!
Technical SEO | | digital19740 -
Magento Duplicate Content help!
How can I stop the duplicate page content in my Magento store from being read as duplicate? I added the Magento robots file that I have used on many stores, and it keeps giving us errors. We have also enabled canonical links in the Magento admin. I am getting 3,616 errors and can't seem to get around it. Any suggestions?
Technical SEO | | adamxj20 -
Duplicate Content from Multiple Sources Cross-Domain
Hi Moz Community, We have a client who is legitimately repurposing, or scraping, content from site A to site B. I looked into it, and Google recommends the cross-domain rel=canonical tag described here: http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html The issue is that it is not a one-to-one situation. In fact, site B will have several pages of content from site A all on one URL. Below is an example of what they are trying to accomplish. EX - www.siteB.com/apples-and-oranges is made up of content from www.siteA.com/apples & www.siteB.com/oranges. So with that said, are we still at risk of getting hit for duplicate content? Should we add multiple rel=canonical tags to reflect both pages? What should our course of action be?
Technical SEO | | SWKurt0 -
Duplicate Content - Reverse Phone Directory
Hi, Until a few months ago, my client's site had about 600 pages. He decided to implement what is essentially a reverse phone directory/lookup tool. There are now about 10,000 reverse directory/lookup pages (.html), all with short and duplicate content except for the phone number and the caller name. Needless to say, I'm getting thousands of duplicate content errors. Are there tricks of the trade to deal with this? In nosing around, I've discovered that the pages are showing up in Google search results (when searching for a specific phone number), usually in the first or second position. Ideally, each page would have unique content, but that's next to impossible with 10,000 pages. One potential solution I've come up with is incorporating user-generated content into each page (maybe via Disqus?), which over time would make each page unique. I've also thought about suggesting that he move those pages onto a different domain. I'd appreciate any advice/suggestions, as well as any insights into the long-term repercussions of having so many dupes on the ranking of the 600 solidly unique pages on the site. Thanks in advance for your help!
Technical SEO | | sally580 -
Https Duplicate Content
My previous host was using shared SSL, and my site was also working over https, which I hadn't noticed previously. Now I have moved to a new server where I don't have any SSL, and my websites no longer work over https. The problem is that I have found Google has indexed one of my blogs, http://www.codefear.com, under the https version too. My blog traffic is continuously dropping, I think due to this duplicate content. Now there are two results, one with the http version and another with the https version. I searched around the internet and found 3 possible solutions:
1. No-index the https version
2. Use rel=canonical
3. Redirect the https versions with a 301 redirect
Now I don't know which solution is best for me, as the https version is no longer working. One more thing: I don't know how to implement any of these solutions. My blog is running on WordPress. Please help me overcome this problem, and after solving this duplicate issue, do I need to send a reconsideration request to Google? Thank you
Technical SEO | | RaviAhuja0 -
Advice on Duplicate Page Content
We have many pages on our website, and they all have the same template (we use a CMS); at the code level, they are 90% the same. But the page content, title, meta description, and image used are different for all of them. For example:
http://www.jumpstart.com/common/find-easter-eggs
http://www.jumpstart.com/common/recognize-the-rs
We have many such pages. Does Google look at them all as duplicate page content? If yes, how do we deal with this?
Technical SEO | | jsmoz0 -
What could be the cause of this duplicate content error?
I only have one index.htm and I'm seeing a duplicate content error. What could be causing this? (Screenshot attached: IUJvfZE.png)
Technical SEO | | ScottMcPherson1 -
Duplicate content
I'm getting an error showing that two separate pages have duplicate content. The pages are:
Help System: Domain Registration Agreement - Registrar Register4Less, Inc. (http://register4less.com/faq/cache/11.html)
Help System: Domain Registration Agreement - Register4Less Reseller (Tucows) (http://register4less.com/faq/cache/7.html)
These are both registration agreements, one for us (Register4Less, Inc.) as the registrar, and one for Tucows as the registrar. The pages are largely the same, but are in fact different. Is there a way to flag these pages as not being duplicate content? Thanks, Doug.
Technical SEO | | R4L0