Duplicate content issue with WordPress tags?
-
Would Google really discount duplicate content created by WordPress tags? I find it hard to believe, considering tags are on and indexed by default and the vast majority of users would not know to deindex them...
-
The harm is that Google may well class your site as having duplicate content issues. This won't always happen, but there is a chance. There are also a few other issues, which are covered well by Dan Petrovic (link at the bottom).
Google does indeed see millions of duplicate content issues, and it tries its hardest to rank only one version of that content.
You can always make your category pages, rather than your tags, the landing pages for your keywords. If you are already being found through tags, then don't deindex them, but there is a risk that a Panda data refresh or algorithm update may hurt your site.
-
But what's the harm in leaving it indexed? Surely Google sees millions and millions of duplicate content issues with WordPress blogs - it must ignore them. The average blogger does not know about duplicate content issues, so Google must be aware of this.
Also, if people are finding you via tags, then obviously it is a problem to deindex them - a problem that may outweigh duplicate content issues even if Google does penalize duplicate content on WordPress blogs.
-
You should noindex your tags, yeah. All the post snippets are just duplicated content. There is plenty of other stuff you should also noindex. If you haven't already, install the Yoast plugin. There are loads of guides online on how to set it up to block the correct pages.
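Plugins like Yoast do this by emitting a robots meta tag on the archive types you choose to block. A minimal sketch of that decision in Python - the set of archive types noindexed here is an assumption, following the advice above to noindex tags while keeping categories as the indexable landing pages:

```python
# Sketch: decide the robots meta value per WordPress archive type.
# Assumption: tag and other "thin" archives are noindexed, while posts
# and category landing pages stay indexable, per the advice above.

NOINDEX_TYPES = {"tag", "author", "date", "search"}

def robots_meta(page_type: str) -> str:
    """Return the robots meta content to emit for this page type."""
    if page_type in NOINDEX_TYPES:
        # "follow" still lets crawlers pass through the links on the page
        return "noindex, follow"
    return "index, follow"

for t in ("tag", "category", "post"):
    print(t, "->", robots_meta(t))
```

Note that "noindex, follow" drops the archive page from the index without cutting off crawl paths to the posts it links to.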
Related Questions
-
Duplicate content issue with ?utm_source=rss&utm_medium=rss&utm_campaign=
Hello, recently I was checking how my site content is getting indexed in Google, and today I noticed 2 links indexed on Google for the same article. This is the proper link - https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/ But why this URL was indexed, I don't know - https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/?utm_source=rss&utm_medium=rss&utm_campaign=hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims Could you please tell me how to solve this issue? Thank you
Technical SEO | Dinsh007
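The usual answer here is a rel=canonical on the article pointing at the clean URL, so the RSS tracking variant consolidates onto it. As a rough sketch of the same normalization done in code - treating every utm_* parameter as non-canonical and stripping it when generating or comparing URLs:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_tracking(url: str) -> str:
    """Drop utm_* query parameters so tracking variants collapse to one URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.lower().startswith("utm_")]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

url = ("https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-"
       "revival-insider-claims/?utm_source=rss&utm_medium=rss"
       "&utm_campaign=hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims")
print(strip_tracking(url))
# -> https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/
```

Non-tracking parameters survive the strip, so URLs that genuinely differ stay distinct.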
Canonical Tags - Do they only apply to internal duplicate content?
Hi Moz, I've had a complaint from a company whose feed we use to populate a restaurant product list. They are upset that on our product pages we have canonical tags linking back to ourselves. These are in place because we have international versions of the site. They believe that because they are the original source of the content, we need to canonical back to them. Can I please confirm that canonical tags are purely an internal duplicate content strategy? A canonical isn't telling Google that, of all the content on the web, this is the original source. It's just saying that, of the content on our domains, this is the one that should be ranked. Is that correct? Furthermore, if we implemented a canonical tag linking to Best Restaurants, it would de-index all of our restaurant listings and pages and pass the authority of these pages to their site. Is this correct? Thanks!
Technical SEO | benj20341
What is the process for allowing someone to publish a blog post on another site? (duplicate content issue?)
I have a client who allowed a related business to use a blog post from my client's site and repost it to the related business's site. The problem is the post was copied word for word. There is an introduction and a link back to the website, but not to the post itself. I now manage the related business as well, so I have creative control over both websites as well as SEO duties. What is the best practice for this type of blog post syndication? Can the content appear on both sites?
Technical SEO | donsilvernail0
Duplicate content issues arise 6 months after creation of website?!
Hi, I've had the same website for 6 months and fixed all original on-site issues a long time ago. Now this week I wake up and find 3 new errors: 3 of my pages have missing-title issues and missing-meta-description issues, and the Moz crawl says they all have duplicate content issues. All my rankings went down a lot as well. This site is static and doesn't even have a blog; everything is rel=canonical and non-indexed. It's 100% original content as well. So how can those issues arise 6 months later? All my titles and descriptions are there and non-duplicate, and the content is original and not duplicate as well. Is this a WordPress bug or a virus? Has anyone had this happen to them, and how do you fix it? Thanks a lot for your help! -Marc
Technical SEO | marcandre0
Duplicate content problem?
Hello! I am not sure if this is a problem or if I am just making something too complicated. Here's the deal: I took on a client who has an existing site in something called Homestead. Files cannot be downloaded, making it tricky to get out of Homestead. The way it is set up, new sites are developed on subdomains of homestead.com, and then your chosen domain points to this subdomain. The designer who built it has kindly given me access to her account so that I can edit the site, but this is awkward. I want to move the site to its own account. However, to do so, Homestead requires that I create a new subdomain and copy the files from one to the other. They don't have any way to redirect the prior subdomain to the new one. They recommend I do something in the HTML, since that is all I can access. Am I unnecessarily worried about the duplicate content consequences? My understanding is that now I will have two subdomains with the exact same content. True, over time I will be editing the new one. But you get what I'm sayin'. Thanks!
Technical SEO | devbook90
Duplicate Title Tags and Meta Desc even with the correct Canonical Tag
I show a large/growing number of duplicate title tags and duplicate meta descriptions in my Webmaster Tools. I look at both pages: Link 1 - http://www.thatsmytopper.com/wedding-cake-toppers/theme-cake-toppers/beach-theme-cake-toppers/where/color/petal-pink.html Link 2 - http://www.thatsmytopper.com/wedding-cake-toppers/theme-cake-toppers/beach-theme-cake-toppers/where/color/petal-pink/limit/16.html Both pages have the following canonical URL: <link rel="canonical" href="http://www.thatsmytopper.com/wedding-cake-toppers/theme-cake-toppers/beach-theme-cake-toppers.html"> Why does this still show up to Google as a duplicate title tag and description?
Technical SEO | bhalverson0
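One sanity check here is to confirm that both filter/pagination URLs really serve the identical canonical element. A small sketch using Python's stdlib HTML parser - the page snippet below is illustrative, not fetched from the site:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Capture the href of the first <link rel="canonical"> element seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

page = ('<head><link rel="canonical" href="http://www.thatsmytopper.com/'
        'wedding-cake-toppers/theme-cake-toppers/beach-theme-cake-toppers.html"></head>')
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)
```

If both pages do return the same canonical, the warnings usually just reflect that Google still crawls and reports the variant URLs it finds; the canonical is a hint applied at indexing time, not an instant suppression of the duplicates report.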
Duplicate Content - Just how killer is it?
Yesterday I received my ranking report and was extremely disappointed that my high-priority pages dropped in rank for a second week in a row for my targeted keywords. This is after running them through the grade card and getting As for each of them on the keywords I wanted. I looked at my Google Webmaster Tools and saw new duplicate content pages listed, which were the ones I had just modified to improve my keyword targeting. In my hastiness to get the keyword usage up, I neglected to prevent these descriptions from coming up when viewing the page with filter parameters, sort parameters, and page parameters... so Google saw these descriptions as duplicate content (since myurl.html and myurl.html?filter=blah are seen as different). So my question: is this the likely culprit for some pretty drastic hits to ranking? I've fixed this now, but are there any ways to prevent this in the future? (I know of canonical tags, but have never used them, and am not sure if this applies in this situation.) Thanks! EDIT: One thing I forgot to ask as well: has anyone inflicted this upon themselves? And how long did it take you to recover?
Technical SEO | Ask_MMM0
I have a ton of "duplicated content", "duplicated titles" in my website, solutions?
Hi, and thanks in advance. I have a JomSocial site with 1000 users. It is highly customized, and as a result of the customization some of the pages have 5 or more different types of URLs pointing to the same page. Google has indexed 16,000 links already, and the crawl report shows a lot of duplicated content. These links are important for some of the functionality, are dynamically created, and will continue growing. My developers offered to create rules in the robots file so a big part of these links don't get indexed, but a Google Webmaster Tools post says the following: "Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools." Here is an example of the links:
http://anxietysocialnet.com/profile/edit-profile/salocharly
http://anxietysocialnet.com/salocharly/profile
http://anxietysocialnet.com/profile/preferences/salocharly
http://anxietysocialnet.com/profile/salocharly
http://anxietysocialnet.com/profile/privacy/salocharly
http://anxietysocialnet.com/profile/edit-details/salocharly
http://anxietysocialnet.com/profile/change-profile-picture/salocharly
So the question is: is this really that bad? What are my options? Is it really a good solution to set rules in robots so big chunks of the site don't get indexed? Is there any other way I can resolve this? Thanks again! Salo
Technical SEO | Salocharly0
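Following the quoted guideline, one option instead of robots rules is to emit the same rel=canonical on every variant. As an illustration only - the URL patterns come from the question, but choosing /profile/<user> as the canonical form is an assumption - the mapping could look like:

```python
from urllib.parse import urlsplit

def canonical_profile(url: str) -> str:
    """Map the profile URL variants from the question onto /profile/<user>."""
    parts = urlsplit(url)
    segments = parts.path.strip("/").split("/")
    # Variants are /profile/<action>/<user>, /profile/<user>, or
    # /<user>/profile; the username is the last segment when the path
    # starts with "profile", otherwise the first segment.
    user = segments[-1] if segments[0] == "profile" else segments[0]
    return f"{parts.scheme}://{parts.netloc}/profile/{user}"

variants = [
    "http://anxietysocialnet.com/profile/edit-profile/salocharly",
    "http://anxietysocialnet.com/salocharly/profile",
    "http://anxietysocialnet.com/profile/preferences/salocharly",
    "http://anxietysocialnet.com/profile/salocharly",
]
print(canonical_profile(variants[0]))
# -> http://anxietysocialnet.com/profile/salocharly
```

Every variant collapses to the same canonical URL, which is exactly the signal the rel="canonical" link element would carry on each of those pages.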