Why do I get duplicate content errors just for tags I place on blog entries?
-
In the SEOmoz crawl diagnostics for my site, www.heartspm.com, I am getting over 100 duplicate content errors on links built from tags on blog entries. The original base blog entries are in my sitemap, which does not reference the tag pages.
Similarly, I am getting almost 200 duplicate meta description errors in Google Webmaster Tools, associated with links automatically generated from tags on my blog. I can better understand getting these errors from my forum, since the forum entries are not in the sitemap, but the blog entries are in the sitemap.
I thought the tags were only there to help people search by category. I don't understand why every tag becomes its own link. I can see how this falsely creates the impression of a lot of duplicate content.
As seen in GWT:
Pages with duplicate meta descriptions:
"Customer concerns about the use of home water by pest control companies." (6 pages): /category/job-site-requirements, /tag/cost-of-water, /tag/irrigation-usage, /tag/save-water, /tag/standard-industry-practice, /tag/water-use
"Pest control operator draws analogy between Children's Day and the state of the pest control industry": /tag/children-in-modern-world, /tag/children, /tag/childrens-day, /tag/conservation-medicine, /tag/ecowise-certified, /tag/estonia, /tag/extermination-service, /tag/exterminator, /tag/green-thumb, /tag/hearts-pest-management, /tag/higher-certification, /tag/higher-education, /tag/tartu, /tag/united-states
It's nice if you can get your tags to reflect the broad range of topics covered by your site, but sometimes it can give a visitor the perception that your content is a little thin when you have only one article for each topic.
Say you're reading an article and see it's tagged with a topic you're interested in; you click on that tag and get only the one article you've just read.
When deciding which tags to use, think about how your visitors might want to explore your content. Try to maintain an external perspective and use tags that are meaningful to your visitors.
If you've got the time, and enough traffic to give you the data, you could use your analytics to see how people use tags versus site search, for example.
Of course, as you point out, this doesn't have anything to do with the duplicate content problem.
-
Doug, thank you so much for responding quickly. I will have to explore the archive. I really haven't ever explored this section. You've given me a new place to research and I appreciate that.
As for reducing the number of tags, I don't understand how that is the issue. If I am doing something wrong, i.e. presenting the whole article via tags rather than a reference to it, that is the problem whether I have 2 tags or 20. I want to find a solution that addresses the core issue regardless of the number of tags created.
Or is there a different reason you suggest reducing the number of tags? I have heard around town that it is good not to have too many tags, but shouldn't I create them if a post covers a wide range of topics? For example, if I wrote one post about a trip to Europe, choosing not to divide it into 15 posts for 15 countries, wouldn't I want to create a tag for each of the countries I mention?
Doug, thanks again.
Gerry Weitz
-
As you mention, you use tags to give visitors the ability to browse articles that cover the same subject matter. Normal practice is to give the visitor a list of the articles that have been tagged with that term.
A list of articles shouldn't flag up as a duplicate of any of the other article pages.
However, I notice that on a couple of the tag pages I looked at, only one article is displayed, and it is displayed in full. This is obviously going to be a duplicate of the original article page.
I suspect that this is because for a number of the tags there is only one article to be returned.
I would look at how articles are presented in the "tag archive" view and perhaps display only an excerpt of each post with a link to the full article.
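If changing the archive template isn't practical, another common mitigation is to keep thin tag pages out of the index entirely. A minimal sketch, assuming you can edit the head section of tag-archive pages; the URL below is a hypothetical placeholder, not one taken from your site:

```html
<!-- On tag archive pages only: keep them crawlable but out of the index -->
<meta name="robots" content="noindex, follow">

<!-- Or, where a tag page shows a single full article, point search
     engines at the original post instead (hypothetical URL): -->
<link rel="canonical" href="https://www.heartspm.com/blog/example-post/">
```

Either approach removes the duplicate-content signal without changing what visitors see when they browse by tag.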
You may also want to think about the tags you are using so that you have fewer tags with only one article.
Hope this helps!