Duplicate content generated by keywords
-
Hello!
I am fairly new to SEO and Moz, so I really need your help understanding why some of my keywords generate duplicate content. In my blog posts I use various SEO keywords, and my Moz crawl analysis lists these keywords as duplicates: two or three different keywords point to the same article and are flagged as duplicate content.
I really don't understand how that's possible. Has this happened to you as well?
I'd highly appreciate any help.
Thank you
-
Happy to help!
-
Thanks a ton, Sander
-
Two solutions, in my opinion:
1. Hide the keywords and place a canonical URL on the existing tag pages pointing to the main tag, category, or blog post pages.
2. Keep the tagging and do the same as in option one (canonical) :D. Just make sure you don't have multiple URLs serving the same article.
Hope you can sort it out.
Sander -
Yes, that's right, this is the case.
Even if I hide the keywords, won't they still create URLs and generate duplicate content?
Thank you, Sander, I highly appreciate your reply
-
Is it the case that all three keywords point to a different URL (but the same article)?
I've had some clients working with blogs and 'tag' pages. They assign tags (in your case, keywords?) to an article. These tags are also accessible at their own URLs, and all of these URLs contain the same article.
e.g. www.example.com/blog/tag/keyword
This causes duplicate content. If this is the case, I would just work with category pages or with canonical URLs.
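If every tag URL like the one above serves the same article, one common fix is a rel=canonical link in the head of each tag page pointing at the article's preferred URL. A minimal sketch (the article URL here is a hypothetical placeholder):

```html
<!-- Served in the <head> of www.example.com/blog/tag/keyword -->
<link rel="canonical" href="https://www.example.com/blog/my-article/" />
```

Search engines treat rel=canonical as a strong hint rather than a directive, so it works best when the tag pages really are near-duplicates of the article.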
Sander
Related Questions
-
How to Avoid Duplicate Page Content errors when using WordPress Categories & Tags?
I get a lot of duplicate page errors in my crawl diagnostics reports from 'categories' and 'tags' on my WordPress sites. The post is one link, and then the content is 'duplicated' on the 'category' or 'tag' page that is added to the post. Should I exclude the tags and categories from my sitemap, or are these issues not that important? Thanks for your help, Stacey
Moz Pro | skehoe
-
How can I deal with tag page duplicate issues
The Moz crawler reported some duplicated issues. Many of them have to do with tags. Each tag has a link, and as some articles are under several tags, these come up as duplicate content. I read Dr. Peter's piece on canonical URLs, but it's not clear to me whether any of those approaches is the solution here. Perhaps the solution lies somewhere else? Maybe I need to block robots from these URLs (but that seems counter-productive for SEO). Thanks, Kovacs
Moz Pro | IamKovacs
-
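An alternative to canonical tags for tag pages is a robots meta tag that keeps the tag URLs out of the index while still letting crawlers follow the links on them. A minimal sketch, assuming your template lets you edit the head of tag pages:

```html
<!-- In the <head> of each tag page -->
<meta name="robots" content="noindex, follow" />
```

Unlike a robots.txt block, this lets search engines crawl the page and pass link equity through to the tagged articles, while dropping the tag page itself from search results.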
Duplicate content
Hi, since adding a blog to the site, SEOmoz has been reporting increased duplicate content warnings in the crawl error tool, such as /blog/category/easter being a duplicate of /blog/2013/03. Does this type of duplicate content matter? If so, how do you stop it? Also, pages and pages of duplicate content are reported from internal site search results, such as /catalogsearch/result/index/?q=mens+fashion being a duplicate of /catalogsearch/result/?q=mens+fashion. Does this need to be fixed, or since it's only internal site search, can it just be ignored? If it is an issue, what do you need to do to fix this type of duplicate content? Cheers, Dan
Moz Pro | Dan-Lawrence
-
Keywords Data Tool: Why are volume metrics unavailable for all of my keywords?
When I use the SEOmoz Pro tool for keyword research, I get a notice that the tool is getting improvements. But when I run my keywords, all of the volume metric data is unavailable. Why is this?
Moz Pro | seocoppercupimages
-
Is there a way to export keywords from SEOMOZ?
Hi, is there a way to export the keywords of a campaign in SEOmoz? And the other way around, to import them, including labels? Best
Moz Pro | ValerieSchmidt
-
Keyword Difficulty / Search Volume
Hello all, what do you think about using Keyword Difficulty divided by Search Volume as an alternative to keyword efficiency indexes? ETA: obviously this wouldn't be a hard-and-fast metric, but a general indicator to be taken into account along with other data.
Moz Pro | wattssw
-
Is there a way to specify what SEOmoz classes as duplicate content?
Hi all, I'm currently working through the laundry list of errors and warnings on our company's 24 websites. Due to the large number of on-page links and the sheer volume of products on our sites, much of the descriptive text is similar, following a strict pattern to best mention our USPs and the like. Of course we use a CMS, which means all the pages look the same and draw this information from the style sheet. Anyway, to the problem at hand: I have been tasked with reducing the error count in the SEOmoz admin panel. The problem is that SEOmoz reports duplicate page content for pages that are different but similar products, for example 35-, 45-, and 55-litre refrigeration units. Is there a way to specify what counts as duplicate content, or to make the duplicate content report more restrictive, so that everything has to be identical for the error to show? Any help is much appreciated; thanks in advance.
Moz Pro | cmuknbb
-
How to remove Duplicate content due to url parameters from SEOMoz Crawl Diagnostics
Hello all, I'm currently getting back over 8,000 crawl errors for duplicate content pages. It's a Joomla site with VirtueMart, and 95% of the errors are for URL parameters that customers can use to filter products. Google handles them fine under the Webmaster Tools parameter settings, but it's pretty hard to find the other duplicate content issues in SEOmoz with all of these in the way. All of the problem parameters start with ?product_type_. Should I try to use robots.txt to stop them from being crawled, and if so, what would be the best way to include them in robots.txt? Any help greatly appreciated.
Moz Pro | dfeg
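Since all of the problem parameters share the ?product_type_ prefix, a wildcard Disallow rule is one option; Google, Bing, and Moz's rogerbot support the * wildcard in robots.txt. Note that blocking crawling does not remove URLs that are already indexed. A sketch:

```
User-agent: *
# Block any URL whose query string begins with product_type_
Disallow: /*?product_type_
```

If the parameter can also appear after other parameters (e.g. ?page=2&product_type_...), a second rule such as Disallow: /*&product_type_ would be needed to catch those URLs too.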