GWT Duplicate Content and Canonical Tag - Annoying
-
Hello everyone!
I run an e-commerce site and I had some problems with duplicate meta descriptions for product pages.
I implemented rel=canonical to address this, but after more than a week the number of errors showing in Google Webmaster Tools hasn't changed, even though the site has been crawled three times since I added the canonical tags.
I didn't change any descriptions, as each error covers a set of pages that are identical: same products, same descriptions, just a different length/colour.
I'm pretty sure the rel=canonical has been implemented correctly, so I can't understand why these errors keep coming up.
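Since a mis-implemented tag turned out to be part of the problem here, an automated check can be worth more than eyeballing the source. Below is a minimal sketch (Python standard library only; the HTML string and example.com URL are placeholders, not the poster's real pages) that extracts every rel=canonical link from a page's HTML so you can confirm there is exactly one and that it points where you intend:

```python
# A minimal sketch for sanity-checking a page's canonical tag, using only
# the Python standard library. The HTML and URL below are placeholders.
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag in a page."""

    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attr = dict(attrs)
            if attr.get("rel", "").lower() == "canonical" and "href" in attr:
                self.canonicals.append(attr["href"])


def find_canonicals(html):
    """Return the list of canonical URLs declared in the given HTML."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonicals


page = '<html><head><link rel="canonical" href="https://example.com/product/widget/"></head></html>'
print(find_canonicals(page))  # ['https://example.com/product/widget/']
```

Feeding it the raw HTML of each colour/length variant should yield exactly one URL, and the same URL for every variant; zero results, or two variants reporting different canonicals, points to a botched implementation.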
Any suggestions?
Cheers
-
Thank you for your answers.
Yeah I checked the rel=canonical and I fixed it as it had been implemented badly.
I guess I have to wait and see!
Cheers
Oscar
-
Hello, it generally takes time. My personal observation is that even if your site gets crawled on a daily basis, page errors can take anywhere from 4 to 6 weeks to get removed from GWT.
So, as long as the implementation is correct, you can focus on correcting any other errors on the site; Webmaster Tools will catch up with this one soon.
-
As long as you did implement the rel=canonical tags correctly, the fix should register the next time the pages are crawled. Don't be dismayed that the data isn't showing up in your GWT yet, though; delays of 7 days or more are not unheard of.
-
Just wait a bit more; one week is not much yet. The 3 crawls don't mean Google will update the report immediately.
Related Questions
-
Tags, Categories, & Duplicate Content
Looking for some advice on a duplicate content issue we're having that definitely isn't unique to us. We are allowing all our tag and category pages, as well as our blog pagination, to be indexed and followed, but Moz is flagging it all as duplicate content, which makes sense, since it is the same content that appears on our blog posts. We decided in the past to keep these pages as they are, since the setup hasn't seemed to hurt us specifically and we hoped it would help our overall ranking. We haven't seen positive or negative signals either way, just the warnings from Moz. We are wondering whether we should noindex these pages and whether that could produce a positive change, but we're worried it might cause a big negative change as well. Have you confronted this issue? What did you decide, and what were the results? Thanks in advance!
Technical SEO | bradhodson
-
Duplicate content issue
Moz's crawl diagnostics tool is giving me a heap of duplicate content warnings for each event on my website: http://www.ticketarena.co.uk/events/Mint-Festival-7/ and http://www.ticketarena.co.uk/events/Mint-Festival-7/index.html. Should I use a 301 redirect on the second link? I was unaware that this was classed as duplicate content; I thought it was just the way the CMS was set up. Can anyone shed any light on this, please? Thanks
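A 301 from the /index.html variant to the directory URL is the usual fix for this pattern. As a minimal sketch (Python; `canonical_path` is a hypothetical helper name, and the URLs are the ones from the question), the mapping such a redirect performs is just:

```python
# Hypothetical sketch: map an ".../index.html" URL to the directory URL
# it duplicates, i.e. the target a 301 redirect would point to.
def canonical_path(url):
    suffix = "index.html"
    if url.endswith("/" + suffix):
        # Drop the filename, keeping the trailing slash.
        return url[: -len(suffix)]
    return url


print(canonical_path("http://www.ticketarena.co.uk/events/Mint-Festival-7/index.html"))
# http://www.ticketarena.co.uk/events/Mint-Festival-7/
```

In practice you would implement the redirect in the web server or CMS rather than in application code, but the mapping is the same.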
Technical SEO | Alexogilvie
-
Wordpress: Tags generate duplicate Content - just delete the tags!?
Asking around, people say tags are bad and spammy, and as far as I can see they generate all of my duplicate page content issues. The big question, then: why does Google so often prefer to show these tag URLs in the SERPs... they can't be too bad, right? :) After some research I found the "Term Optimizer" on Yoast.com, which should help with exactly this problem, but it seems it is no longer available. Is there perhaps another plugin that can help, or should I just delete all the tags from my blog and install permanent redirects? Is this the solution?
Technical SEO | inlinear
-
How do I deal with Duplicate content?
Hi, I'm trying SEOmoz and it's saying that I've got loads of duplicate content. We provide phone numbers for cities all over the world, so we have pages like https://www.keshercommunications.com/Romaniavoipnumbers.html, https://www.keshercommunications.com/Icelandvoipnumbers.html, etc., one for every country. The question is: how do I create pages for each one without it showing up as duplicate content? Each page is generated by the server, but it's impossible to write unique text for each one. Also, the competition seem to have done the same, yet Google lists all their pages when you search for 'DID numbers'. Look for DIDWW or MyDivert.
Technical SEO | DanFromUK
-
Rel=canonical for similar (not exact) content?
Hi all, We have a software product, and the SEOmoz tools are currently reporting duplicate content issues in the support section of the website. This is because we keep several versions of our documentation, covering the current version and the previous 3-4 versions as well, and there is a fair amount of overlap between them. When a new version comes out, we simply copy the documentation over, edit it as necessary to address changes, and create new pages for the new functionality. This means there is probably an 80% or so overlap from one version to the next. We were previously blocking Google (using robots.txt) from accessing previous versions of the software documentation, but this is obviously not ideal from an SEO perspective. We're now in the process of linking up all the old versions of the documentation to the newest version so we can use rel=canonical to point to the current version. However, the content isn't all exact duplicates. Will we be penalized by Google for using rel=canonical on pages that aren't actually exact duplicates? Thanks, Darren.
Technical SEO | dgibbons
-
Duplicate Content based on www.www
In trying to knock down the most common errors on our site, we've noticed we have an issue with duplicate content; however, most of the duplicate content errors are due to our site being indexed under www.www rather than just www. I am perplexed as to how this is happening. Searching through IIS, I see nothing that would be causing it, and we have no hostname records set up for www.www. Does anyone know of other things that may cause this and how we can go about remedying it?
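Whatever is introducing the doubled hostname, one stopgap is to 301 any www.www request back to the canonical host at the application layer while you track down the source. A minimal sketch (Python; example.com stands in for the real domain) of the rewrite such a redirect performs:

```python
# Hypothetical sketch: collapse a doubled "www.www." hostname to a single
# "www." before issuing a 301. example.com is a placeholder domain.
from urllib.parse import urlsplit, urlunsplit


def canonical_host_url(url):
    parts = urlsplit(url)
    host = parts.netloc
    # Strip repeated leading "www." labels down to a single one.
    while host.startswith("www.www."):
        host = host[len("www."):]
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))


print(canonical_host_url("http://www.www.example.com/page"))
# http://www.example.com/page
```

On IIS specifically, the same rewrite would normally live in a URL Rewrite rule rather than in code, but the transformation is the one shown above.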
Technical SEO | CredA
-
Help removing duplicate content from the index?
Last week, after a significant drop in traffic, I noticed a subdomain in the index with duplicate content. The main site and subdomain can be found below. http://mobile17.com http://232315.mobile17.com/ I've 301'd everything on the subdomain to the appropriate location on the main site. Problem is, site: searches show me that if the subdomain content is being deindexed, it's happening really slowly. Traffic is still down about 50% in the last week or so... what's the best way to tackle this issue moving forward?
Technical SEO | ccorlando