Showing Duplicate Content in Webmaster Tools.
-
About six weeks ago we completely redid our site. The developer put in 302 redirects, and Webmaster Tools was showing thousands of duplicate meta descriptions and titles, so I had the redirects changed to 301s. For a few weeks the duplicate counts slowly went down, but now they are right back where they started. Isn't the point of a 301 redirect to tell Google that content has been permanently moved? Why isn't Google picking this up? I knew it would take some time, but after a month I am right back where I started.
-
The site has been around for over 10 years, and we do not have any noindex tags. I am starting to see the count drop, but two months seems like a long time to wait to start seeing changes.
-
It shouldn't be duplicate, but for some reason it is. We wrote entirely new title tags and descriptions for everything. The count recently dropped a little more, so I hope we are moving in the right direction.
-
Yeah, 3,000 pages will take a while.
The title tags and meta descriptions are all new. Does that mean they are different from the old pages? If so, they shouldn't be flagged as duplicates.
-
Hmm... not totally sure why only half of your sitemap is being indexed. It's likely a mixture of factors, including (but not limited to) site age, noindex tags, Google not crawling deep enough, and a lack of inbound links to deep pages. From what I've seen, though, Google will eventually get to all or most of the pages in your sitemap and will eventually swap out your older pages for the 301'd/canonicalized pages that you want showing in the SERPs. Take the other tips people are sharing here into account too, because it may be a mix of our suggestions that ultimately works for you.
-
We have the sitemap set up OK. For some reason only 50% of my sitemap is being indexed. Any ideas?
-
Okay, in that case I wouldn't suggest manually fetching and submitting 3,000 links one by one, because that would be a complete waste of time. You could always make sure your sitemap is up to date and then add it under Optimization > Sitemaps, or choose to resubmit the current sitemap (hoping that this leads to Google re-crawling sooner), and/or fetch some changed pages and submit them to the index as "URL and all linked pages".
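Before resubmitting, it can also be worth sanity-checking the sitemap itself: a common cause of partial indexing is sitemap entries that 404 or redirect instead of returning 200. Here's a rough sketch of such a check in Python; the sitemap URL is a placeholder, and it assumes the third-party requests library:

import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(sitemap_url):
    # Fetch and parse the sitemap XML.
    tree = ET.fromstring(requests.get(sitemap_url).content)
    for loc in tree.findall(".//sm:loc", NS):
        url = loc.text.strip()
        # Don't follow redirects: sitemap URLs should answer 200 directly.
        resp = requests.get(url, allow_redirects=False)
        if resp.status_code != 200:
            print(resp.status_code, url, resp.headers.get("Location", ""))

check_sitemap(SITEMAP_URL)

Any URL this prints is one Google may skip, or treat as a duplicate of its redirect target.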
Otherwise I'd say wait... SEO is a long-term job. It's only been six weeks since you redid your site, and less than that since you switched everything over from 302s to 301s. Give it some more time and you'll see Google start showing the correct pages and removing any unnecessary duplicate content warnings.
-
There are over 3,000 for both meta descriptions and titles, so removing them individually could take some time.
The title tags and meta descriptions are all new. We wrote completely new ones when we redid the site.
We checked, and they are all 301s instead of 302s. Originally they were 302s, but we changed that. The count started dropping steadily and was down to about 1,300, but all of a sudden it jumped back up to 3,000-some, exactly what I was at before. So I am back to square one.
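If you want to double-check at scale that every old URL really answers with a 301 (and not a 302 that slipped through), a small script can do it. This is only a rough sketch in Python; the URL list is a placeholder, and it assumes the third-party requests library:

import requests

old_urls = [
    "http://www.example.com/old-page-1",  # placeholders
    "http://www.example.com/old-page-2",
]

for url in old_urls:
    # allow_redirects=False so we see the redirect status itself.
    resp = requests.get(url, allow_redirects=False)
    if resp.status_code != 301:
        print("Not a 301:", resp.status_code, url,
              "->", resp.headers.get("Location", "(none)"))

You could feed it the URL list from your old sitemap and confirm nothing is still answering 302 or 200.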
-
I have over 3,000 in both meta descriptions and titles, so that might take a while.
-
Hi,
I had a similar problem, but I found the duplicate title tags through another tool that is very similar to Google Webmaster Tools. Doing 301 redirects to the new pages is the right thing to do, but judging from that tool, the title tags and meta descriptions are still being picked up from the old pages. I don't know why either. There are two ways to approach this:

1. Go to your old pages and simply remove all the title tags and meta descriptions.

2. Go to your new pages and change the title tags and meta descriptions, optimizing them even more than what you used to have on your old pages. Add secondary keywords to the title tag? Change up the meta description with a CTA?

You should also check to make sure that all of them are 301 redirects and not 302s. Furthermore, do you have both www.yoursite.com and yoursite.com? These are considered two different sites and might be the reason.
Hope this helps.
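To quickly check the www vs. non-www situation, a rough sketch along these lines works; the hostnames are placeholders, and it assumes the third-party requests library:

import requests

for url in ("http://example.com/", "http://www.example.com/"):
    resp = requests.get(url, allow_redirects=False)
    print(url, "->", resp.status_code,
          resp.headers.get("Location", "(no redirect)"))

# Exactly one of the two should answer with a 301 pointing at the other.
# If both return 200, Google can treat them as two duplicate sites.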
-
Sometimes it can take a while for Google to show things correctly in the SERPs, and for a time you may wind up with duplication warnings because the older, cached page is still showing in the index alongside the new page that it redirects to. In cases like this I usually jump into Webmaster Tools, fetch the old page (which I know redirects to the new one), and do a crawl request/submit to index to ensure that Google sees it and indexes it correctly sooner than it might have naturally. Now, I'm not 100% certain that it will fix your issue, but I can say that it can potentially help and can't do any harm in this instance.
-
They all render the same page: one URL goes directly to the page it is supposed to, and the other redirects to the correct page. That is the behavior I want.
-
First you need to check whether the multiple URLs displayed under each title are working and rendering different pages.
Sometimes redirects have syntax issues. Post your results and we can discuss other options.
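To check for syntax issues, it helps to trace each redirect hop by hop instead of letting the browser follow it silently. A rough Python sketch; the start URL is a placeholder, and it assumes the third-party requests library:

import requests
from urllib.parse import urljoin

def trace(url, max_hops=10):
    # Follow a redirect chain one hop at a time, printing each step,
    # so 302s, loops, and missing Location headers are easy to spot.
    seen = set()
    while url and url not in seen and len(seen) < max_hops:
        seen.add(url)
        resp = requests.get(url, allow_redirects=False)
        print(resp.status_code, url)
        if resp.status_code not in (301, 302, 303, 307, 308):
            return  # final destination reached
        loc = resp.headers.get("Location")
        if not loc:
            print("Redirect with no Location header")
            return
        url = urljoin(url, loc)  # Location may be relative
    print("Redirect loop or too many hops")

trace("http://www.example.com/some-old-page")

Every hop should print 301; a 302 anywhere in the chain, or a chain of several hops, is worth fixing.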