Showing Duplicate Content in Webmaster Tools.
-
About six weeks ago we completely redid our entire site. The developer put in 302 redirects, and Webmaster Tools began showing thousands of duplicate meta descriptions and titles. I had the redirects changed to 301s. For a few weeks the duplicate count slowly went down, but now it is right back where it started. Isn't the point of a 301 redirect to tell Google that content has permanently moved? Why is it not picking this up? I knew it would take some time, but after a month I am right back where I started.
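For reference, on an Apache server the difference between the two redirect types is a single status argument. A minimal .htaccess sketch (the paths and domain are hypothetical placeholders, not taken from the site above):

```apache
# Permanent move: Google transfers the old URL's signals to the new one
# and eventually drops the old URL from the index.
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Temporary move (what the developer originally used): Google keeps the
# old URL indexed, which is why the duplicates persisted.
# Redirect 302 /old-page.html http://www.example.com/new-page.html
```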
-
The site has been around for over 10 years, and we do not have any noindex tags. I am starting to see the count drop, but two months seems like a long time to start seeing changes.
-
It shouldn't be duplicate, but for some reason it is. We wrote entirely new tags and descriptions for everything. The count recently dropped a little more, so I hope we are moving in the right direction.
-
Yeah, 3,000 pages will take a while.
The title tags and meta descriptions are all new; does that mean they are different from the old pages? If so, they shouldn't be flagged as duplicates.
-
Hmm... I'm not totally sure why only half of your sitemap is being indexed. It's likely a mixture of factors, including (but not limited to) site age, noindex tags, Google not crawling deep enough, and a lack of inbound links to deep pages. From what I've seen, though, Google will eventually get to most or all of the pages in your sitemap and will swap out your older pages for the 301'd/canonicalized pages that you want showing in the SERPs. Take some of the other tips people are sharing here into account as well, because it may be a mix of our suggestions that ultimately works for you.
-
We have the sitemap set up okay, but for some reason only 50% of it is being indexed. Any ideas?
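One quick sanity check is to count the URLs your sitemap actually declares and compare that number with the indexed figure in Webmaster Tools; a mismatch sometimes starts in the file itself. A small standard-library sketch (the sample XML stands in for your real sitemap.xml):

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of URLs declared in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

# Placeholder sitemap; read your real file instead.
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/page-1</loc></url>
  <url><loc>http://www.example.com/page-2</loc></url>
</urlset>"""

if __name__ == "__main__":
    urls = sitemap_urls(SAMPLE)
    print(f"{len(urls)} URLs declared in sitemap")
```

If the declared count already disagrees with what you expect, the indexing gap is a sitemap-generation problem rather than a crawling one.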
-
Okay, in that case I wouldn't suggest manually fetching and submitting 3,000 links one by one, because that would be a complete waste of time. You could make sure your sitemap is up to date and then add it under Optimization > Sitemaps, or resubmit the current sitemap (in the hope that this prompts Google to re-crawl sooner), and/or fetch a few changed pages and submit them to the index as "URL and all linked pages".
Otherwise I'd say wait... SEO is a long-term job. It's only been six weeks since you redid your site, and less than that since you switched everything from 302s to 301s. Give it some more time and you'll see Google start showing the correct pages and removing the unnecessary duplicate-content warnings.
-
There are over 3,000 for both meta descriptions and titles, so removing them individually could take some time.
The title tags and meta descriptions are all new; we wrote completely new ones when we redid the site.
We checked, and they are all 301s rather than 302s. Originally they were 302s, but we changed that. The count started dropping steadily and was down to about 1,300, but all of a sudden it jumped back up to over 3,000, exactly where it was before. So I am back to square one.
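To rule out a regression (for example, a deploy quietly reverting some rules to 302), you can audit the first response code of each old URL in bulk instead of spot-checking. A rough standard-library sketch; the URL list is a hypothetical placeholder:

```python
import http.client
from urllib.parse import urlparse

def redirect_kind(status):
    """Classify an HTTP status code for a redirect audit."""
    if status == 301:
        return "permanent"             # what you want after a site move
    if status in (302, 303, 307):
        return "temporary"             # Google may keep the old URL indexed
    return "not a redirect"

def first_status(url):
    """Fetch a URL without following redirects; return the first status code."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection
                if parts.scheme == "https" else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    try:
        conn.request("HEAD", parts.path or "/")
        return conn.getresponse().status
    finally:
        conn.close()

if __name__ == "__main__":
    # Replace with the real list of pre-redesign URLs.
    old_urls = ["http://www.example.com/old-page-1",
                "http://www.example.com/old-page-2"]
    for url in old_urls:
        print(url, "->", redirect_kind(first_status(url)))
```

Anything reported as "temporary" or "not a redirect" is a candidate explanation for the duplicate count jumping back up.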
-
I have over 3,000 in both meta descriptions and titles, so that might take a while.
-
Hi,
I had a similar problem. I found the duplicate title tags through another tool, very similar to Google Webmaster Tools. Although you did 301 redirects to the new pages, which is the right thing to do, I believe the tools are still picking up the title tags and meta descriptions from the old pages. I don't know why either. There are two ways to approach this.
-
Go to your old pages and simply remove all the title tags and meta descriptions.
-
Go to your new pages and change the title tags and meta descriptions. Optimize them even more than what you had on your old pages: add secondary keywords to the title tag, or change up the meta description with a call to action.
You should also check to make sure that all of the redirects are 301s and not 302s. Furthermore, do you serve both www.yoursite.com and yoursite.com? Google treats these as two different sites, and that might be the reason.
Hope this helps.
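On that last point: if the site runs on Apache, one common way to force a single canonical hostname is a mod_rewrite rule in .htaccess. A sketch, assuming the www version is the one you want (the domain is a placeholder):

```apache
RewriteEngine On
# Send any non-www request to the www hostname with a permanent redirect,
# so Google sees one canonical host instead of two duplicate sites.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```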
-
Sometimes it can take a while for Google to show things correctly in the SERPs, and for a time you may wind up with duplication warnings because the older, cached page is still in the index alongside the new page that the old one redirects to. In cases like this I usually jump into Webmaster Tools, fetch the old page (which I know redirects to the new one), and submit a crawl request ("submit to index") to make sure Google sees it and indexes it correctly sooner than it would naturally. I'm not 100% certain this will fix your issue, but it can potentially help and can't do any harm in this instance.
-
They all render the same page. One URL goes to the page it is supposed to, and the other redirects to the correct page. This is the behavior I want.
-
First, check whether the multiple URLs displayed under each title are working and rendering different pages.
Sometimes a redirect has syntax issues. Post your results and we can discuss other options.