Showing Duplicate Content in Webmaster Tools
-
About six weeks ago we completely redid our site. The developer put in 302 redirects, and Webmaster Tools was showing thousands of duplicate meta descriptions and titles, so I had the redirects changed to 301s. For a few weeks the duplicate count slowly went down, but now it is right back where it started. Isn't the point of a 301 redirect to show Google that content has been permanently moved? Why is it not picking this up? I knew it would take some time, but after a month I am right back where I started.
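One quick way to confirm what the server is actually sending back (and that nothing slipped back to a 302) is to request the old URLs without following redirects. A rough Python sketch; the URL in the example is a placeholder:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so we see the raw 3xx status code."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def redirect_status(url):
    """Return the HTTP status the server sends for `url` (301, 302, 200, ...)."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url).status
    except urllib.error.HTTPError as e:
        # urllib surfaces the unfollowed redirect as an HTTPError
        return e.code

# Example with a placeholder URL:
# print(redirect_status("http://www.yoursite.com/old-page"))  # want 301, not 302
```

A 302 tells Google the move is temporary, so it keeps the old URL (and its title and description) in the index, which is exactly what produces lingering duplicate warnings.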
-
The site has been around for over 10 years, and we do not have any noindex tags. I am starting to see the count drop, but two months seems like a long time to start seeing changes.
-
It shouldn't be duplicate, but for some reason it is. We wrote entirely new tags and descriptions for everything. The count recently dropped a little more, so I hope we are moving in the right direction.
-
Yeah, 3,000 pages will take a while.
The title tags and meta descriptions are all new; doesn't that make them different from the old pages? If so, they shouldn't be flagged as duplicates.
-
Hmm... I'm not totally sure why only half of your sitemap is being indexed. It's likely a mixture of factors, including (but not limited to) site age, noindex tags, Google not crawling deep enough, and a lack of inbound links to deep pages. From what I've seen, though, Google will eventually get to most or all of the pages in your sitemap and will swap out your older pages for the 301'd/canonicalized pages you want showing in the SERPs. Take some of the other tips people are sharing here into account, because it may be a mix of our suggestions that ultimately works for you.
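If you want a quick sanity check on what the sitemap actually contains, you can count the URLs and look for obvious problems such as mixed www/non-www hostnames. A minimal sketch with a made-up, in-memory sitemap standing in for the real sitemap.xml:

```python
import xml.etree.ElementTree as ET

# Stand-in for your real sitemap.xml.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://example.com/products</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in ET.fromstring(sitemap_xml).findall(".//sm:loc", ns)]
print(len(urls), "URLs in sitemap")

# Mixed hostnames (www vs non-www) in one sitemap are a red flag:
hosts = {u.split("/")[2] for u in urls}
print("hosts:", hosts)
```

If the sitemap lists redirected or non-canonical URLs, Google may simply decline to index those entries, which can account for a big gap between submitted and indexed counts.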
-
We have the sitemap set up okay, but for some reason only 50% of it is being indexed. Any ideas?
-
Okay, in that case I wouldn't suggest manually fetching and submitting 3,000 links one by one, because that would be a complete waste of time. You could make sure your sitemap is up to date and then add it under Optimization > Sitemaps, or resubmit the current sitemap (in the hope that this leads to Google re-crawling sooner), and/or fetch some changed pages and submit them to the index as "URL and all linked pages".
Otherwise, I'd say wait. SEO is a long-term job. It's only been six weeks since you redid your site, and less time than that since you switched everything from 302s to 301s. Give it some more time and you'll see Google start showing the correct pages and removing the unnecessary duplicate content warnings.
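As a small aside, Google also accepts a plain HTTP ping with your sitemap URL, which you can script to run after each sitemap update. A sketch in Python; the sitemap location is a placeholder and the live request is left commented out:

```python
import urllib.parse
import urllib.request

sitemap = "http://www.example.com/sitemap.xml"  # placeholder location
ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap, safe="")
print(ping)
# urllib.request.urlopen(ping)  # uncomment to actually send the ping
```

This does the same thing as resubmitting in the Webmaster Tools UI: it invites a re-crawl sooner, but it doesn't guarantee one.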
-
There are over 3,000 warnings for both meta descriptions and titles, so removing them individually could take some time.
The title tags and meta descriptions are all new; we wrote completely new ones when we redid the site.
We checked, and they are all 301s instead of 302s. Originally they were 302s, but we changed that. The count started dropping steadily and was down to about 1,300, but all of a sudden it jumped back up to just over 3,000, exactly where I was before. So I am back to square one.
-
I have over 3,000 in both meta descriptions and titles, so that might take a while.
-
Hi,
I had a similar problem. I found the duplicate title tags through another tool, but one very similar to Google Webmaster Tools. Although you did 301 redirects to the new pages, and that's the right thing to do, I believe these tools are still picking up the title tags and meta descriptions from the old pages. I don't know why either. There are two ways to approach this:
-
Go to your old pages and simply remove all the title tags and meta descriptions.
-
Go to your new pages and change the title tags and meta descriptions. Optimize them even more than what you had on the old pages: add secondary keywords to the title tag, or change up the meta description with a call to action.
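If the titles and descriptions are generated from a template, that second option can be sketched like this; the field names, brand, and copy below are all made up:

```python
# Build a unique title and meta description per page from page data,
# so no two pages share the same tags. All values here are invented.
pages = [
    {"name": "Leather Hiking Boots", "category": "Footwear"},
    {"name": "Waterproof Day Pack", "category": "Bags"},
]

def tags(page):
    title = f"{page['name']} | {page['category']} | YourBrand"
    desc = f"Shop our {page['name'].lower()} range. Free shipping on orders over $50 - buy today!"
    return title, desc

for p in pages:
    title, desc = tags(p)
    print(title)
    print(desc)
```

Since the page name appears in both tags, every page gets distinct metadata as long as the names themselves are unique.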
You should also check to make sure that all of the redirects are 301s and not 302s. Furthermore, do you have both www.yoursite.com and yoursite.com? These are treated as two different sites and might be the cause.
Hope this helps.
-
Sometimes it can take a while for Google to show things correctly in the SERPs, and for a time you may get duplication warnings because the older, cached page is still in the index alongside the new page it redirects to. In cases like this I usually jump into Webmaster Tools, fetch the old page (which I know redirects to the new one), and do a crawl request/submit to index to ensure that Google sees it and indexes it correctly sooner than it might naturally. I'm not 100% certain this will fix your issue, but it can potentially help and can't do any harm in this instance.
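Before fetching and resubmitting, it's worth scripting a check that each old URL really answers with a 301 pointing at the expected new URL. A rough sketch using only the standard library; the URLs in the example are placeholders and the live call is commented out:

```python
import http.client
from urllib.parse import urlparse

def fetch_status(url):
    """HEAD request that does not follow redirects; returns (status, Location)."""
    parts = urlparse(url)
    conn = http.client.HTTPConnection(parts.netloc)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    result = (resp.status, resp.getheader("Location"))
    conn.close()
    return result

def is_good_301(status, location, expected_target):
    """True only for a permanent redirect pointing exactly where we want."""
    return status == 301 and location == expected_target

# Example with placeholder URLs:
# status, loc = fetch_status("http://www.yoursite.com/old-page")
# print(is_good_301(status, loc, "http://www.yoursite.com/new-page"))
```

Running this over the full list of old URLs catches both stray 302s and redirects that land on the wrong page before you ask Google to re-crawl anything.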
-
They all render the same page: one resolves directly to the page it is supposed to, and the other redirects to the correct page. This is the behavior I want.
-
First you need to check whether the multiple URLs displayed under each title are working and rendering different pages.
Sometimes a redirect can have syntax issues. Post your results and we can discuss other options.