Showing Duplicate Content in Webmaster Tools.
-
About 6 weeks ago we completely redid our entire site. The developer put in 302 redirects. We were showing thousands of duplicate meta descriptions and titles. I had the redirects changed to 301. For a few weeks the duplicates slowly went down and now they are right back to where they started. Isn't the point of 301 redirects to show Google that content has permanently been moved? Why is it not picking this up? I knew it would take some time but I am right where I started after a month.
-
The site has been around for over 10 years, and we do not have any noindex tags. I am starting to see the count drop, but two months seems like a long time to start seeing changes.
-
It shouldn't be duplicate, but for some reason it is. We did entirely new tags and descriptions for everything. It recently dropped a little more, so I hope we are moving in the right direction.
-
Yeah, 3,000 pages will take a while.
The title tags and meta descriptions are all new; does that mean they are different from the old pages? If they are, then they shouldn't be showing as duplicates.
-
Hmm... not totally sure why only half of your sitemap is being indexed. It's likely a mixture of a number of factors, including (but not limited to) site age, noindex tags, Google not crawling deep enough, lack of inbound links to deep pages, and so on. From what I've seen, though, Google will eventually get to all or most of the pages in your sitemap and will eventually swap out your older pages for the 301'd/canonicalized pages that you want showing in the SERPs. Take into account some of the other tips people are sharing here, because it may be a mix of our suggestions that ultimately works for you.
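If you want to rule out the technical basics quickly, here's a rough diagnostic sketch (the sitemap URL is just a placeholder, swap in your own) that lists any sitemap URLs that don't return a 200 or that look like they carry a noindex tag:

```python
# Rough diagnostic sketch: pull every URL out of the XML sitemap and flag
# anything that doesn't return 200 or that appears to carry a noindex tag.
# The sitemap URL below is a placeholder; swap in your own.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "http://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap_xml = requests.get(SITEMAP_URL, timeout=10).text
urls = [loc.text for loc in ET.fromstring(sitemap_xml).findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs found in the sitemap")

for url in urls:
    resp = requests.get(url, timeout=10)
    html = resp.text.lower()
    # Crude noindex check; it just flags pages worth looking at by hand.
    noindex = "noindex" in html and "robots" in html
    if resp.status_code != 200 or noindex:
        print(f"{resp.status_code}  possible noindex: {noindex}  {url}")
```

It won't tell you why Google is skipping pages, but it quickly rules out dead URLs and stray noindex tags before you dig into crawl depth and internal linking.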
-
We have the sitemap set up OK. For some reason only 50% of my sitemap is being indexed. Any ideas?
-
Okay, in that case I wouldn't suggest manually fetching and submitting 3,000 links one by one, because that would be a complete waste of time. You could make sure your sitemap is up to date and then add it under Optimization > Sitemaps, or resubmit the current sitemap (in the hope that it leads to Google re-crawling sooner), and/or fetch some changed pages and submit them to the index as "URL and all linked pages".
Otherwise, I'd say wait. SEO is a long-term job. It's only been 6 weeks since you redid your site and less than that since you switched everything over from 302s to 301s. Give it some more time and you'll see Google start showing the correct pages and dropping the unnecessary duplicate content warnings.
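One more thought: if you'd rather not click through Webmaster Tools every time the sitemap changes, Google also accepts a plain HTTP ping asking it to re-fetch a sitemap. A minimal sketch (the sitemap URL is a placeholder):

```python
# Minimal sketch: ping Google to re-fetch an updated sitemap.
# This only invites a re-crawl of the sitemap file; it doesn't force indexing.
import urllib.parse
import requests

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # placeholder, use your own
ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(SITEMAP_URL)
resp = requests.get(ping_url, timeout=10)
print(resp.status_code)  # 200 means the ping was received
```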
-
There are over 3,000 for both meta descriptions and titles, so removing them individually could take some time.
The title tags and meta descriptions are all new. We created completely new ones when we redid the site.
We checked and they are all 301s instead of 302s. Originally they were 302s, but we changed that. The count started dropping steadily and was down to about 1,300, but all of a sudden it jumped back up to 3,000-some, exactly what I was at before. So I am back to square one.
-
I have over 3,000 in both meta descriptions and titles, so that might take a while.
-
Hi,
I had a similar problem, but I found the duplicate title tags through another tool that is very similar to Google Webmaster Tools. You did 301 redirects to the new pages, and that's the right thing to do, but from what that tool showed me, I believe they are still picking up the title tags and meta descriptions from the old pages. I also don't know why. There are two ways to approach this.
-
Go to your old pages and simply remove all the title tags and meta descriptions
-
Go to your new pages and change the title tags and meta descriptions. Optimize them even more than what you had on your old pages. Add secondary keywords to the title tag, or change up the meta description with a CTA.
Maybe you should also check to make sure that all of them are 301 redirects and not 302s. Furthermore, do you have both www.yoursite.com and yoursite.com? These are considered two different sites and might be the reason.
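If you want to verify both points quickly rather than clicking through pages, a rough script along these lines will do it; the URLs and hostnames below are placeholders, so swap in a sample of your own old URLs:

```python
# Rough sketch: confirm old URLs answer with a 301 (not a 302), and that only
# one hostname variant (www or non-www) serves the site directly.
# All URLs below are placeholders; swap in your own.
import requests

old_urls = [
    "http://www.example.com/old-page-1",
    "http://www.example.com/old-page-2",
]
host_variants = ["http://example.com/", "http://www.example.com/"]

print("Old URL redirects:")
for url in old_urls:
    # allow_redirects=False so we see the first hop's status code
    resp = requests.get(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "(no Location header)")
    flag = "OK" if resp.status_code == 301 else "CHECK"
    print(f"{flag}  {resp.status_code}  {url} -> {target}")

print("www vs non-www:")
for url in host_variants:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "(serves the page directly)")
    print(f"{resp.status_code}  {url} -> {target}")

# Healthy setup: every old URL returns a 301, and exactly one hostname variant
# returns 200 while the other 301s to it.
```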
Hope this helps.
-
Sometimes it can take a while for Google to show things correctly in the SERPs, and for a time you may wind up with duplication warnings because the older, cached page is still showing in the index along with the new page that the older one redirects to. In cases like this I usually jump into Webmaster Tools, fetch the old page (which I know redirects to the new one), and do a crawl request/submit to index to ensure that Google sees it and indexes it correctly sooner than it might have naturally. Now, I'm not 100% certain that it will fix your issue, but I can say that it can potentially help and can't do any harm in this instance.
-
They all render the same page. One goes to the page it is supposed to, and the other redirects to the correct page. This is the behavior I want.
-
First you need to check whether the multiple URLs displayed under each title are actually working and rendering different pages.
Sometimes a redirect can have syntax issues. Post your results and we can discuss other options.
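A rough way to run that check is to follow each URL listed under a duplicate warning and compare where it ends up and what title it renders; the two URLs below are placeholders, so paste in a pair from your report:

```python
# Rough spot-check for a pair of URLs flagged as duplicates: follow any
# redirects and compare the final URL and the <title> each one renders.
# Both URLs are placeholders; use a pair from the duplicate report.
import re
import requests

pair = ("http://www.example.com/old-url", "http://www.example.com/new-url")

for url in pair:
    resp = requests.get(url, timeout=10)  # follows redirects by default
    match = re.search(r"<title>(.*?)</title>", resp.text, re.I | re.S)
    title = match.group(1).strip() if match else "(no title found)"
    print(f"{url}\n  final URL: {resp.url}\n  title:     {title}")

# If both land on the same final URL with the same title, the redirect itself
# is fine and the warning should clear once Google drops the old URL.
```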
Related Questions
-
SEO for video content that is duplicated across a larger network
I have a website with lots of content (high-quality video clips for a particular niche). All the content gets fed out to 100+ other sites on various domains/subdomains which are reskinned for a given city, so the content on these other sites is 100% duplicate. I still want to generate SEO traffic, though. So my thought is that we: a) need to have canonical tags on all the other domains/subdomains that point back to the original post on the main site, and b) probably need to disallow search engine crawlers on all the other domains/subdomains. Is this on the right track? Am I missing anything important related to duplicate content? The idea is that after we get search engines crawling the content correctly, we'd then use the IP address to redirect the visitor to the best-suited domain/subdomain. Any thoughts on that approach? Thanks for your help!
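For what it's worth, once the canonical tags are in place across the network, a rough spot-check along these lines (the domains and sample path are placeholders) can confirm that each mirrored page points back at the original post on the main site:

```python
# Rough sketch: verify each reskinned domain's copy of a post carries a
# rel=canonical pointing back at the original URL on the main site.
# The domains and sample path are placeholders.
import re
import requests

MAIN_SITE = "http://www.example.com"
mirror_domains = ["http://city1.example.net", "http://city2.example.net"]
sample_path = "/videos/sample-clip"

# Crude regex; assumes rel appears before href inside the link tag.
CANONICAL_RE = re.compile(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', re.I)

for domain in mirror_domains:
    resp = requests.get(domain + sample_path, timeout=10)
    match = CANONICAL_RE.search(resp.text)
    canonical = match.group(1) if match else "(no canonical found)"
    status = "OK" if canonical.startswith(MAIN_SITE) else "CHECK"
    print(f"{status}  {domain + sample_path} -> canonical: {canonical}")
```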
Intermediate & Advanced SEO
-
Duplicate content hidden behind tabs
Just looking at an ecommerce website, and they've hidden their product pages' duplicate content behind tabs on the product pages (not on purpose, I might add). Is this a legitimate way to hide duplicate content, now that Google has lowered the importance and crawlability of content hidden behind tabs? Is it a legitimate tactic for tackling duplicate content? Your thoughts would be welcome. Thanks, Luke
Intermediate & Advanced SEO
-
Lots of duplicate content and still traffic is increasing... how does it work?
Hello Mozzers, I have a dilemma with a client's site I am working on that is making me question my SEO knowledge, or the way Google treats duplicate content. I'll explain. The situation is the following: organic traffic has been constantly increasing since last September, in every section of the site (home page, categories and product pages), even though: they have tons of duplicate content from the same content living at old and new URLs (which are in two different languages, even if the actual content on the page is in the same language in both URL versions); indexation is completely left to Google's decision (no robots file, no sitemap, no meta robots in the code, no use of canonical, no redirect applied to any of the old URLs, etc.); there are a lot (really, a lot) of URLs with query parameters (which leads to more duplicated content) linked from the inner pages of the site (and indexed in some cases); and they have Analytics but don't use Webmaster Tools. Now... they expect me to help them increase the traffic they're getting even more, and I'll go first for "regular" on-page optimization, as their titles, meta descriptions and headers are not optimized at all for the page content. After that I was thinking of fixing the issues with indexation and content duplication, but I am worried I could "break the toy", as things are going well for them. Should I be confident that fixing these issues will bring even better results, or do you think it is better for me to focus on other kinds of improvements? Thanks for your help!
Intermediate & Advanced SEO
-
International SEO - cannibalisation and duplicate content
Hello all, I look after (in house) three domains for one niche travel business across three TLDs (.com, .com.au and .co.uk) and a fourth domain on a .co.nz TLD which was recently removed from Google's index.
Symptoms: For the past 12 months we have been experiencing cannibalisation in the SERPs (namely .com.au being rendered in .com) and Panda-related ranking devaluations between our .com site and our .com.au site. Around 12 months ago the .com TLD was hit hard (an 80% drop in target KWs) by Panda (probably) and we began to action the changes below. Around 6 weeks ago our .com TLD saw big overnight increases in rankings (to date a 70% averaged increase). However, by almost the same percentage the .com TLD gained, we suffered significant drops in our .com.au rankings. Basically Google seemed to switch its attention from the .com TLD to the .com.au TLD. Note: each TLD is over 6 years old, we've never proactively gone after links (Penguin) and we have always aimed for quality in an often spammy industry.
Have done:
Added hreflang markup to all pages on all domains
Each TLD uses local vernacular (e.g. for the .com site it is American)
Each TLD has pricing in the regional currency
Each TLD has details of the respective local offices, the copy references the location, and we have significant press coverage in each country (The Guardian for our .co.uk site, the Sydney Morning Herald for our Australian site)
Targeted each site to its respective market in WMT
Each TLD's core pages (within 3 clicks of the primary nav) are 100% unique
We're continuing to rewrite and publish unique content to each TLD on a weekly basis
As the .co.nz site drove so little traffic, rather than rewriting it we added noindex, and that TLD has almost completely disappeared (16% of pages remain) from the SERPs
XML sitemaps
Google+ profile for each TLD
Have not done:
Hosted each TLD on a local server
Dealt with the roughly 600 pages per TLD that are duplicated across all TLDs (roughly 50% of all content); these are way down the IA but still duplicated
Images/video sourced from local servers
Added address and contact details using schema markup
Any help, advice or just validation on this subject would be appreciated! Kian
Intermediate & Advanced SEO
-
Does duplicate content penalize the whole site or just the pages affected?
I am trying to assess the impact of duplicate content on our e-commerce site, and I need to know if the duplicate content affects only the pages that contain it or whether it affects the whole site. In Google, that is. But of course. Lol
Intermediate & Advanced SEO
-
Duplicate content across international URLs
We have a large site with 1,000+ pages of content to launch in the UK. Much of this content is already being used on a .nz URL which is going to stay. Do you see this as an issue, or do you think Google will take localisation factors into consideration? We could add a link from the NZ pages to the UK ones. We can't noindex the pages, as this is not an option. Thanks
Intermediate & Advanced SEO
-
Duplicate content on the same page--is this an issue?
We are transitioning to responsive design and some of our pages will not scale properly, so we were thinking of adding the same content twice to the same URL (one version would be simple text for mobile, and the other would include the images, etc., for the desktop version), with the content shown changing based on the size of the screen. I'm not looking for another technical solution (I know Google specifies that you can dynamically serve different content based on user agent); I am wondering if anyone knows whether having the same exact content appear twice on the same URL will cause a problem with SEO (any historical tests or experience would be great). Thank you in advance.
Intermediate & Advanced SEO
-
Duplicate content even with 301 redirects
I know this isn't a developer forum but I figure someone will know the answer to this. My site is http://www.stadriemblems.com and I have a 301 redirect in my .htaccess file to redirect all non-www to www and it works great. But SEOmoz seems to think this doesn't apply to my blog, which is located at http://www.stadriemblems.com/blog It doesn't seem to make sense that I'd need to place code in every .htaccess file of every sub-folder. If I do, what code can I use? The weirdest part about this is that the redirecting works just fine; it's just SEOmoz's crawler that doesn't seem to be with the program here. Does this happen to you?
Intermediate & Advanced SEO