Showing Duplicate Content in Webmaster Tools.
-
About six weeks ago we completely redid our site. The developer put in 302 redirects, and Webmaster Tools was showing thousands of duplicate meta descriptions and titles, so I had the redirects changed to 301s. For a few weeks the duplicate count slowly went down, but now it is right back where it started. Isn't the point of a 301 redirect to tell Google that content has permanently moved? Why is it not picking this up? I knew it would take some time, but after a month I am right back where I started.
-
The site has been around for over 10 years, and we do not have any noindex tags. I am starting to see the count drop, but two months seems like a long time to start seeing changes.
-
It shouldn't be duplicate, but for some reason it is. We wrote entirely new tags and descriptions for everything. The count recently dropped a little more, so I hope we are moving in the right direction.
-
Yeah, 3,000 pages will take a while.
The title tags and meta descriptions are all new. Does that make them different from the old pages? If it does, then they shouldn't be flagged as duplicates.
-
Hmm... I'm not totally sure why only half of your sitemap is being indexed. Likely it's a mixture of factors, including (but not limited to) site age, noindex tags, Google not crawling deep enough, lack of inbound links to deep pages, etc. From what I've seen, though, Google will eventually get to all or most of the pages in your sitemap and will eventually swap out your older pages for the 301'd/canonicalized pages that you want showing in the SERPs. Take into account some of the other tips people are sharing here, because it may be a mix of our suggestions that ultimately works for you.
-
We have the sitemap set up OK. For some reason only 50% of my sitemap is being indexed. Any ideas?
-
Okay, in that case I wouldn't suggest manually fetching and submitting 3,000 links one by one, because that would be a complete waste of time. You could make sure your sitemap is up to date and then add it under Optimization > Sitemaps, or resubmit the current sitemap (in the hope that this leads Google to re-crawl sooner), and/or fetch some changed pages and submit them to the index as "URL and all linked pages".
Otherwise I'd say wait... SEO is a long-term job. It's only been six weeks since you redid your site, and less than that since you switched everything over from 302s to 301s. Give it some more time and you'll see Google start showing the correct pages and removing any unnecessary duplicate content warnings.
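If you do resubmit, here is a sketch of what an updated sitemap entry might look like (the domain, path, and date are placeholders, not from the original thread); bumping `<lastmod>` on changed URLs gives Google a hint to re-crawl them sooner:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- list the NEW canonical URL, never the old 301'd one -->
    <loc>http://www.yoursite.com/new-page</loc>
    <!-- update this date whenever the page changes -->
    <lastmod>2013-02-01</lastmod>
  </url>
</urlset>
```

Keeping only the final destination URLs in the sitemap (no URLs that redirect) also helps Google settle on the pages you actually want indexed.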
-
There are over 3,000 of both duplicate meta descriptions and titles, so removing them individually could take some time.
The title tags and meta descriptions are all new; we wrote completely new ones when we redid the site.
We checked, and they are all 301s instead of 302s. Originally they were 302s, but we changed that. The count started dropping steadily and was down to about 1,300, but all of a sudden it jumped back up to over 3,000, exactly where I was before. So I am back to square one.
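If you want to double-check what your redirects actually return, one way is to request an old URL and read the raw status code without following the redirect. Here is a self-contained sketch: the throwaway local server and the /old-page, /temp-page paths are made-up stand-ins for your site, so in practice you would point the check at your real old URLs.

```python
# Sketch: read a redirect's raw status code (301 vs 302) without following it.
# The local server below only exists to make the example runnable on its own.
import http.client
import http.server
import threading

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        # Pretend /old-page was moved permanently and anything else temporarily.
        self.send_response(301 if self.path == "/old-page" else 302)
        self.send_header("Location", "/new-page")
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

def redirect_status(host, port, path):
    """Return the status code of a HEAD request, without following redirects.

    http.client never follows redirects, so we see the real 301/302 answer.
    """
    conn = http.client.HTTPConnection(host, port)
    conn.request("HEAD", path)
    status = conn.getresponse().status
    conn.close()
    return status

# Start the stand-in server on a random free port (daemon thread exits with us).
server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

print(redirect_status("127.0.0.1", server.server_port, "/old-page"))   # 301: permanent, what you want
print(redirect_status("127.0.0.1", server.server_port, "/temp-page"))  # 302: temporary, keeps the old URL indexed
```

A 302 tells Google the old URL is still the canonical one, which is exactly what keeps the old titles and descriptions in the duplicate reports.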
-
I have over 3,000 in both meta descriptions and titles, so that might take a while.
-
Hi,
I had a similar problem, but I found the duplicate title tags through another tool very similar to Google Webmaster Tools. You did 301 redirects to the new pages, and that's the right thing to do, but judging from that tool, it is still picking up the title tags and meta descriptions from the old pages. I don't know why either. There are two ways to approach this:
-
Go to your old pages and simply remove all the title tags and meta descriptions
-
Go to your new pages and change the title tags and meta descriptions. Optimize them even more than what you had on your old pages: add secondary keywords to the title tag, or change up the meta description with a call to action.
You should also check to make sure that all of the redirects are 301s and not 302s. Furthermore, do you have both www.yoursite.com and yoursite.com? Google can treat these as two different sites, which might be the reason.
Hope this helps.
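On the www vs non-www point: if the site runs on Apache with mod_rewrite, a host-level 301 in .htaccess collapses both hostnames into one. A hedged sketch, assuming Apache and using yoursite.com as a placeholder for the real domain:

```apache
# Hypothetical .htaccess sketch: 301 every non-www request to the www host,
# so yoursite.com and www.yoursite.com stop counting as two different sites.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

Whichever host you pick, also set the preferred domain in Webmaster Tools so it matches the redirect.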
-
Sometimes it can take a while for Google to show things correctly in the SERPs, and for a time you may wind up with duplication warnings because the older, cached page is still showing in the index along with the new page that it redirects to. In cases like this I usually jump into Webmaster Tools, fetch the old page (which I know redirects to the new one), and do a crawl request/submit to index to make sure Google sees it and indexes it correctly sooner than it might naturally. Now, I'm not 100% certain that this will fix your issue, but it can potentially help and can't do any harm in this instance.
-
They all render the same page: one goes to the page it is supposed to, and the other redirects to the correct page. This is the behavior I want.
-
First you need to check whether the multiple URLs displayed under each title are working and rendering different pages.
Sometimes a redirect can have syntax issues. Post your results and we can discuss other options.