Help removing duplicate content from the index?
-
Last week, after a significant drop in traffic, I noticed a subdomain in the index with content duplicating the main site. The main site and subdomain can be found below.
I've 301'd everything on the subdomain to the appropriate location on the main site. The problem is that site: searches show the subdomain content is being deindexed very slowly, if at all. Traffic is still down about 50% over the last week or so. What's the best way to tackle this issue moving forward?
-
Thanks for the quick response... I'll give this a try...
-
Go to WMT and request removal of subdomain.domain.com.
This will remove the whole subdomain. When you enter the root URL (the root of the subdomain, that is), you will see an option to remove the entire site.
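As an aside, the blanket subdomain-to-main-site 301 the asker mentions is typically done at the web server. A minimal sketch, assuming Apache with mod_rewrite and a one-to-one path mapping between subdomain and main-site URLs (the hostnames are placeholders, not the actual domains):

```apache
# .htaccess at the subdomain's document root -- sketch only.
# Assumes subdomain.example.com/some/page maps 1:1 to
# www.example.com/some/page; adjust the rule if it doesn't.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^subdomain\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

A host-wide 301 like this, combined with the removal request in WMT, lets the subdomain drop out of the index while still passing along any link equity its pages had.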
Related Questions
-
Decline in traffic and duplicate content in different domains
Hi, 6 months ago my customer purchased their US supplier and moved the supplier's website to their e-commerce platform. When moving to the new platform they copied the product descriptions from their site to the supplier's site, so now both sites have the same content on the product pages. Since then they have seen traffic decrease by about 80%. They didn't implement canonical tags or hreflang. My customer's domain format is https://www.xxx.biz and the supplier's domain is https://www.zzz.com. The latter targets the US, and when someone outside the US wants to purchase a product, they get a message that they need to move to the first website, www.xxx.biz. Both sites are in English. The old version of www.zzz.com, before the shift to the new platform, contained different product descriptions, and by the way, that old version is still live and indexed under a subdomain of www.zzz.com. My question is: what's the best thing to do in this case so that the rankings recover and they get their traffic back? Thanks!
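One commonly suggested direction for a case like this is to mark up each duplicated product page so search engines know which URL serves which audience: a self-referencing canonical plus hreflang annotations linking the US and rest-of-world alternates. This is a sketch only; the URLs are placeholders standing in for the anonymized domains in the question.

```html
<!-- Hypothetical markup for a product page on the US site (www.zzz.com).
     The mirrored page on www.xxx.biz would carry the same hreflang set
     with its own self-referencing canonical. -->
<link rel="canonical" href="https://www.zzz.com/products/widget" />
<link rel="alternate" hreflang="en-us" href="https://www.zzz.com/products/widget" />
<link rel="alternate" hreflang="en" href="https://www.xxx.biz/products/widget" />
<link rel="alternate" hreflang="x-default" href="https://www.xxx.biz/products/widget" />
```

hreflang tells Google the two pages are deliberate regional alternates rather than accidental duplicates, which is the usual remedy when two same-language sites intentionally share content for different markets.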
Technical SEO | digital19740
Another Duplicate Content - eCommerce Question!
We are manufacturers of about 15 products and our website provides information about the products. We also offer them for sale on the site. Recently we partnered with a large eCommerce site that sells many of these types of products. They lifted descriptions from our site for theirs and are now selling our products. They have higher DA than us. Will this cause a ranking problem for us? Should we write unique descriptions for them? Thanks!
Technical SEO | Chris6610
I really need some help with Magento and the Duplicate Page Content results I'm getting
Hi, We use Magento for our eCommerce platform and I'm getting a number of duplicate page content results, mainly duplicate page content errors for our category pages. Firstly, it seems like the product type and filter options highlighted in the picture are causing duplicate page content. Also, one particular category is getting a lot of duplicate page content errors: http://www.tidy-books.co.uk/shop-all-products I understand that this category page duplicates other category pages, so I set it to be excluded from the sitemap, but it looks like it's still being picked up. I've attached the CSV file showing these errors as well. Any help would be massively appreciated. Thanks. Attachments: filter.png, moz-tidy-books-uk-crawl_issues-01-OCT-2014.csv
Technical SEO | tidybooks0
Duplicate Content Problem!
Hi folks, I have a quite awkward problem. For a few weeks I've been getting a huge number of "duplicate content errors" in my Moz crawl reports. After a while of looking for the cause, I thought of the domains I've bought additionally. So I went to Google and typed in site:myotherdomains.com. The result was, as I expected, that my original website got indexed under my new domains as well. For example, my original website was indexed with www.domain.com/aboutus; then I bought some additional domains which point at my / folder. What happened is that I also get listed with: www.mynewdomains.com/com How can I fix that? I tried a normal domain redirect but it doesn't seem to help: when I visit www.mynewdomains.com, the domain doesn't change in my browser to www.myoriginaldomain.com but stays as it is. I was busy the whole day trying to find a solution and I'm kinda desperate now. If somebody could give me advice it would be much appreciated. Mike
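What the poster describes (the browser's address bar not changing) sounds like domain aliasing, where both domains serve the same files, rather than a true redirect. A hedged sketch of a host-level 301 in Apache that would fix this, using the placeholder domain names from the question (and assuming mod_rewrite is available):

```apache
# .htaccess in the shared document root -- sketch only.
# 301 any host other than the canonical one to www.myoriginaldomain.com,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.myoriginaldomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.myoriginaldomain.com/$1 [R=301,L]
```

With a real 301 in place, the address bar changes on each visit and the duplicate URLs on the extra domains should drop out of the index over time.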
Technical SEO | KillAccountPlease0
How different does content need to be to avoid a duplicate content penalty?
I'm implementing landing pages that are optimized for specific keywords. Some of them are substantially the same as another page (perhaps 10-15 words different). Are the landing pages likely to be identified by search engines as duplicate content? How different do two pages need to be to avoid the duplicate penalty?
Technical SEO | WayneBlankenbeckler0
What could be the cause of this duplicate content error?
I only have one index.htm and I'm seeing a duplicate content error. What could be causing this? (Screenshot attached: IUJvfZE.png)
Technical SEO | ScottMcPherson1
Large Scale Ecommerce. How To Deal With Duplicate Content
Hi, One of our clients has a store with over 30,000 indexed pages but fewer than 10,000 individual products and maybe a few hundred static pages. I've crawled the site in Xenu (it took 12 hours!) and found it to be a complex mess, caused by years of hacked-on additions, with duplicate pages and weird dynamic parameters being indexed. The inbound link structure is spread across duplicate pages, PDFs, and images, so I need to be careful to treat everything correctly. I can likely identify and segment blocks of thousands of URLs and parameters which need to be blocked; I'm just not entirely sure of the best method. Dynamic parameters: I can see the option in GWT to block these. Is it that simple? (Do I need to ensure they are deindexed and 301'd?) Duplicate pages: would the best approach be to mass-301 these pages, then apply a noindex tag and wait for them to be crawled? Thanks for your help.
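For the dynamic-parameter question, one common complement to the GWT parameter settings is blocking crawl of the parameterized URLs in robots.txt. A sketch only; the parameter names here are hypothetical stand-ins for whatever the crawl actually surfaced:

```text
# robots.txt -- assuming the duplicates are distinguished by
# query parameters such as ?sort= and ?sessionid= (placeholders)
User-agent: *
Disallow: /*?sort=
Disallow: /*?sessionid=
```

One caveat worth weighing: a robots.txt block stops Googlebot from recrawling those URLs, so any 301 or noindex placed on them would go unseen. Pick one mechanism per URL set: let the redirects be crawled until the duplicates drop out, or block crawling, but not both on the same URLs at once.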
Technical SEO | LukeyJamo0
Duplicate content and tags
Hi, I have a blog on Posterous that I'm trying to rank. SEOmoz tells me I have duplicate content pretty much everywhere (4 articles written, 6 errors at the last crawl). The problem is that I tag my posts, and apparently SEOmoz flags duplicate content only because I don't have many posts yet, so tag pages end up being very, very similar. What can I do in this situation?
Technical SEO | ngw0