Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Duplicate content due to parked domains
-
I have a main ecommerce website with unique content and decent backlinks. I also had a few domains parked on the main website, as well as on specific product pages. These domains had some type-in traffic; some were exact product names. So the main website www.maindomain.com had domain1.com and domain2.com parked on it, and domain3.com parked on www.maindomain.com/product1. This caused a lot of duplicate content issues.
12 months ago, all the parked domains were changed to 301 redirects. I also added all the domains to Google Webmaster Tools, then removed the main directory from Google's index. I now realize a few of the additional domains are still indexed and causing duplicate content. My question is: what other steps can I take to avoid duplicate content for my website?
1. Provide a change of address in Google Search Console. Is there any downside to providing a change of address pointing to a website? Also, for domains pointing to a specific URL, I cannot provide a change of address.
2. Submit a "remove page from Google index" request in Google Search Console. This is temporary and lasts about 6 months. Even if the pages are removed from Google's index, would Google still see them as duplicates?
3. Ask Google to fetch each URL under the other domains and submit them to the Google index. This would hopefully remove the URLs under domain1.com and domain2.com eventually, due to the 301 redirects.
4. Add canonical URLs for all pages on the main site, so Google will eventually remove the content from domain1.com and domain2.com due to the canonical links. This will take time for Google to update its index.
5. Point these domains elsewhere to remove the duplicate content eventually. But it will take time for Google to update its index with the new, non-duplicate content.
Which of these options are best suited to my issue, and which ones are potentially dangerous? I would rather not point these domains elsewhere.
Any feedback would be greatly appreciated.
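On option 4, for what it's worth, a canonical tag is a single line in the head of each page on the main site. A sketch, using the placeholder domains from the question (URL scheme assumed):

```html
<!-- In the <head> of each main-site page, e.g. www.maindomain.com/product1 -->
<!-- Points Google at the authoritative URL if the same HTML is ever -->
<!-- served under another hostname (as happens with domain masking). -->
<link rel="canonical" href="https://www.maindomain.com/product1" />
```

Note that a canonical tag only helps while the duplicate domains actually serve the page's HTML; once a domain is a true 301, the redirect itself is the stronger, more direct signal.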
-
Oh, wow - if you're talking a couple of years ago and major ranking drops, then definitely get aggressive. Remove as many as possible and Robots No-index them. If you've got the Robots.txt directives in place, Google shouldn't put them back (although, from past experience, I realize "shouldn't" isn't a guarantee). If you're down 90%, you've got very little to lose and clearly Google didn't like something about that set-up.
Unfortunately, that's about the most drastic reasonable option. The next step would be to start over with a fresh domain and kill all of the old domains. That could be a lot more hazardous, though.
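For reference, the aggressive setup described above is a two-part move: a removal request in Webmaster Tools/Search Console plus a robots.txt on each duplicate domain. A minimal sketch of that robots.txt (domain name is a placeholder):

```
# robots.txt served at domain1.com/robots.txt
# Blocks all crawlers from the entire duplicate domain.
User-agent: *
Disallow: /
```

One caveat: a domain that blanket-301s every request can't serve its own robots.txt, so you'd need to exempt /robots.txt from the redirect for this to work.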
-
Thank you Dr. Peter.
A couple of years ago my search engine positions tanked by around 90% and have not recovered yet. At the time I assumed it was due to the duplicate content on these domains, as they were parked (not 301s, just domain masking) at that point. To avoid that duplicate content problem I moved to 301 redirection. None of these domains have any link juice to speak of. Some domains have some type-in traffic; I was just trying to capture that rather than link juice.
I did de-index most of the domains via Webmaster Tools in the past, but Google put them back after 90 days or so. The 301 redirection in place did not help that much.
If Google thinks there is a chance of abuse in 301ing new domains, I will start removing the new domains completely and point them elsewhere so that Google can see some new content.
Thank you,
Aji Abraham
-
Ugh... 75 is a chunk. The problem is that Google isn't a huge fan of 301-redirecting a bunch of new domains, because it's been too often abused in the past by people buying up domains with history and trying to consolidate PageRank. So, it's possible that (1) they're suspicious of these domains, or (2) they're just not crawling/caching them in a timely manner, since they used to be parked.
Personally, unless there's any link value at all to these, I'd consider completely de-indexing the duplicate domains - at this point that probably does mean removal in Google Search Console and adding Robots.txt (which might be a prerequisite of removal, but I can't recall).
Otherwise, your only real option is just to give the 301-redirects time. It may be a non-issue, and Google is just taking its time. Ultimately, the question is whether these are somehow harming the parent site. If Google is just indexing a few pages but you're not being harmed, I might leave it alone and let the 301s do their work over time. I checked some headers, and they seem to be set up properly.
If you're seeing harm or the wrong domains being returned in search, and if no one is linking to those other domains, then I'd probably be more aggressive and go for all-out removal.
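If anyone wants to repeat that header check across all 75 domains, here's a rough Python sketch (stdlib only; the domain names at the bottom are placeholders). It inspects the first hop of each domain without following redirects and flags anything that isn't a clean 301 to the main site:

```python
# Rough sketch: verify that parked domains issue a single 301 hop
# to the main site. Domain names below are placeholders.
import urllib.error
import urllib.request

def describe_redirect(status, location, expected_prefix):
    """Return 'ok' for a 301 pointing at expected_prefix,
    otherwise a short reason string."""
    if status != 301:
        return "not a 301 (got %s)" % status
    if not (location or "").startswith(expected_prefix):
        return "301 to unexpected target: %s" % location
    return "ok"

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None stops urllib from following the redirect,
    # so we can inspect the first hop's status and Location header.
    def redirect_request(self, *args, **kwargs):
        return None

def check(domain, expected_prefix):
    opener = urllib.request.build_opener(_NoRedirect)
    try:
        resp = opener.open("http://%s/" % domain, timeout=10)
        status, location = resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as err:  # 3xx raises once we refuse to follow
        status, location = err.code, err.headers.get("Location")
    print("%s -> %s" % (domain, describe_redirect(status, location, expected_prefix)))

# Example run (requires network access):
# for d in ["domain1.com", "domain2.com"]:
#     check(d, "http://www.maindomain.com")
```

The pure `describe_redirect` helper keeps the pass/fail logic separate from the network call, so it's easy to sanity-check offline.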
-
Hello Dr. Peter,
Thank you for helping out.
There are around 75 or so domains pointing to the main website. When they were parked on the main site (prior to November 2014), they were added as additional domains, which were URL-masked. So at least 30 domains were indexed in Google with the same content as the main site.
12 months ago, I realized the duplicate content error and changed the domain parking to 301 redirects. I also used the 'Remove URL' functionality in Google Webmaster Tools. Even after 12 months, I noticed a number of domains still had duplicate content in Google's index.
So I removed the pages from the addon domains again using Google Webmaster Tools. To give you an idea, my main site with the original content/links is iscripts.com, and an addon domain, socialappster.com, is pointed to a product page at iscripts.com/socialware. If you do a site:socialappster.com search in Google, you find a few pages in the index, even though the 301 redirect has been in place for more than 12 months now. There is a similar issue with other domains pointing to product pages as well as the whole site.
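For context, a product-page redirect like the socialappster.com one would typically look something like this in Apache's .htaccess (a sketch only, assuming mod_rewrite is available and that every path on the parked domain should collapse to the one product page):

```
# .htaccess on the server answering for socialappster.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?socialappster\.com$ [NC]
# Send every request on the parked domain to the product page, permanently
RewriteRule ^ http://iscripts.com/socialware [R=301,L]
```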
Appreciate any direction you can provide to clean this mess.
Thanks
Aji Abraham
-
Oh, and how many domains are we talking (ballpark)?
-
What was happening when they were parked - were they 302-redirected, or was it some kind of straight CNAME situation where, theoretically, Google shouldn't even have seen the parked domains? The trick, of course, is that Google is a registrar, so they can see a lot that isn't necessarily public or crawlable.
Did the additional domains get indexed while parked, or after you went to 301-redirects?