Duplicate content due to parked domains
-
I have a main ecommerce website with unique content and decent backlinks. I had a few domains parked on the main website as well as on specific product pages. These domains had some type-in traffic; some were exact product names. So the main website, www.maindomain.com, had domain1.com and domain2.com parked on it, and domain3.com parked on www.maindomain.com/product1. This caused a lot of duplicate content issues.
Twelve months back, all the parked domains were changed to 301 redirects. I also added all the domains to Google Webmaster Tools, then removed the main directory from the Google index. I now realize a few of the additional domains are still indexed and causing duplicate content. My question is: what other steps can I take to avoid duplicate content for my website?
1. Provide a change of address in Google Search Console. Is there any downside to providing a change of address pointing to a website? Also, for domains pointing to a specific URL, I cannot provide a change of address.
2. Submit a remove-page-from-index request in Google Search Console. It is temporary and lasts six months. Even if the pages are removed from the Google index, would Google still see them as duplicates?
3. Ask Google to fetch each URL under the other domains and submit it to the index. This would hopefully remove the URLs under domain1.com and domain2.com eventually, due to the 301 redirects.
4. Add canonical URLs for all pages on the main site, so Google will eventually remove the content from domain1.com and domain2.com due to the canonical links (a minimal example of the tag is sketched just after this list). This will take time for Google to update its index.
5. Point these domains elsewhere so the duplicate content eventually drops out. But it will take time for Google to update its index with the new, non-duplicate content.
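For option 4, the tag itself is a one-liner in the head of each main-site page. A minimal sketch using the placeholder domain from above (adjust scheme and host to whatever your canonical URL actually is):

<!-- In the <head> of www.maindomain.com/product1 -->
<link rel="canonical" href="http://www.maindomain.com/product1">

A self-referencing canonical on every main-site page tells Google which URL to keep whenever it runs into the same content under another host.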
Which of these options are best for my issue, and which ones are potentially dangerous? I would rather not point these domains elsewhere.
Any feedback would be greatly appreciated.
-
Oh, wow - if you're talking a couple of years ago and major ranking drops, then definitely get aggressive. Remove as many as possible and Robots No-index them. If you've got the Robots.txt directives in place, Google shouldn't put them back (although, from past experience, I realize "shouldn't" isn't a guarantee). If you're down 90%, you've got very little to lose and clearly Google didn't like something about that set-up.
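To be concrete, the blocking half of that is just a blanket disallow served on each duplicate domain - something like this hypothetical file. One caveat: once Google can't crawl a domain it also can't see your 301s, so this is the give-up-on-the-redirects route.

# robots.txt served at domain1.com (and each duplicate domain)
User-agent: *
Disallow: /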
Unfortunately, that's about the most drastic, reasonable option. The next step would be to start over with a fresh domain and kill all of the old domains. That could be a lot more hazardous, though.
-
Thank you, Dr. Peter.
A couple of years ago my search engine positions tanked by around 90% and have not recovered yet. At the time I assumed it was due to the duplicate content on these domains, as they were parked (not 301-redirected, just domain masking) at that point. To avoid that duplicate content problem I moved to 301 redirection. None of these domains has any link juice to speak of. Some have type-in traffic, which is what I was trying to capture, rather than link juice.
I did de-index most of the domains through Webmaster Tools in the past, but Google put them back after 90 days or so. Having the 301 redirects in place did not help that much.
If Google thinks there is a chance these 301s from new domains are abuse, I will start removing those domains completely and pointing them elsewhere, so that Google sees new content.
Thank you,
Aji Abraham
-
Ugh... 75 is a chunk. The problem is that Google isn't a huge fan of 301-redirecting a bunch of new domains, because it's been too often abused in the past by people buying up domains with history and trying to consolidate PageRank. So, it's possible that (1) they're suspicious of these domains, or (2) they're just not crawling/caching them in a timely manner, since they used to be parked.
Personally, unless there's any link value at all to these, I'd consider completely de-indexing the duplicate domains - at this point that probably does mean removal in Google Search Console and adding Robots.txt (which might be a prerequisite of removal, but I can't recall).
Otherwise, your only real option is just to give the 301-redirects time. It may be a non-issue, and Google is just taking its time. Ultimately, the question is whether these are somehow harming the parent site. If Google is just indexing a few pages but you're not being harmed, I might leave it alone and let the 301s do their work over time. I checked some headers, and they seem to be set up properly.
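If you want to spot-check the rest yourself, a few lines of script will do it. A minimal sketch in Python with the requests library (the domain list is a placeholder for your 75):

import requests

# Placeholder list - substitute the actual parked/redirected domains.
domains = ["http://domain1.com", "http://domain2.com", "http://domain3.com"]

for url in domains:
    # allow_redirects=False exposes the raw status code and Location
    # header, so a proper 301 to the main site is easy to confirm.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", resp.status_code, resp.headers.get("Location"))

Anything that answers 302, 200, or a meta-refresh page instead of a 301 is worth fixing first.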
If you're seeing harm or the wrong domains being returned in search, and if no one is linking to those other domains, then I'd probably be more aggressive and go for all-out removal.
-
Hello Dr. Peter,
Thank you for helping out.
There are around 75 domains pointing to the main website. When they were parked on the main site (prior to November 2014), they were added as additional domains, which were URL-masked. So at least 30 domains were indexed in Google with the same content as the main site.
Twelve months back, I realized the duplicate content error and changed the domain parking to 301 redirects. I also used the 'remove URL' functionality in Google Webmaster Tools. Even after 12 months, I notice a number of domains still have duplicate content in the Google index.
So I removed the pages from the addon domains again using Google Webmaster Tools. To give you an idea: my main site with the original content/links is iscripts.com, and an addon domain, socialappster.com, is pointed at a product page, iscripts.com/socialware. If you do a site:socialappster.com search in Google, you still find a few pages in the index, even though the 301 redirect has been in place for more than 12 months now. There is a similar issue with the other domains pointing to product pages as well as to the whole site.
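For reference, the redirect rule for a case like this is small. A sketch of an Apache .htaccess version, using the domains above (this assumes Apache with mod_rewrite; the actual hosting setup may differ):

# Served where socialappster.com resolves: send every path on
# that host to the product page with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?socialappster\.com$ [NC]
RewriteRule ^ http://iscripts.com/socialware [R=301,L]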
Appreciate any direction you can provide to clean this mess.
Thanks
Aji Abraham
-
Oh, and how many domains are we talking (ballpark)?
-
What was happening when they were parked - were they 302-redirected or was it some kind of straight CNAME situation where, theoretically, Google shouldn't have even seen the parked domains? Trick, of course, is that Google is a registrar, so they can see a lot that isn't necessarily public or crawlable.
Did the additional domains get indexed while parked, or after you went to 301-redirects?
Related Questions
-
Two Domains, Same Products/Content
We're an e-commerce company with two domains. One is our original company name/domain; one is a newer top-level domain. The older domain doesn't receive as much traffic but is still searched and used by long-time customers who are loyal to that brand, whom we don't want to alienate. The sites are identical in products and content, which creates a duplicate content issue. I have come across two options so far: 1. A 301 redirect from the old domain to the new one. 2. Optimize the content on the newer domain (the stronger of the two) and leave the older domain content as is. Does anyone know of a solution better than the two listed above, or have experience resolving a similar problem in the past?
-
Category Pages For Distributing Authority But Not Creating Duplicate Content
I read this interesting Moz guide: http://moz.com/learn/seo/robotstxt, which I think answered my question, but I just want to make sure. I take it to mean that if I have category pages with nothing but duplicate content (lists of other pages: h1 title, on-page description, and a link to each), and I still want the category pages to distribute their link authority to the individual pages, then I should leave the category pages in the sitemap and meta-noindex them, rather than block them in robots.txt. Is that correct? Again, I don't want the category pages to index or cause a duplicate content issue, but I do want them crawled enough to distribute their link authority to individual pages. Given the scope of the site (thousands of pages and hundreds of categories), I just want to make sure I have that right. Up until my recent efforts on this, some of the category pages have been robots.txt'd out and still in the sitemap, while others (with a different URL structure) have been in the sitemap but not robots.txt'd out. Thanks! Best, Mike
-
Joomla Duplicate Page content fix for mailto component?
Hi, I am currently working on my site and have the following duplicate page content issues: My Uni Essays http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2631849e33 My Uni Essays http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2edd30f8c6 This happens 15 times. Any ideas on how to fix this, please? Thank you
-
How would I be able to move content from one domain to another?
I have a client that wants to migrate some of his site's content to a new domain, not all of the content, just some of it. This is not an address change. He wants to continue actively using the domain name where all this content currently resides, so it's not a matter of notifying search engines of an address change. The first thing that comes to mind is the use of the canonical tag, but it's not making sense. Any recommendations? Thanks in advance.
-
Duplicate Content From Indexing of non- File Extension Page
Google somehow has indexed a page of mine without the .html extension, so they indexed www.samplepage.com/page, and I am showing duplicate content because Google also sees www.samplepage.com/page.html. How can I force Google or Bing or whoever to only index and see the page including the .html extension? I know people are saying not to use the file extension on pages, but I want to, so please anybody...HELP!!!
-
Duplicate content on the same page--is this an issue?
We are transitioning to responsive design and some of our pages will not scale properly, so we were thinking of adding the same content twice to the same URL (one version would be simple text, for mobile, and the other would include the images, etc., for the desktop version), and the content would change based on the size of the screen. I'm not looking for another technical solution (I know Google specifies that you can dynamically serve different content based on user agent); I am wondering if anyone knows whether having the exact same content appear twice on the same URL will cause a problem with SEO (any historical tests or experience would be great). Thank you in advance.
-
Wordpress Duplicate Content
We have recently moved our company's blog to WordPress on a subdomain (we utilize the Yoast SEO plugin). We are now experiencing an ever-growing volume of crawl errors (nearly 300 4xx now) for pages that do not exist to begin with. I believe it may have something to do with having the blog on a subdomain and/or the Yoast SEO plugin's indexation archives (author, category, etc.); we currently have subpages of archives and taxonomies, and category archives, in use. I'm not as familiar with WordPress and the Yoast SEO plugin as I am with other CMSs, so any help in this matter would be greatly appreciated. I can PM further info if necessary. Thank you for the help in advance.
-
Accepting RSS feeds. Does it = duplicate content?
Hi everyone, for a few years now I've allowed school clients to pipe their news RSS feeds to their public accounts on my site. The result is a daily display of the most recent news happening on their campuses that my site visitors can browse. We don't republish the entire news item; just the headline and the first 150 characters of the article, along with a 'Read more' link for folks to click if they want the full story over on the school's site. Each item has its own permanent URL on my site. I'm wondering if this is a wise practice. Does this fall into the territory of duplicate content even though we're essentially providing a teaser for the school? What do you think?