Duplicate website pages indexed: Rankings dropped. Does Google check for duplicate domain association?
-
Hi all,
Our duplicate website, which we use for testing new optimisations, got indexed and we dropped in rankings. But I'm not sure this is the exact reason, because it happened earlier too and I didn't see much of a drop in rankings then. I've also been told in the past that duplicate content won't really impact the original website, only the duplicate one. I think that rule applies to third-party websites. But if our own domain has exact duplicate content, will Google know that we own both websites through some other association, like IP addresses and servers, and work out that the duplicate website is hosted by us? I wonder how Google treats duplicate content from third-party domains versus our own domains.
Thanks
-
Hi John,
Thanks for the response. I agree with you about using "rel=canonical". But there are too many pages to add these tags manually. Is there any other way to implement this?
Thanks
-
It's never a good idea to rely on the search engines "figuring out" confusing or contradictory signals from your website, vtmoz. While they claim they can do it in some cases, in reality they very frequently get it wrong. (That said, I've never seen any indication that Google makes any effort to find connecting ownership signals between duplicate sites when deciding whether to index one. That would be well outside the scope of the crawler.)
So yes, it's entirely possible that the wholly duplicate website is causing ranking and authority issues. Assuming that by a site for "testing new optimisations" you mean a development website for testing new code before deploying to the live site, you'll need to place the dev site behind a password-protected login, or at the very least add noindex meta tags to every page. Once those are in place, you can use Google Search Console to request removal of the site from the index, which should hopefully speed things up. It's possible that when you had the duplicate site before, it didn't get fully indexed, but this time it did, which would explain why you're seeing more problems now. And having a duplicate site in the index can definitely cause problems for the original site as well as the duplicate.
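Before requesting removal in Search Console, it's worth verifying that the noindex tag really is present on every dev-site page. A minimal stdlib-only sketch (the function and class names here are illustrative, not from any particular tool) that checks whether a page's HTML carries a noindex robots meta tag:

```python
from html.parser import HTMLParser


class RobotsMetaChecker(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""

    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = dict(attrs)
            if attr_map.get("name", "").lower() == "robots":
                self.robots_directives.append(attr_map.get("content", "").lower())


def is_noindexed(html: str) -> bool:
    """Return True if the page carries a robots meta tag containing 'noindex'."""
    checker = RobotsMetaChecker()
    checker.feed(html)
    return any("noindex" in directive for directive in checker.robots_directives)
```

Fetching each dev-site URL (e.g. with `urllib.request`) and passing the HTML to `is_noindexed` confirms the exclusion is actually in place on every page, not just the templates you remembered to edit.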
There's no guarantee that removing the duplicate site from the index will immediately resolve all the problems you're having, but leaving it in the index is just too big a risk. You'll still have to look deeper into what else may be causing or exacerbating the ranking issues.
Paul
-
This can be partially resolved by setting up "rel=canonical" tags pointing at the website that you want Google to index. This is particularly useful when duplicate content is live. It is basically a hint to the search engine that says "Look here!"
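When there are too many pages to tag by hand, the canonical tag is normally generated once in the shared page template rather than added page by page. A minimal sketch, assuming a hypothetical live domain of `www.example.com`; it also strips query strings and fragments so tracking parameters don't spawn extra canonical variants:

```python
from urllib.parse import urljoin, urlsplit

# Hypothetical domain of the site you want Google to index.
CANONICAL_ROOT = "https://www.example.com"


def canonical_tag(page_url: str) -> str:
    """Build a rel=canonical tag pointing any page (including dev-site
    duplicates) at the equivalent URL on the live domain."""
    parts = urlsplit(page_url)
    # Keep only the path; drop scheme/host, query string, and fragment.
    return '<link rel="canonical" href="%s">' % urljoin(CANONICAL_ROOT, parts.path)
```

Dropped into the shared header template, one line like this covers every page at once. The same value can also be sent as an HTTP `Link: <URL>; rel="canonical"` header, which covers non-HTML resources such as PDFs.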
Also, a better strategy for testing landing pages may be to use a tool called Google Optimize, which allows you to run A/B and multivariate tests without a separate domain.
Hopefully this helps!
John