Duplicate website pages indexed, rankings dropped: does Google check for a duplicate-domain association?
-
Hi all,
Our duplicate website, which we use for testing new optimisations, got indexed and we dropped in rankings. But I'm not sure this is the exact reason, because it happened earlier too and I didn't see much of a ranking drop then. I've also been told in the past that duplicate content won't really impact the original website, only the duplicate one. I think that rule applies to third-party websites. But if our own domain has exact duplicate content, will Google know we own both websites through some other association, like shared IP addresses or servers, and work out that the duplicate website is hosted by us? I wonder whether Google treats duplicate content on third-party domains differently from duplicates on domains we own.
Thanks
-
Hi John,
Thanks for the response. I agree with you about using "rel=canonical", but there are too many pages to add these tags manually. Is there any other way to implement this?
Thanks
-
It's never a good idea to rely on the search engines "figuring out" confusing or contradictory signals from your website, vtmoz. While they claim they can do it in some cases, in reality they very frequently get it wrong. (That said, I've never seen any indication from Google that it makes any effort to find connecting ownership signals between duplicate sites when deciding whether to index a duplicate. That would be way outside the scope of the crawler.)
So yes, it's entirely possible that the wholly duplicate website is causing ranking and authority issues. Assuming that by a site for "testing new optimisations" you mean a development website for testing new code before deploying to the live site, you'll need to get the dev site placed behind a password-protected login, or at the very least have noindex meta tags added to every page. Once those are in place, you can use Google Search Console to request removal of the site from the index to hopefully speed things up. It's possible that when you had the duplicate site before, it didn't get fully indexed, but this time it did, which would explain why you're seeing more problems now. And having a duplicate site in the index can definitely cause problems for the original site as well as the duplicate.
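As a quick way to audit whether the dev site's pages actually carry that noindex directive, here's a minimal sketch. It only inspects an HTML string; in practice you'd fetch each dev-site URL first, and you'd also want to check the `X-Robots-Tag` response header, which this sketch doesn't cover:

```python
# Sketch: detect a robots "noindex" directive in a page's HTML,
# for auditing a dev site before and after adding the meta tags.
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # Look for <meta name="robots" content="...noindex...">
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

def has_noindex(html: str) -> bool:
    checker = RobotsMetaChecker()
    checker.feed(html)
    return checker.noindex

print(has_noindex('<head><meta name="robots" content="noindex,nofollow"></head>'))  # True
print(has_noindex('<head><meta name="description" content="hi"></head>'))  # False
```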
There's no guarantee removing the dupe site from the index will immediately resolve all the problems you are having, but leaving the dupe site in place in the index is just too big a risk. You'll still have to look deeper into what else may be causing or exacerbating the ranking issues.
Paul
-
This can be partially resolved by setting up "rel=canonical" tags that point to the version of the site you want Google to index. This is particularly useful when duplicate content is live. It's basically a hint to the search engine that says "look here!"
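For sites with too many pages to tag by hand, the canonical can also be served as an HTTP `Link` header from the server or application layer rather than edited into every template. A minimal sketch of building that header value (the domain name is a placeholder):

```python
# Sketch: emit rel="canonical" as an HTTP Link header value instead of
# editing every page template by hand. CANONICAL_HOST is a placeholder.

CANONICAL_HOST = "https://www.example.com"

def canonical_link_header(request_path: str) -> str:
    """Build the Link header value pointing at the canonical (live) URL."""
    # Strip the query string so tracking/sort parameters collapse
    # to a single canonical URL.
    path = request_path.split("?", 1)[0]
    return f'<{CANONICAL_HOST}{path}>; rel="canonical"'

print(canonical_link_header("/rolling-stools?sort=price"))
# <https://www.example.com/rolling-stools>; rel="canonical"
```

A middleware or server rule can then attach this header to every response on the duplicate site, which achieves the same hint as the in-page tag.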
Also, a better strategy for testing landing pages may be a tool called Google Optimize, which lets you run A/B and multivariate tests without needing a separate domain.
Hopefully this helps!
John
Related Questions
-
Does Google push a site down for not ranking top for its branded keywords?
Hi all, Usually websites rank for their own branded keywords, but sometimes third-party websites outrank them for those terms. If there are a lot of queries where a website is not ranking at the top for its branded keywords, does Google push the website down in overall rankings? Is there any correlation? Thanks
Algorithm Updates | vtmoz
-
Google indexing https sites by default now, where's the Moz blog about it!
Hello and good morning / happy Friday! Last night someone sent me an article from, of all places, VentureBeat, titled "Google Search starts indexing and letting users stream Android apps without matching web content". As I read it I got a bit giddy, since we had just implemented a full sitewide HTTPS certificate rather than a cart-only SSL. I quickly searched for other sources to see if this was indeed true, and the writing on the wall seems to indicate so:
Google Webmaster Blog - http://googlewebmastercentral.blogspot.in/2015/12/indexing-https-pages-by-default.html
http://www.searchenginejournal.com/google-to-prioritize-the-indexing-of-https-pages/147179/
http://www.tomshardware.com/news/google-indexing-https-by-default,30781.html
https://hacked.com/google-will-begin-indexing-httpsencrypted-pages-default/
https://www.seroundtable.com/google-app-indexing-documentation-updated-21345.html
I found it a bit ironic to read about this on mostly unsecured sites. I wanted to share the eight key conditions Google will factor in when indexing HTTPS pages from now on, and see what you all think. Google will now begin to index HTTPS equivalents of HTTP web pages, even when the former don't have any links to them. However, Google will only index an HTTPS URL if it meets these conditions:
It doesn't contain insecure dependencies.
It isn't blocked from crawling by robots.txt.
It doesn't redirect users to or through an insecure HTTP page.
It doesn't have a rel="canonical" link to the HTTP page.
It doesn't contain a noindex robots meta tag.
It doesn't have on-host outlinks to HTTP URLs.
The sitemap lists the HTTPS URL, or doesn't list the HTTP version of the URL.
The server has a valid TLS certificate.
One condition that confuses me a bit is: "It doesn't redirect users to or through an insecure HTTP page." Does this mean that if you just moved over to HTTPS from HTTP, your site won't pick up the HTTPS boost, since most sites have HTTP redirects to HTTPS? Thank you!
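A few of those conditions can be spot-checked directly in a page's markup. Here's a rough sketch covering only the markup-visible checks (an HTTP canonical, a noindex robots meta tag, and insecure script/image dependencies); it does not test redirects, sitemaps, or the TLS certificate, and the example HTML is invented:

```python
# Sketch: flag markup-level issues from the HTTPS indexing conditions above.
from html.parser import HTMLParser

class HttpsReadinessCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            # Condition: no rel="canonical" link to the HTTP page.
            if a.get("href", "").startswith("http://"):
                self.problems.append("rel=canonical points at the HTTP page")
        elif tag == "meta" and a.get("name", "").lower() == "robots":
            # Condition: no noindex robots meta tag.
            if "noindex" in a.get("content", "").lower():
                self.problems.append("noindex robots meta tag present")
        elif tag in ("script", "img") and a.get("src", "").startswith("http://"):
            # Condition: no insecure (mixed-content) dependencies.
            self.problems.append(f"insecure dependency: {a['src']}")

def audit(html: str) -> list:
    checker = HttpsReadinessCheck()
    checker.feed(html)
    return checker.problems

page = ('<link rel="canonical" href="http://example.com/">'
        '<script src="http://cdn.example.com/a.js"></script>')
for problem in audit(page):
    print(problem)
```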
Algorithm Updates | Deacyde
-
Why do in-site search result pages rank better than my product pages?
Maybe this is a common SERP for a generic product type, but I'm seeing it a lot more often. An example is the SERP for "rolling stools": the top 4 results are dynamic in-site search pages from Sears, eBay and Amazon (among others). I understand their influence and authority, but why would a search return a dynamic in-site SERP instead of a solid product page? A better question would be: how do I get my in-site SERPs to rank, or how do I get my client's page to rise above the #5 spot it currently ranks at? Thanks
Algorithm Updates | BenRWoodard
-
Client's site dropped completely from Google - AGAIN! Please help...
OK guys - hoping someone out there can help... (kinda long, but I wanted to be sure all the details were out there.)
This has already happened once - I even posted in here about it: http://www.seomoz.org/q/client-s-site-dropped-completely-for-all-keywords-but-not-brand-name-not-manual-penalty-help
The guy was a brand new client. All we did was tweak title tags and add a bit of content to his site, since most of it was generic boilerplate text, and we started on our keyword and competitor research. In just a week, from title tag and content tweaks alone, he went from ranking on page 4-5 to page 3-4. Then, as we sat down to really optimize his site - POOF - he was gone from the Googs. He only showed up in "site:" searches and for exact matches of his business name; everything else was gone.
I posted in here and on WMT and had several people check it out, both local guys and people from here (thanks to John Doherty for trying!), but no one could figure out any reason why it would have happened. We submitted a reconsideration request, explaining that we knew we hadn't violated any quality guidelines, that he had fewer than 10 backlinks so it couldn't be bad linking, and that we had hardly touched the site. They sent back a canned response a week later saying there was no manual penalty and that we should "check our content" - mysteriously, the site started to show back up in the SERPs that morning (we got the canned response in the afternoon).
There WAS an issue with NAP mismatch on some citations, but we fixed that, and that shouldn't have contributed to a complete disappearance anyway. SO - the site was back, and back at its page 3 or 4 position. We decided to leave it alone for a few days just to be sure we didn't do anything... and then, just 6 days later, when we were sitting down to fully optimize the site - POOF - completely gone again.
We do SEO for a lot of different car dealers all over the country, and I know our strategies work. Looking at the competition in his market, he should easily be ranked page 2 or 3 with the very minimal tweaking we did. AND, since we didn't change anything after he came back, it makes even less sense that he was visible for a week and then gone again.
So, mozzers... anybody got any ideas? I'm really at a loss here - it makes zero sense that he's completely gone except for his biz name. If nothing else, he should be ranking for "used cars canton"... Definitely appreciate any help anyone can offer.
Algorithm Updates | Greg_Gifford
-
When did Google start factoring the "results per page" display setting into their rankings?
It looks like the change took place approx. 1-2 weeks ago. Example: for a search for "business credit cards" with search settings at "never show instant results" and "50 results per page", the SERP has a total of only 5 different domains in the top 10 (4 domains have multiple results). With the slider set to "10 results per page", there are 9 different domains, with only 1 having multiple results. I haven't seen any mention of this change - did I just miss it? Are they becoming that blatant about forcing as many page views as possible for the sake of serving more ads?
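For anyone wanting to quantify that domain-diversity difference, a small sketch that counts results per host from a list of SERP URLs (the URLs here are invented for illustration):

```python
# Sketch: measure domain diversity in a set of search results by
# counting how many results each host contributes.
from collections import Counter
from urllib.parse import urlparse

def domain_diversity(urls):
    """Count results per host, ignoring a leading 'www.'."""
    hosts = [urlparse(u).netloc.removeprefix("www.") for u in urls]
    return Counter(hosts)

serp = [
    "https://www.card-issuer-a.com/business",
    "https://www.card-issuer-a.com/compare",
    "https://www.card-issuer-b.com/business-cards",
]
print(domain_diversity(serp))
# Counter({'card-issuer-a.com': 2, 'card-issuer-b.com': 1})
```

Running it against the top 10 with each results-per-page setting would make the difference the poster describes directly comparable.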
Algorithm Updates | BrianCC
-
Google site links on sub pages
Hi all, I had a look for info on this one but couldn't find much. I know these days that if you have a decent domain, Google will often automatically show sitelinks for your homepage when someone searches for your company name, but has anyone seen these links appear for sub-pages? For example, let's say I had a .com domain with /en, /fr and /de subfolders, each SEO'd for its location. If domain.com/en/ were no. 1 in Google for my company in the UK, would I be able to get sitelinks under it, or does this only work on the 'proper' homepage, domain.com/? A client of mine wants to reorganise their website so they have different location sections ranking in different markets, but they also want to keep having sitelinks, as they like the look of them. Thanks, Carl
Algorithm Updates | Grumpy_Carl
-
Domain Name search in google not appearing
My HCG domain doesn't show up in Google web search, though it does show up in news and image search. If I wrap the domain name in quotes, it shows up.
Algorithm Updates | noork
-
Google showing different pages for same search term in uk and usa
Hi guys, I have an interesting question and think Google is being a bit strange. Can anyone tell me why, when I input the term "design agency" in Google.co.uk, it shows one page, but when I type the same search term into Google.com (worldwide search), it shows another page? Any ideas, guys? Isn't this a bit strange? Any help here would be much appreciated. Thanks, Gareth
Algorithm Updates | GAZ09