Duplicate website pages indexed and rankings dropped: does Google check for duplicate domain association?
-
Hi all,
Our duplicate website, which is used for testing new optimisations, got indexed and we dropped in rankings. But I'm not sure whether this is the exact reason, as it happened earlier too and I didn't see much of a drop in rankings then. I've also been told in the past that duplicate content won't really impact the original website, only the duplicate one, but I think that rule applies to third-party websites. If our own domain has exact duplicate content, will Google know that we own the duplicate website through other ways we are associated, like IP addresses and servers, and work out that the duplicate website is hosted by us? I wonder how Google treats duplicate content from third-party domains versus our own domains.
Thanks
-
Hi John,
Thanks for the response. I agree with you about using "rel=canonical", but there are too many pages to add these tags to manually. Is there any other way to implement this?
Thanks
-
It's never a good idea to rely on the search engines "figuring out" confusing/contradictory signals from your website, vtmoz. While they claim they can do it in some cases, in reality, they very frequently get it wrong. (Though I've never seen any indication from Google that it makes any effort to find any connecting ownership signals between dupe sites when deciding whether to index a dupe. That would be way outside the scope of the crawler.)
So yes, it's entirely possible that the wholly duplicated website is causing ranking and authority issues. Assuming what you mean by a site for "testing new optimisations" is a development website for testing new code before deploying to the live site, you'll need to get the dev site placed behind a password-protected login, or at the very least, have noindex meta tags added to every page. Once those are in place, you can use Google Search Console to request removal of the site from the index to hopefully speed things up. It's possible that when you had the dupe site before, it didn't get fully indexed, but this time it did, which would be why you're seeing more problems now. And having a dupe site in the index can definitely cause problems for the original site, as well as the dupe.
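If editing every page to add that meta tag isn't practical, the same noindex signal can be sent as an X-Robots-Tag HTTP header on every response from the dev host. Here's a minimal sketch, assuming (and this is only an assumption) the dev site runs on a Python/WSGI stack - any other web server has an equivalent header directive:

```python
class NoIndexMiddleware:
    """WSGI middleware that asks search engines not to index any response.

    Google honours the X-Robots-Tag header the same way it honours a
    noindex meta tag, and a header covers every URL (HTML, PDFs, images)
    from one place instead of requiring per-page edits.
    """

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        def noindex_start_response(status, headers, exc_info=None):
            # Add the directive to whatever headers the app already set.
            headers.append(("X-Robots-Tag", "noindex, nofollow"))
            return start_response(status, headers, exc_info)

        return self.app(environ, noindex_start_response)


# On the dev server only, wrap the existing WSGI application:
# application = NoIndexMiddleware(application)
```

The key point is that this goes on the dev host only - ship it to the live site by mistake and you'll deindex that too.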
There's no guarantee removing the dupe site from the index will immediately resolve all the problems you are having, but leaving the dupe site in place in the index is just too big a risk. You'll still have to look deeper into what else may be causing or exacerbating the ranking issues.
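And before filing that removal request in Search Console, it's worth confirming every dev URL really is serving the directive. A quick sketch of such a check - dev.example.com and the URL list are placeholders for your actual staging site:

```python
import requests

# Placeholder staging URLs - swap in the real dev-site pages.
DEV_URLS = [
    "https://dev.example.com/",
    "https://dev.example.com/about/",
]

for url in DEV_URLS:
    resp = requests.get(url, timeout=10)
    # noindex can arrive as an HTTP header...
    header_ok = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    # ...or as a meta robots tag (a crude string test, fine for a spot check).
    body = resp.text.lower()
    meta_ok = 'name="robots"' in body and "noindex" in body
    print(("OK      " if header_ok or meta_ok else "MISSING ") + url)
```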
Paul
-
This can be partially resolved by setting up "rel=canonical" tags pointing to the pages on the website that you want Google to index. This is particularly advantageous when there is live duplicate content. It is basically a hint to the search engine that says "Look here!!".
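If there are too many pages to tag by hand, the tag can usually be added in one place - the shared <head> template of whatever CMS the site runs on. Failing that, here's a rough sketch of bulk-tagging static HTML files with Python and BeautifulSoup (the folder name and www.example.com are placeholder assumptions, not your real setup):

```python
from pathlib import Path

from bs4 import BeautifulSoup  # pip install beautifulsoup4

SITE_ROOT = Path("public_html")          # placeholder: folder of static pages
LIVE_DOMAIN = "https://www.example.com"  # the version you want indexed

for page in SITE_ROOT.rglob("*.html"):
    soup = BeautifulSoup(page.read_text(encoding="utf-8"), "html.parser")
    # Skip pages with no <head> or with a canonical tag already in place.
    if soup.head is None or soup.head.find("link", rel="canonical"):
        continue
    href = LIVE_DOMAIN + "/" + page.relative_to(SITE_ROOT).as_posix()
    soup.head.append(soup.new_tag("link", rel="canonical", href=href))
    page.write_text(str(soup), encoding="utf-8")
    print("tagged", page)
```

Each page ends up with a <link rel="canonical" href="..."> pointing at its live-site counterpart, which is exactly the "Look here!!" hint described above.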
Also, a better strategy for testing landing pages may be a tool called Google Optimize, which lets you run A/B and multivariate tests without having a separate domain.
Hopefully this helps!
John
-
Related Questions
-
Log-in page ranking instead of homepage due to high traffic on login page! How to avoid?
Hi all, Our log-in page is ranking in the SERPs instead of our homepage, and sometimes both pages rank for the primary keyword we targeted. We have even dropped. I am looking for a solution to this. Three points to consider: 1) our log-in page is the most visited page and top landing page on the website, and whether or not the primary keyword appears on this page, the same scenario continues; 2) the log-in page is the first link bots touch when crawling any page of our website, as it's linked in the top navigation menu; 3) if we move the log-in page to a subdomain, will it work? I'm worried that we'll lose a lot of traffic to our website, which will be taken away by the log-in page subdomain. Please guide us with your valuable suggestions. Thanks
Algorithm Updates | vtmoz
-
New Website Old Domain - Still Poor Rankings after 1 Year - Tagging & Content the culprit?
I've run a live wedding band in Boston for almost 30 years, and the site used to rank very well in organic search. I was hit by the Panda updates in August 2014, and rankings literally vanished. I hired an SEO company to rectify the situation and create a new WordPress website, which launched January 15, 2015. I kept my old domain: www.shineband.com. Rankings remained pretty much non-existent. I was then told that 10% of my links were bad. After lots of grunt work, I sent in a disavow request in early June via Google Webmaster Tools. It's now mid-October, and rankings have remained pretty much non-existent. Without much experience, I got Moz Pro to help take control of my own SEO and identify some problems (over 60 pages of medium-priority issues: title tag character length and meta description). Some helpful reports by www.siteliner.com and www.feinternational.com both mentioned a duplicate content issue. I had old blog posts from a different domain (now 301 redirecting to the main site) migrated to my new website's internal blog, http://www.shineband.com/best-boston-wedding-band-blog/ as suggested by the SEO company I hired. It appears that by doing that, the older blog posts show as pages in the back end of WordPress with the poor meta and title issues AS WELL AS probably creating a primary reason for the duplicate content issues (with links back to the site). Could this most likely be viewed as spamming or an (unofficial) SEO penalty? As SEO companies far and wide daily try to persuade me to hire them to fix my ranking, I can't say I trust much. My plan: put most of the old blog posts into the trash via WordPress, rather than try to optimize each page (over 60) by adjusting tagging, titles and duplicate content. Nobody really reads a quick post from 2009... I believe this could be beneficial and that those pages are more hurtful than helpful. Is that a bad idea, not knowing if those pages carry much juice? I realize my domain authority is not great. No grand expectations, but is this a good move? What would be my next step afterwards - some kind of resubmitting of the site? This has been painful, business has fallen, and I can't throw more dough at this. THANK YOU!
Algorithm Updates | Shineband
-
Diluting your authority - does adding pages dilute the rankings of other pages?
I'm looking after a site that has around 400 pages. All of these pages rank pretty well for the KWs they are targeting. My question is: if we add another 400 pages without doing any link building work, holding DA the same, 1) would the rankings of those 400 previously good pages diminish? And 2) would the new pages, as more and more new ones are created, rank less and less well?
Algorithm Updates | xoffie
-
Why do I have 7 URLs from the same domain ranking on the 1st page?
I have a client that has individual pages for authorized dealers of their product (say "Car Dealers"). When you search for "brand name + location", Google returns 7 "dealership" pages from the parent company's domain as the first 7 results, but there is one that gets pushed off to the 5th page of the SERPs. The formatting of the content, geo-targeting, and metadata is identical on every single one. None of them have external links, and there is nothing clearly distinguishable that would explain why the one page doesn't get placed on that first SERP. Why is the one getting pushed so far down? I know this may be a bit confusing, but any thoughts would be greatly appreciated. Thanks!
Algorithm Updates | MichaelWeisbaum
-
Is it still possible for small businesses to rank well in Google?
Hi, I've been playing around with ecommerce sites for a few years now, and although I am no expert, I'm not a complete novice either. We used to do quite well in Google, but recent changes have halved our number of hits. I have noticed that over the last year Google has given priority to large brand names as opposed to relevancy. For example, if you search for the term 'bridal jewellery' (Google UK) you will see that, apart from one or two, the majority of placements are taken by big companies who offer very little bridal jewellery - one or two pages at most. My question is: is it still possible to rank well against these brand names, or has Google made it impossible for small companies? PS: we only practice ethical SEO as suggested by SEOmoz. Any help or advice is greatly appreciated. Thanks www.kerryblu.co.uk
Algorithm Updates | Dill
-
When did Google start factoring the results-per-page setting into their rankings?
It looks like the change took place approx. 1-2 weeks ago. Example: A search for "business credit cards" with search settings at "never show instant results" and "50 results per page", the SERP has a total of 5 different domains in the top 10 (4 domains have multiple results). With the slider set at "10 results per page", there are 9 different domains with only 1 having multiple results. I haven't seen any mention of this change, did I just miss it? Are they becoming that blatant about forcing as many page views as possible for the sake of serving more ads?
Algorithm Updates | BrianCC
-
Server Down for a Few Hours - Went from Page 1 to Page 6?
We were on Page 1 - our server went down for about 4-6 hours and then we dropped to page 6. Would the server being down for this amount of time affect our position? Any advice would be much appreciated.
Algorithm Updates | webdesigncwd
-
Shouldn't Google always rank a website for its own unique, exact 10+ word content, such as a whole sentence?
Hello fellow SEOs, I'm working with a new client who owns a property-related website in the UK. Recently (May onwards) they have experienced significant drops in nearly all non-domain/brand-related rankings, from page 1 to page 5+ or worse (please see the attached Webmaster Tools traffic graph). The 13th of June seemed to have the biggest drop (UK Panda update?). When we copy and paste individual 20+ word sentences from within top-level content, Google does bring up exact results - the content is indexed - but the client's site nearly always appears at the bottom of the SERPs. Even very new or small 3-4 page domains that have clearly copied all of their content outrank the original content on the client's site. As I'm sure you know, this is very annoying for the client! And this even happens when Google's cache date (that appears next to the results) for the client's content is clearly older than the other results! The only major activity was the client utilising Google's optimiser tool, which redirected traffic to various test pages; these tests finished in June. Details about the client's website: the domain has been around for 4+ years; the website doesn't have a huge amount of content, around 40 pages (I would consider 50% original, 20% thin and 30% duplicate - working on fixing this); there haven't been any significant sitewide or page changes; Webmaster Tools shows nothing abnormal or any error messages (some duplicate meta/title tags that are being fixed); all the pages of the site are indexed by Google; domain/page authority is above average for the niche (around 45 for the domain in OSE); there are no ads of any kind on the site; and there are no special scripts or anything fancy that could cause problems. I can't seem to figure it out. I know the site can be improved, but such a severe drop, where even very weak domains are outranking us, suggests a penalty of some sort? Can anyone help me out here?
Algorithm Updates | Qasim_IMG
The 13th of June seemed to have the biggest drop (UK Panda update???) When we copy and paste individual +20 word sentences from within top level content Google does bring up exact results, the content is indexed but the clients site nearly always appears at the bottom of SERP's. Even very new or small, 3-4 page domains that have clearly all copied all of their content are out ranking the original content on the clients site. As I'm sure know, this is very annoying for the client! And this even happens when Google’s cache date (that appears next to the results) for the clients content is clearly older then the other results! The only major activity was the client utilising Google optimiser which redirects traffic to various test pages. These tests finished in June. Details about the clients website: Domain has been around for 4+ years The website doesn't have a huge amount of content, around 40 pages. I would consider 50% original, 20% thin and 30% duplicate (working on fixing this) There haven’t been any signicant sitewide or page changes. Webmaster tools show nothing abnormal or any errors messages (some duplicate meta/title tags that are being fixed) All the pages of the site are indexed by Google Domain/page authority is above average for the niche (around 45 in for the domain in OSE) There are no ads of any kind on the site There are no special scripts or anything fancy that could cause problems I can't seem to figure it out, I know the site can be improved but such a severe drop where even very weak domains are out ranking suggests a penalty of some sort? Can anyone help me out here? hxuSn.jpg0