Product Subdomain Outranking "Marketing" Domains
-
Hello, Moz community!
I have been puzzling over what to do for a client. Here is the challenge.
The client's "product"/welcome page lives at www.client.com. This page allows the visitor to select the country/informational site they want OR to log in to their subdomain/install of the product.
Google is choosing this www.client.com URL as the main result for client brand searches.
In a perfect world, searchers in the US would be served the client's US version of the information/marketing site, which lives at https://client.com/us, and so on for other country-level content (each also living in its own country directory).
It's a brand new client, we've done geo-targeting within Search Console, and I'm kind of scared to rock the boat by de-indexing this www.client.com welcome screen.
Any thoughts, ideas, or potential solutions are much appreciated.
THANKS!
-
Thanks! Such a great answer.
-
You are very right to be worried about rocking that particular boat. If you de-index a page, it basically nullifies its SEO authority. Since the page you would nullify is a homepage-level URL (you gave the example www.client.com), this would basically be SEOicide.
Most other pages on your site probably get most of their SEO authority and ranking power from your homepage, directly or indirectly (e.g. homepage linking to sub-page, vs. homepage linking to category, which then links to sub-page).
This is because it's almost certain that your homepage is the URL which has gained the most links from across the web. People are lazy; they just pick the shortest URL when linking. I'm not saying you don't have good deep links, just that most of the good ones are probably hitting the homepage.
So if you nullify the homepage's right to hold SEO authority, what happens to everything underneath the homepage? Are you imagining an avalanche right now? That's right: this would be one of the worst possible ideas in the universe. Write it down, print it out, and burn it.
Search Console-level geo-targeting is for whole sites, not pages or (usually, though there can be exceptions) sections; you know that, right? What it does is tell Google which country you want the website (the whole property you have selected) to rank in. It basically stops that property from ranking well globally and gives minor boosts in the selected location. If you just took your homepage-level property and told Google it's US-only now, prepare to kiss most of your other traffic goodbye (hard lesson). If you were semi-smart, added /us/ as a separate property, and only set the geo-targeting to US for that property, breathe a sigh of relief. It likely won't solve your issue, but it won't be a complete catastrophe either (phew!)
Really, the only decent tool you have to direct Google to rank individual web pages for regions and/or languages is the hreflang tag. These tags tell Google: "hey, you landed on me and I'm a valid page, but if you want to see versions of me in other languages, go to these other URLs through my hreflang links." Hreflangs only work if they are mutually agreed: both pages must contain mirrored hreflangs pointing to each other, and neither page may give multiple URLs for a single language/location combination (or for a language or location in isolation).
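That "mutually agreed" rule is the bit people most often get wrong, so here's a minimal sketch of what the reciprocity check amounts to. The URLs are hypothetical, and each page's hreflang annotations are represented as a simple dict (on a real page they would be `<link rel="alternate" hreflang="..." href="...">` tags in the head):

```python
def hreflang_is_reciprocal(pages: dict) -> bool:
    """pages maps each URL to its {hreflang_code: target_url} annotations.
    Returns True only if every annotated URL points back at its annotator."""
    for url, annotations in pages.items():
        # Note: using a dict per page also enforces the "one URL per
        # language/location code" rule, since duplicate codes would collide.
        for code, target in annotations.items():
            target_annotations = pages.get(target)
            if target_annotations is None:
                return False  # the annotated URL isn't in our page set at all
            if url not in target_annotations.values():
                return False  # the target page does not mirror the link back
    return True

# Hypothetical example: two country pages that mirror each other
# (each page also self-references, as hreflang sets conventionally do).
pages = {
    "https://client.com/us": {"en-us": "https://client.com/us",
                              "en-gb": "https://client.com/uk"},
    "https://client.com/uk": {"en-gb": "https://client.com/uk",
                              "en-us": "https://client.com/us"},
}
print(hreflang_is_reciprocal(pages))  # True: the tags are mutually agreed
```

If the UK page dropped its `en-us` entry, the whole set would fail the check, which is roughly the situation where Google starts ignoring the tags.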
The problem is, even if you do everything right, Google really has to believe "yes, this other page is another version of exactly the same page I'm looking at right now." Google can do things like take the main content of both URLs, put each into a single string, then compute the fuzzy similarity of the two content strings to find the 'percentage' of content similarity. That's how I check content similarity; Google does something comparable, but probably infinitely more elegant and clever. In the case of hreflangs, translation of the strings is probably also applied before comparison.
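For illustration, here is a minimal sketch of that kind of percentage check using Python's standard-library `difflib` (the example strings are invented, and this is just one way to approximate similarity, not Google's actual mechanism):

```python
from difflib import SequenceMatcher

def content_similarity(a: str, b: str) -> float:
    """Rough percentage similarity between two pages' main content strings."""
    return SequenceMatcher(None, a, b).ratio() * 100

# Hypothetical main-content extracts from the two page types in question:
functional = "Select your country or log in to your account to download the product."
marketing = "Discover why thousands of teams choose our product to grow faster."

# A functional page and a marketing page score far below two near-identical pages.
print(round(content_similarity(functional, marketing), 1))
print(round(content_similarity(functional, functional), 1))  # 100.0
```

If a comparison like this comes back low, that is roughly the machine-eye view under which Google would decide the two URLs are not "versions of the same page".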
If Google's mechanical mind thinks that the pages are very different, then it will simply ignore the hreflang (just as Google will not pass SEO authority through a 301 redirect if the contents of the old and new pages are highly dissimilar in machine terms).
This is a fail-safe Google has to stop people from moving high rankings on 'useful' or 'proven' (via hyperlinks) URLs onto less useful, or less proven, pages (which, by Google's logic, should have to re-prove their worth if the content is very different). Remember, what a human thinks is similar is irrelevant here. You need to focus on what a machine would find similar (those can be VERY different things).
So even if you do it all properly and use hreflangs, since the nature of the pages is very different (one is functional and helps users navigate, log in, and download something, which is very useful; the other is sales-focused, and marketing content is usually thin), it's unlikely that Google will swallow your intended URL serves.
You'd be better off making the homepage include some marketing elements and making the marketing URLs include some of the functional elements. If both pages do both things well and are essentially the same, then hreflangs might actually start to work.
If you want to keep the marketing URLs pure sell, fine, but then they will only be useful as paid-traffic landing pages (e.g. from Google Ads, Pinterest Ads, or Facebook Ads), where you can connect your ad to the advertorial (marketing) URLs. People expect ads to land on marketing-centric pages; they don't expect (or necessarily want) that from regular web searches. The channel (SEO) is called 'organic' for a reason!