Effects of significant cross-linking between subdomains
-
A client has noticed in recent months that their traffic from organic search has been declining, little by little.
They have a large ecommerce site with several different categories of product - each product type has its own subdomain. They have some big megamenus going on, and the end result is that if you look in their Webmaster Tools for one of their subdomains, under Links to your Site, it says they have nearly 22 million links from their own domain!
The client is wondering if this is what's causing the decline in traffic, and whether they should change the whole structure of their site.
Interested to hear the thoughts of the community on this one!
-
Helen,
I know people who have had success reducing the number of links within mega menus by turning some of them (after the first two levels, for instance, though you could get much more sophisticated if you wanted) into JavaScript links. If the JavaScript is not too complex, Google will still have no trouble getting to those pages, but the links won't be hrefs and therefore won't spend PageRank on pages that are less important relative to the others. The upside is that the links are still there for users, assuming that is a good thing.
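A minimal sketch of that idea, assuming the site can render menu items client-side; `renderMenuItem`, `MAX_HREF_DEPTH`, and the URLs are illustrative names, not anything from this thread:

```javascript
// Render top-level menu items as normal <a href> links, and deeper
// levels as JS-driven elements with no href, so they stay clickable
// for users but are not standard crawlable links.

const MAX_HREF_DEPTH = 2; // keep real hrefs for the first two menu levels

function renderMenuItem(label, url, depth) {
  if (depth <= MAX_HREF_DEPTH) {
    // Normal crawlable link
    return `<a href="${url}">${label}</a>`;
  }
  // No href: navigation happens via a delegated click handler instead
  return `<span class="menu-link" data-url="${url}">${label}</span>`;
}

// In the browser, one delegated handler makes the spans navigate:
// document.addEventListener('click', (e) => {
//   const url = e.target.dataset && e.target.dataset.url;
//   if (url) window.location.href = url;
// });
```

Whether Google treats the span-based items as links depends on how the handler is wired up, so treat this as a starting point to test, not a guarantee.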
As someone else mentioned, consider whether having those links there really is good for the users, or if they'd rather see a simpler menu. The search engines are beside the point in that case.
Any time an eCommerce site experiences slow, steady traffic drops, I always look into the uniqueness of their product copy. That pattern is often a sign that they are sharing product copy with other sites, either due to using manufacturer descriptions, or by publishing feeds to 3rd party sites like Amazon, eBay, or price comparison shopping engines.
Good luck!
-
You mentioned it's a large site. Google only crawls so deep into a site, though if it's already fully indexed that detail may not matter. Have you tried blocking some of the unused pages with robots.txt and/or implementing tags like rel=canonical and/or the pagination tags?
http://googlewebmastercentral.blogspot.co.uk/2011/09/pagination-with-relnext-and-relprev.html
https://support.google.com/webmasters/answer/139394?hl=en
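For reference, the tags those two articles describe look roughly like this in the head of a paginated category page (the URLs here are purely illustrative):

```html
<!-- On page 2 of a paginated category listing -->
<link rel="canonical" href="https://shoes.example.com/category?page=2">
<link rel="prev" href="https://shoes.example.com/category?page=1">
<link rel="next" href="https://shoes.example.com/category?page=3">
```

Blocking unused sections would be separate robots.txt Disallow rules; be careful with those, since disallowed pages can't pass signals through their links.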
You could look in Google Trends for a rough idea of search volume over the years, but that won't help your site, as you mentioned. You can try tracking your SERP rank in Moz or other software like SERPBook etc.
Sounds like you've dropped down in the SERPs to me.
-
Hi Chris,
Thanks for your reply. The issue isn't that Google hasn't indexed those pages, though; it has. I'm not sure what you mean by 'Google won't index huge sites, it just doesn't have time', as it clearly does index plenty of huge sites. This site is pretty much fully indexed, so it's not that Google can't find the pages.
We have also, of course, tried using the client's Analytics to identify the issue, as you describe, but the client accidentally deleted all the historical data beyond about the six-month mark (oops), so I can't do a lot of the analysis I would normally do. I have one or two odd old printouts showing some historical Analytics and ranking data, plus their sales data, to go on, and these do tend to suggest that organic traffic has indeed dropped off (for reasons other than seasonal ones) and that there has been some decline in their search engine rankings for some key phrases. But I can't tell much more than that.
What I'm looking for is to see whether anyone else has had experience of this or a similar issue - whether anyone has seen excessive links between subdomains have a negative impact on rankings & traffic. I've been working in SEO for ten years and never come across anyone who has quite this many links within their own website, so it's not something I've encountered before.
Anyone else out there come across this before?
-
The first thing I always do is pretend I don't work for the company: go to the site as a user and see how easy it is to navigate. Can I find the product I need easily? (Try to imagine you want a specific product before going on.) Can I get back to the home page easily? I try to make sure the home page (or the main products) is only 3 clicks away (5 max). Google won't index huge sites, it just doesn't have time, so if your structure is bad it may be Googlebot giving up because it can't get all the way down to where you actually want it to go.
If you find yourself lost in "megamenus", imagine the user or Googlebot: can you reduce the menus and still achieve a good result?
Another factor could be the cause of the decline in traffic: has there been a decline in your SERP placement, or is it seasonal? Although not a permanent fix, PPC can help top up traffic to your site while you rework things a bit.
I hope some of the questions above help you look at the site in a different light. There are obviously other things it could be, but first off I would look into your SERP placement and seasonal dips. You can also use GA to look at users' drop-off points and see where they are getting bored or getting lost!
Best of luck!