After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Hello Micey123,
That sounds good except you should put the sitemap reference for xyz.abcd.com within that subdomain's robots.txt file as well: xyz.abcd.com/robots.txt, as each subdomain should have its own robots.txt file.
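For example, assuming the subdomain's sitemap lives at the usual location (the paths here are illustrative), xyz.abcd.com/robots.txt might look like this:

```
# Served at https://xyz.abcd.com/robots.txt
User-agent: *
Allow: /

# Sitemap reference for this subdomain only
Sitemap: https://xyz.abcd.com/sitemap.xml
```

The main domain's robots.txt would then reference only its own sitemap, and each subdomain's file would reference its own.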
Hi Christian,
I don't see any evidence of the site being deindexed now. Here are some things I checked for you, along with a few observations:
Nothing in the robots.txt file, robots meta tags, or X-Robots-Tag HTTP response headers would keep these pages from being indexed by Google
The rel="canonical" tags appear to be functioning properly
The home page is indexed and not duplicated by other indexed pages
Google has about 86 pages indexed from your domain
Hreflang tags appear to be implemented properly
There are only about 50 links pointing to the domain from other sites; the ones from Moz are the best of the bunch, and most of the rest are just random scraper sites (harmless, but annoying).
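To illustrate the first check in the list above, here is a minimal sketch of testing a page's response headers and HTML for a noindex directive. The function name and input shapes are my own invention for illustration, not part of any crawling library:

```python
import re

def is_blocked_from_indexing(headers: dict, html: str) -> bool:
    """Return True if a noindex directive appears in the X-Robots-Tag
    response header or in a robots meta tag in the HTML."""
    # Header check (note: real header lookups should be case-insensitive;
    # kept simple here for illustration)
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    # Meta tag check: <meta name="robots" content="... noindex ...">
    for tag in re.findall(r"<meta[^>]+name=['\"]robots['\"][^>]*>",
                          html, re.IGNORECASE):
        if "noindex" in tag.lower():
            return True
    return False
```

If this returns False for both the header and the markup (and robots.txt doesn't disallow the URL), indexing isn't being blocked at the page level.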
Sometimes Google ranks a brand higher when it first comes out because it's a chicken-or-egg situation: how else can they collect data for their machine to chew on unless some traffic is sent to a new site? We used to call this phenomenon "the Google sandbox" a long time ago, and in its effect this is essentially the same thing. We do it ourselves with A/B testing and paid advertising: you have to spend some budget to gain enough data to know what's working and what isn't.
I don't think you have a technical SEO problem here. I think you need to continue building a brand and producing useful, rich content. Good luck!
Hi Suarezventures,
I typically draw the subdomain vs top-level domain line at whether the two sites / experiences and purposes are vastly different. For example, a site like blogspot that hosts different websites on subdomains, or a brand that has a forum community on a subdomain because it runs on a different server and has a much different purpose than the main domain.
Ideally, if you're moving to WordPress you'd have the content and the store on the same site (e.g. https://site.com). If this isn't possible for them, having one or the other on a subdomain would be better than having them on (Squarespace?).
What about having the new site on a subdomain so you don't have to deal with migrating the existing site? Can't you leave it there and put up store.site.com on WP?
Thanks for the clarification on the platform Suarezventures.
I have worked with plenty of brands that have a similar setup on Shopify. They usually put the blog on a subdomain because Shopify's content management system - let's see, how do I say this nicely... sucks. These clients put up WordPress on a subdomain. Some also put up a landing-page platform like HubSpot or Unbounce to which they send paid traffic.
Your plan to put the eCommerce site on a subdomain has some benefits in that the content side won't be affected by future platform migrations on the eCommerce site. However, the content side will benefit the most from being at the main level with the homepage and most of the backlinks. Thus, organic search traffic to the eCommerce site could be harmed by this move. I normally wouldn't recommend it for that reason (because the business is eCommerce, which is what pays for the content) but in your case, it sounds like the eCommerce site doesn't bring in much traffic as it is.
Good luck. Let us know how it turns out.
"My theory is that the uplift generated by the internal linking is subsequently mitigated by other algorithmic factors relating to content quality or site performance or..."
I think your initial analysis of the situation is right. Look to improve user-experience, conversion rates and interactions on those pages and try your experiment again.
I don't like using bounce rate as a metric for this for several reasons, but if you use Time On Site, Pages Per Visit, or track interactions, such as when they scroll past 50% of the page or click a button... There are plenty of ways to gauge whether your changes are providing a better experience for visitors from search results, which in turn should be roughly the same thing that pleases the algorithm.
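As a rough sketch of what gauging those metrics looks like, here is a small function that computes pages per visit and time on site from raw analytics events. The event shape and names are assumptions for illustration, not any specific analytics API:

```python
from collections import defaultdict

def engagement_metrics(events):
    """events: list of (session_id, timestamp_seconds, page) tuples.
    Returns (avg_pages_per_visit, avg_time_on_site_seconds)."""
    sessions = defaultdict(list)
    for session_id, ts, page in events:
        sessions[session_id].append((ts, page))
    pages, durations = [], []
    for hits in sessions.values():
        hits.sort()  # order each session's hits by timestamp
        pages.append(len({page for _, page in hits}))       # unique pages seen
        durations.append(hits[-1][0] - hits[0][0])          # last hit - first hit
    n = len(sessions)
    return sum(pages) / n, sum(durations) / n
```

Comparing these numbers for organic-search visitors before and after a change gives you a cleaner signal than bounce rate alone.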
Hello Pascal,
Like Hillary Clinton, I have a public opinion and a private opinion when it comes to stuff like this. The public opinion is also Google's, and that is to use a rel="nofollow" attribute on widget links. They are considered links that webmasters use to manipulate the search engine rankings. Yes, that video is old, but the rule still stands as far as Google is concerned.
My private opinion is that widgets are a form of branding, and it is not a webmaster's responsibility to do anything other than get their brand discovered far and wide. You created a widget that, if people are using it, probably provides some value to them. Why should you get any less credit for this than you would get from someone linking to the widget on your site?
If you are going to keep the links followable, my advice is to keep the anchor text branded and the href pointing to your home page. This is the least likely to seem like link-graph manipulation. Avoid deep-links, unless they go to the widget download page, and avoid optimized anchor text. Use "YourDomain.com" or "Your Brand" instead.
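As a sketch (the domain is a placeholder), the widget's embed credit link would look like one of these two variants, depending on whether you follow Google's guidance:

```
<!-- Branded anchor text, pointing at the home page -->
<a href="https://yourdomain.com/">YourDomain.com</a>

<!-- The same link with Google's recommended nofollow attribute -->
<a href="https://yourdomain.com/" rel="nofollow">YourDomain.com</a>
```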
I'll leave the question open for more input since this isn't a question that necessarily has a single "right" or "wrong" answer.
Hello Mat,
I don't think I'm seeing the same SERPs as you. Is there any way you could give me an example of one of these subdomains?
And yes, you're absolutely right that the same problem of keyword cannibalization would apply to subdirectories as well.
If it's the woltersk....lu domain, I get non-secure warnings from Firefox when I try to access it.
How many different subdomains are there / will there be? Is it just shop.domain.lu and www.domain.lu or are there others? I didn't see any for "courses." or "software." in the SERP example you provided with the link. If it's just one, I think that's manageable. For example, maybe www. could focus on informational queries (e.g. JavaScript course) and shop. could focus on transactional ones (e.g. Buy Acme JavaScript course). Maybe one could focus on reviews and comparisons, or long-tail queries while the other focuses on short-tail queries. Without knowing more about the domains and your business, it is difficult for me to say. If you have three or four subdomains all going after the same keywords, that's definitely a problem and I don't think you can avoid cannibalization. At that point, it would be best to choose the strongest domain/subdomain and focus your efforts on ranking one of them instead of watering down your efforts over several.
Hello Rubix,
Saijo gave you some great advice, but I'm concerned about the fact that you have that page in the first place, and that it produces those URL parameters. It suggests to me that instead of showing a 404 error on the contact-office.aspx page (assuming that page doesn't exist at that URL) you are redirecting the user who tries to access that URL to the /404.html page (e.g. /404.html?aspxerrorpath=/contact-office.aspx).
Typically you want the 404 HTTP status code to be returned on the URL the user is unsuccessfully trying to access. In this case, instead of redirecting them to your "404 page URL," you would want to show your customized 404 message (and ensure it returns a 404 status code, which you can verify with an HTTP header checker) at www.yourdomain.com/contact-office.aspx.
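Since the URLs end in .aspx, I'm assuming the site runs on IIS/ASP.NET. If so, one way to serve the custom 404 content at the requested URL while keeping the 404 status code is the httpErrors element in web.config (paths here are illustrative):

```
<!-- web.config: execute /404.html server-side at the requested URL,
     returning a 404 status code instead of redirecting -->
<configuration>
  <system.webServer>
    <httpErrors errorMode="Custom" existingResponse="Replace">
      <remove statusCode="404" subStatusCode="-1" />
      <error statusCode="404" path="/404.html" responseMode="ExecuteURL" />
    </httpErrors>
  </system.webServer>
</configuration>
```

The key detail is responseMode="ExecuteURL", which runs the error page server-side so the browser's address bar (and the status code) stays on the URL the visitor requested.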
I hope this makes sense to you. If not, feel free to ask for clarification.