Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Backlinks from a subdomain: can they hurt ranking?
-
I just started doing an SEO audit and noticed some 40,000-odd backlinks from an OLD version of our site that has been moved to a subdomain. The backlinks are for articles that already exist on our main site. I don't think Google is picking it up as duplicate content, because that site isn't being crawled anymore. Could this hurt us SEO-wise? I plan on removing the site, but how long after it's been removed should those backlinks disappear?
-
So there were no external links pointing to the subdomain admin.site.com? If that's the case, you could probably just noindex/nofollow the thing or let it 404. You could write an .htaccess rule to rewrite the domain name, but it's probably not worth it, now that I think about it. The exception, of course, is if the subdomain had external links pointing to it.
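For what it's worth, a catch-all rewrite of that sort is only a few lines in Apache. This is an illustrative sketch using the placeholder hostnames from this thread, not a tested config for the poster's actual server:

```apache
# .htaccess on the subdomain: send every request to the same path
# on the main site with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^admin\.site\.com$ [NC]
RewriteRule ^(.*)$ https://site.com/$1 [R=301,L]
```

The `RewriteCond` guards against the rule firing on other hostnames served by the same vhost; `R=301,L` makes the redirect permanent and stops further rule processing.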
-
Hi there,
Thanks for your reply. I'm not sure how feasible it is to redirect all of those URLs. I know I could use regex, but I just terminated the server that admin.site.com lived on, so I can't access a robots.txt file anymore. Could I simply do a generic redirect from admin.site.com to site.com?
The subdomain was the same site and domain.
Thanks.
-
Okay, so the situation here is a little unclear, but the solution should be pretty straightforward.
If admin.site.com was different from the original site domain, simply noindex/nofollow all of the pages on that domain. I recommend this over a robots.txt rule because it will actually remove them from the index. You can add a disallow-all rule in robots.txt later, once the site is completely deindexed.
If admin.site.com was the same domain, I'd recommend 301 redirecting all of those pages to the new URLs, and then launching a noindex/nofollow version (blocked with robots.txt once it's out of the index), though I'm not sure why it needs to exist for reference. If the subdomain was different from the old site, you could also probably just noindex/nofollow all of it without the redirect. It's not best practice, but it's not that big a deal.
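Since the subdomain mirrors the main site path-for-path, the "generic redirect" being discussed amounts to a host swap that preserves each URL's path and query string. A minimal sketch in Python, using the thread's placeholder hostnames:

```python
from urllib.parse import urlsplit, urlunsplit

def rewrite_host(url, old_host="admin.site.com", new_host="site.com"):
    """Swap the hostname while preserving scheme, path, query, and fragment."""
    parts = urlsplit(url)
    if parts.netloc != old_host:
        return url  # leave unrelated URLs untouched
    return urlunsplit((parts.scheme, new_host, parts.path, parts.query, parts.fragment))

print(rewrite_host("https://admin.site.com/articles/seo-audit?page=2"))
# https://site.com/articles/seo-audit?page=2
```

The same one-to-one mapping is what a server-side catch-all redirect would perform request by request, so no per-URL regex list is needed.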
Hope this helps to answer.
-
If you took the website down, you don't really have to do anything. Go to Search Console and do a Fetch as Google on the old admin subdomain, so Google understands that it's not there anymore, and then just wait. Google will take those backlinks down.
-
The other thing I should note is that these pages do not show in Google when searching for the topic. I'm only seeing the reported backlinks because the admin.site.com subdomain was blocked from Google crawling it for search results. Not sure if that makes a difference.
-
It says I can only demote 100 links, and I need to demote upwards of 40,000, since admin.site.com basically mirrored the actual site.
Now I'm a little confused.
I took the old site down, so I can't use a robots.txt file there anymore.
So how can I disallow the entire admin subdomain and stop those backlinks from being reported?
-
I see. Read my response below and just use meta robots; it will help you out.
If you want to deindex those backlinks, you can also look into Google Search Console's demotion tool, but I don't think it's necessary.
-
Don't do that; a disallow in robots.txt will NOT resolve the indexing issue! What you need to use is meta robots: noindex, nofollow. Watch this Whiteboard Friday on the subject:
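For clarity, the meta robots tag being recommended goes in the `<head>` of every page you want dropped from the index. Google has to be able to crawl a page to see the tag, which is exactly why a robots.txt block defeats the purpose. An illustrative snippet:

```html
<!-- on each page of the subdomain you want removed from the index -->
<head>
  <meta name="robots" content="noindex, nofollow">
</head>
```

Only after the pages have actually dropped out of the index would a robots.txt disallow make sense, as mentioned elsewhere in this thread.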
-
Thanks
-
Hi there. Thanks for your response.
The pages exist on the new site, but the subdomain should never have been indexed. I noticed the backlinks in Google Search Console initially, then confirmed with SEO PowerSuite.
Basically we had site.com, then created a brand-new site and migrated the content over to newsite.com, with 301 redirects from the old site. Then we wanted to keep the old site up for reference, so we put it at admin.site.com. That is where all 40,000 backlinks were coming from: admin.site.com, the old site.
There is no reason for us to redirect admin.site.com, since the original articles were properly redirected. I guess, however, that somehow when the old site was taken down, Google must have still indexed it at the subdomain and counted those as backlinks.
-
Just make sure you add a robots.txt to the subdomain with:
User-agent: *
Disallow: /
Or, if the old site is not needed anymore, redirect the subdomain to your main domain and remove the site.
-
Hi there.
So, the pages those backlinks are coming from no longer exist? Have they been redirected? Do they return 404s? Also, how did you find them: in Google Search Console or another tool?
If you found them in Google Search Console, and the original pages have indeed been removed and properly redirected, then it's just a reporting delay in GSC. Otherwise (if those pages are still crawlable), you should fix it.
Hope this makes sense.