Backlinks from a subdomain: can they hurt ranking?
-
I just started doing an SEO audit and noticed I have some 40,000 backlinks from an OLD version of our site that has been moved to a subdomain. The backlinks are for articles that already exist on our main site. I don't think Google is picking it up as duplicate content because that site isn't being crawled anymore. Could this hurt us SEO-wise? I plan on removing the site, but how long after it's been removed should those backlinks disappear?
-
So there were no external links pointing to the subdomain admin.site.com? If that's the case, you could probably just noindex/nofollow the thing or let it 404. You could write a .htaccess rule to redirect the old hostname to the main domain, but it's probably not worth it now that I think about it. The exception, of course, is if the subdomain had external links pointing to it.
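If you did want to go that route, the rule would look roughly like this. This is only a rough sketch, assuming the subdomain were still served by an Apache box you control with mod_rewrite enabled; admin.site.com and site.com are the placeholder names used in this thread:
# .htaccess on the server answering for admin.site.com (hypothetical example)
# send every request to the same path on the main domain with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^admin\.site\.com$ [NC]
RewriteRule ^(.*)$ https://site.com/$1 [R=301,L]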
-
Hi there,
Thanks for your reply. I'm not sure how feasible it is to redirect all of those URLs. I know I could use regex, but I just terminated the server that admin.site.com lived on, so I can't access a robots.txt file anymore. Could I simply do a generic redirect from admin.site.com to site.com?
The subdomain was the same site and domain.
Thanks.
-
Okay, so the situation here is a little unclear, but the solution should be pretty straightforward.
If admin.site.com was different from the original site domain, simply noindex/nofollow all of the pages on that domain. I recommend this over a robots.txt rule because the meta tag will actually remove the pages from the index, whereas blocking crawling alone won't. You can add a disallow-all rule in robots.txt later, once the site is completely deindexed.
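For reference, the tag itself is a single line in the head of every page on the subdomain. A sketch only; the markup is standard, but how you roll it out across 40,000 pages depends on the old site's templates:
<!-- in the <head> of every page on admin.site.com -->
<meta name="robots" content="noindex, nofollow">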
If admin.site.com was the same domain, I'd recommend redirecting all of those pages to the new URLs and then, if you still want a copy for reference, launching it as a noindexed/nofollowed version blocked with robots.txt, though I'm not sure why it needs to exist for reference at all. If the subdomain was different from the old site, you could also probably just noindex/nofollow all of it without the redirect. It's not best practice, but it's not that big a deal.
Hope this helps.
-
If you took the website down, you don't really have to do anything. Go to Search Console and do a Fetch as Google on the old admin subdomain so Google understands it's not there anymore, and then just wait. Google will drop those backlinks.
-
The other thing I should note is that these links do not show in Google when searching for the topic. I'm only seeing them as reported backlinks, since the admin.site.com subdomain was blocked from being crawled by Google for search results. Not sure if that makes a difference.
-
It says I can only demote 100 links, but I need to demote upwards of 40,000, since admin.site.com basically mirrored the actual site.
Now I'm a little confused.
I took the old site down, so I can't use a robots.txt file there anymore.
So how can I disallow the entire admin subdomain and stop those backlinks from being reported?
-
I see. Read my response below and just use meta robots; it will help you out.
If you want to deindex those backlinks, you can also look into Google Search Console's demotion tool, but I don't think it's necessary.
-
Don't do that; a disallow in robots.txt will NOT resolve the indexing issue! What you need to use is a meta robots tag: noindex, nofollow. Watch this Whiteboard Friday on the subject:
-
Thanks
-
Hi there. Thanks for your response.
The pages exist on the new site, but the subdomain should never have been indexed. I noticed the backlinks in Google Search Console initially, then confirmed with SEO PowerSuite.
Basically we had site.com, then created a brand-new site and migrated the content over to newsite.com with 301 redirects from the old site. Then we wanted to keep the old site up for reference, so we put it at admin.site.com. That is where all 40,000 backlinks were coming from: admin.site.com, the old site.
There is no reason for us to redirect admin.site.com, since the original articles were properly redirected. I guess, however, that somehow when the old site was taken down, Google must have still indexed it at the subdomain and counted those as backlinks.
-
Just make sure you add a robots.txt to the subdomain with:
User-agent: *
Disallow: /
Or, if the old site is not needed anymore, redirect the subdomain to your main domain and remove the site.
-
Hi there.
So, all the pages those backlinks are coming from no longer exist? Have they been redirected? Do they return 404s? Also, how did you find them: in Google Search Console or another tool?
If you found them in Google Search Console, and the original pages have indeed been removed and properly redirected, then it's just a reporting delay on GSC's side. Otherwise (if those pages are still crawlable), you should fix it.
Hope this makes sense.