How to take down a subdomain that is receiving many spammy backlinks?
-
Hi all,
We have a subdomain that has had low engagement for the last few years. Over time, many spammy backlinks have pointed to this subdomain, though there are relevant backlinks too. We have deleted most of the pages that contained spammy content or attracted spammy backlinks. Still, I'm unsure whether to take this subdomain down or keep it. I'm torn between "the relevant backlinks might be helping our website" and "the spammy backlinks are causing a drop in rankings."
Thanks
-
Hi vtmoz,
OK. You can upload a single file containing all the domains you want to disavow; you don't need to do it one by one. Checking thousands of links individually is certainly not something anyone wants to do.
Here's how you could do it: disavow them all (export the full list from Webmaster Tools to a file), then delete the couple of dozen domains you know are strong and valuable.
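The "disavow everything, then carve out the good domains" workflow above can be sketched in a few lines of Python. This is a minimal sketch, assuming you have a plain list of backlink URLs exported from Webmaster Tools; the example URLs and the whitelist of trusted domains are hypothetical placeholders you would replace with your own.

```python
# Sketch: turn a list of backlink URLs into disavow-file lines,
# skipping any domain you know is strong and valuable.
from urllib.parse import urlparse

# Hypothetical examples of domains you want to KEEP (not disavow)
whitelist = {"trustedpartner.com", "industrynews.org"}

def build_disavow(link_urls, keep=whitelist):
    """Extract unique referring domains and emit 'domain:' disavow lines,
    excluding whitelisted domains."""
    domains = set()
    for url in link_urls:
        host = urlparse(url).netloc.lower()
        # Normalize away a leading "www." so variants collapse to one entry
        host = host[4:] if host.startswith("www.") else host
        if host and host not in keep:
            domains.add(host)
    return ["domain:" + d for d in sorted(domains)]

links = [
    "http://spammy-site.example/page1",
    "https://www.badlinks.example/dir/page",
    "https://trustedpartner.com/review",  # whitelisted, will be skipped
]
print("\n".join(build_disavow(links)))
# → domain:badlinks.example
# → domain:spammy-site.example
```

You would then review the resulting file by hand before uploading it, since the whitelist is the only thing protecting your valuable links.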
Cheers,
Cesare
-
Hi Cesare,
But there are too many backlinks from different subdomains. How are we going to check thousands of links to disavow them? I think that's too hard to go through.
-
Hi vtmoz,
Simply disavow the links that are spammy: https://support.google.com/webmasters/answer/2648487?hl=en. That's it. By doing that, you tell Google which links not to take into account, and the "good" ones will still benefit your subdomain. There is no need to take the subdomain down.
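For reference, the disavow file Google's help page describes is plain UTF-8 text with one entry per line: lines starting with `#` are comments, `domain:` entries disavow an entire domain, and a bare URL disavows a single page. The domains and URL below are hypothetical examples only:

```text
# Spammy domains pointing at our subdomain (examples only)
domain:spammy-site.example
domain:badlinks.example
# A single page to disavow without dropping its whole domain
http://borderline-site.example/spam-page.html
```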
Hope this helps.
Cheers,
Cesare
Related Questions
-
White Label Subdomain Competing with Top Level Domain?
Hi All, We have a top-level domain that is a comparison site for companies in our industry. We also manage a white-label website for a specific company in the same industry, which was originally set up as a subdomain. In other words, we have "example.com" and "companyname.example.com." The sites are treated as separate websites: the subdomain site isn't filling a role like a subfolder would. It has its own branding, navigation/URL structure, etc. Since these sites are in the same industry, there is obviously a huge overlap in the keywords we want each to rank for. In fact, 100% of the keywords for the subdomain are targets for the top-level domain. My question is: are we hurting ourselves in Google rankings by having two sites under the same top-level domain competing for the same keywords? We want both sites to be as successful as possible. Would we be better served by kicking the subdomain out into a new top-level domain? Thanks!
Algorithm Updates | Rodrigo-DC0
Domain Authority Distribution Across the Web
Does anyone have stats for domain authority distribution across the entire web? E.g., what percentage of websites fall in the DA ranges 0-25, 26-50, 51-75, and 76-100?
Algorithm Updates | Investis_Digital2
Why do we have so many pages scanned by bots (over 250,000) and our biggest competitors have about 70,000? Seems like something is very wrong.
We are trying to figure out why last year we had a huge (80%) and sudden (within two days) drop in our Google search traffic. The only outlier we can find on our site is a huge number of pages reported in Moz as scanned by search engines. Is this a problem? How did we get so many pages reported? What can we do to bring the number of scanned pages back to a normal level? BT
Algorithm Updates | achituv0
Are links from directories still good practice?
Ok, so I am new at "link building"... which, of course, I have read furiously about; the philosophy has changed: it's a goal, not so much a process. I am focusing on great content, social sharing, etc. BUT, I see competitors still getting links from some of the directories I have found listed on Moz as "good" directories to list in, for example Yellow Pages, Manta, iBegin, Hotfrog, etc. Do I have the terminology totally twisted here? Is it still good practice to get a couple of links from these directories, or is this practice completely the wrong thing to do post-Panda and Penguin? Thanks guys!
Algorithm Updates | cschwartzel0
SEO for sub-locations for specific services
Hey Guys, I am currently creating a website for my business that will be marketed heavily through SEO. I live in NYC, and I'd like to rank for the individual locations such as Queens, Brooklyn, and Long Island, and eventually, once my domain authority and other long-haul metrics kick in, NYC itself. What I find very tiring is targeting these locations all separately; it means I need to create the same site four times with completely different and unique content. Should this setup work for me? And is there a risk that Google will see four web design pages and basically say that, even though the content is unique, I'm ranking for web design with a location too many times? From my understanding this is not a problem now, but is it a future risk? It also becomes extremely difficult to handle site navigation with about pages, contact pages, and other pages that either have to be duplicated or all shown in the sidebar for navigation. Please share your thoughts with me. Thanks!!!
Algorithm Updates | tonyr70
Effect of disavowing the bad links
Hi, My website was affected in the recent Google updates. I have a feeling that this was due to bad links. I have analyzed all my existing links and am either removing the bad links or disavowing them. 1. When can I see the effect/results of this activity?
Algorithm Updates | adiez12341
Too Many On-Page Links
After running a site analysis on here, it says that I have a lot of pages with too many on-page links and that this might be why the site is being penalized. The thing is, I am not sure how to remedy this, as one page that supposedly has 116 links is this one: http://www.whosjack.org/10-films-with-some-crazy-bitches/ although there is only one link in the body. Then again, our home page, http://www.whosjack.org, has 165, which again it says is too many. Surely it doesn't count every link on the page; otherwise every news homepage would be penalized? For example, what would happen here on this home page: http://www.dazeddigital.com/? Can anyone help me see what I am missing? Are there possible hidden links anywhere I should be looking for, etc.? Thanks
Algorithm Updates | luwhosjack0
How important are links after Panda?
I have noticed that the sites in my niche that were at the top of the SERPs are still at the top of the SERPs after Panda. I have also heard people theorizing that links are no longer important; it's now all about bounce rates, time on site, etc. Is there any consensus about how important links are after Panda? Thx, Paul
Algorithm Updates | diogenes1