How do we take down a subdomain that is receiving many spammy backlinks?
-
Hi all,
We have a subdomain that has had low engagement for the last few years. Over time, many spammy backlinks have pointed to this subdomain, though there are relevant backlinks too. We have deleted most of the pages that contained spammy content or attracted spammy backlinks. Still, I'm unsure whether to take this subdomain down or keep it. I'm torn between "the relevant backlinks might be helping our website" and "the spammy backlinks might be causing a drop in rankings."
Thanks
-
Hi vtmoz,
OK. You can upload a file containing all the domains you want to disavow — you don't need to do it one by one. Checking thousands of links manually isn't something anyone wants to do, for sure.
Here's how you could do it: export all your links from Webmaster Tools to a file, disavow them all, and then remove from that file the couple of dozen domains you know are strong and valuable.
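To make that concrete, here is a minimal sketch in Python of that "disavow everything except a keep-list" step. The URLs, domains, and keep-list below are made-up examples, not real data — you'd feed in your own exported link list:

```python
from urllib.parse import urlparse

def build_disavow(link_urls, keep_domains):
    """Turn an exported list of linking URLs into disavow-file lines,
    skipping the domains you want to keep."""
    domains = set()
    for url in link_urls:
        host = urlparse(url).netloc.lower()
        # Treat www.example.com and example.com as the same domain
        if host.startswith("www."):
            host = host[4:]
        domains.add(host)
    # Drop the strong, valuable domains from the disavow set
    to_disavow = sorted(domains - set(keep_domains))
    # Google's disavow format uses one "domain:" line per domain
    return "\n".join("domain:" + d for d in to_disavow)

# Hypothetical exported links
links = [
    "http://spammy-site.example/page1",
    "https://www.link-farm.example/dir/page",
    "https://respected-blog.example/post",
]
print(build_disavow(links, keep_domains=["respected-blog.example"]))
```

This keeps the manual work down to curating the short keep-list instead of reviewing thousands of individual links.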
Cheers,
Cesare
-
Hi Cesare,
But there are too many backlinks from different subdomains — how are we going to check thousands of links to disavow them? I think that's hard to do.
-
Hi vtmoz,
Simply disavow the links that are spammy: https://support.google.com/webmasters/answer/2648487?hl=en. That's it. By doing that, you tell Google which links not to take into account, and the "good" ones will still benefit your subdomain. There is no need to take the subdomain down.
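For reference, a disavow file (as described on the Google help page linked above) is just a plain text file with one domain or URL per line; lines starting with "#" are comments. The domains below are made-up examples:

```text
# Spammy domains pointing at our subdomain
domain:spammy-directory.example
domain:link-farm.example
# A single spammy page, rather than a whole domain
http://blog.example/spam-comment-page.html
```

You upload this file once through the Disavow Links tool, and you can replace it later if you find more spammy domains.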
Hope this helps.
Cheers,
Cesare