How do I take down a subdomain that is receiving many spammy backlinks?
-
Hi all,
We have a subdomain that has had low engagement for the last few years. Over time, many spammy backlinks have accumulated pointing to it, though there are relevant backlinks too. We have deleted most of the pages that contained spammy content or attracted spammy backlinks. Still, I'm unsure whether to take this subdomain down or keep it. The dilemma is between "the relevant backlinks might be helping our website" and "the spammy backlinks might be causing a drop in rankings."
Thanks
-
Hi vtmoz,
OK. You can upload a single file containing all the domains you want to disavow; you don't need to do it one by one. Checking thousands of links by hand is certainly not something anyone wants to do.
Here's how you could approach it: disavow them all (export the full link list from Webmaster Tools to a file), then remove the couple of dozen domains you know are strong and valuable before uploading.
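To illustrate that workflow, here is a rough sketch in Python. The domain names, the whitelist, and the function name are all hypothetical; in practice the input would be the link export from Webmaster Tools:

```python
# Rough sketch of the "disavow everything except a whitelist" workflow.
# All domain names and the function name here are hypothetical; the real
# input would be the link export from Google Webmaster Tools.
from urllib.parse import urlparse

def build_disavow(urls, keep_domains):
    """Collapse backlink URLs to unique root hosts and format a disavow
    list, skipping the domains you know are strong and valuable."""
    domains = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]  # treat www.foo.example and foo.example the same
        if host and host not in keep_domains:
            domains.add(host)
    lines = ["# Generated disavow list"] + [f"domain:{d}" for d in sorted(domains)]
    return "\n".join(lines)

print(build_disavow(
    ["http://spam1.example/page", "https://www.spam2.example/x", "https://good.example/"],
    {"good.example"},
))
```

Run against the real export, the output file is what you would upload to the disavow tool after a final manual review.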
Cheers,
Cesare
-
Hi Cesare,
But there are too many backlinks from different domains; how are we supposed to check thousands of links to decide which ones to disavow? That seems hard to do.
-
Hi vtmoz,
Simply disavow the links that are spammy: https://support.google.com/webmasters/answer/2648487?hl=en. That's it. By doing that you tell Google which links not to take into account, and the "good" ones will still benefit your subdomain. There is no need to take the subdomain down.
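For reference, the disavow file described at that link is just a plain-text list, one entry per line: a full URL to disavow a single page, or a `domain:` prefix to disavow every link from a site. Lines starting with `#` are comments. A minimal example (the domains here are placeholders):

```text
# Spammy pages and domains to disavow
http://spam.example.com/bad-page.html
domain:spammydomain.example
domain:another-spam-site.example
```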
Hope this helps.
Cheers,
Cesare