To allow or disavow, that is the question!
-
We're in the middle of a disavow process and we're having some difficulty deciding whether or not to disavow links from Justia.com and prweb.com - justia.com alone is giving us 23,000 links with just 76 linked pages. So, to allow, or disavow? That's the question!
What do you think guys?
Thank you.
John.
-
Hey John
If you decide to take action, then being aggressive with the links is a good approach. Both Cyrus Shepard's great Moz blog post on the disavow tool and Google's own advice say that if you suspect an entire domain is spammy, you should go ahead and disavow all of it.
However, from my own perspective, I would only go through and create a disavow file if I knew for sure that I was suffering from a manual or algorithmic penalty. I have seen very little benefit in being proactive with that tool (e.g. rankings are good, you spot bad links in your link profile and disavow them to be safe) and, in fact, I have seen a number of cases where a disavow was submitted "prematurely" - i.e. a site was ranking fine, disavowed some links, and then saw rankings fall.
If we want to look at it from a slightly skeptical point of view - if you're not suffering from a Google penalty, do you really want to inform Google that you have suspicious links in your profile?
However, that is a matter of preference based on my own experience. I would certainly take note of the links you think are bad (and perhaps put together a file ready to go, just in case).
It's worth noting that prweb.com has made all of its links nofollow anyway, so as they're not passing on link equity it doesn't seem logical to disavow them (as they have no SEO benefit). Also, keep in mind that if you visit the page and the link is not there - and especially if you do a Google search for cache:http://www.example.com and see that the cached version contains no link - there's a very good chance that the link has already been discounted anyway and so would not be flagged in a manual or algorithmic check. Seeing as you have so many links from those domains, that may well be what's occurring.
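If you do put that file together ready to go, the format is straightforward: a plain UTF-8 text file with one entry per line, where a bare URL disavows a single link, a domain: prefix disavows everything from that domain, and lines starting with # are comments. A minimal sketch - the entries below just reuse a domain from the question plus a placeholder URL, and whether to include anything at all is your call:

```text
# disavow.txt - illustrative sketch only
# Disavow every link from an entire domain
domain:justia.com
# Disavow one specific URL
http://www.example.com/spammy-page.html
```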
Hope this helps
-
Has Google notified you of the need to disavow links in Webmaster Tools? Usually, there's a message about unnatural links on the Manual Actions page.
I've never preemptively disavowed links. Maybe that's wrong. But then again, no single site is giving us 23,000 links.
Related Questions
-
Sitemap and content question
This is our primary sitemap: https://www.samhillbands.com/sitemaps/sitemap.xml. We have about 750 location-based URLs that aren't currently linked anywhere on the site: https://www.samhillbands.com/sitemaps/locations.xml. Google is indexing most of those URLs because we submitted the locations sitemap directly for indexing. Thoughts on that? Should we just create a page that contains all of the location links and make it live on the site? Should we remove the locations sitemap from separate indexing because of duplicate content?

| # | Sitemap | Type | Processed | Issues | Items | Submitted | Indexed |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | /sitemaps/locations.xml | Sitemap | May 10, 2016 | - | Web | 771 | 648 |
| 2 | /sitemaps/sitemap.xml | Sitemap index | May 8, 2016 | - | Web | 862 | 730 |
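For reference, here's roughly what an entry in the primary sitemap index looks like when it references the locations sitemap - a sketch using the paths from the question, with a placeholder lastmod date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per child sitemap; a child listed here is
       discovered from the index without needing a separate submission -->
  <sitemap>
    <loc>https://www.samhillbands.com/sitemaps/locations.xml</loc>
    <lastmod>2016-05-10</lastmod>
  </sitemap>
</sitemapindex>
```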
-
Schema question
Hi all, We have two Trustpilot schemas (LocalBusiness) on our web pages (one on desktop, one on mobile), but we are finding that the number of reviews is not updating in the search results. When using the tool https://developers.google.com/structured-data/testing-tool/, the test results come back OK. I have two ideas as to why it may not be working: 1) the duplication of the schema code is causing issues; 2) we had to change the HTML code for all of our 50+ backend pages using a search-and-replace WordPress plugin to save a vast amount of time, so maybe this is plugin related? The fact that the Google testing tool gives back positive results adds to the confusion. I'll test both of the theorised issues to see if either provides a fix. Can anyone shed some further light on this? Is there something obvious I am missing? All responses are greatly appreciated! Thanks, Tom. P.S. Example page: https://www.allcleartravel.co.uk/asthma-travel-insurance/
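If the duplication theory is the problem, one way to rule it out is to serve a single consolidated LocalBusiness block (for example as JSON-LD in the page head) that both the desktop and mobile layouts share. A rough sketch - the name and rating figures below are placeholders, not real values:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "AllClear Travel",
  "url": "https://www.allcleartravel.co.uk/",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "1000"
  }
}
```

Wrapped in one script tag of type application/ld+json, this replaces the two separate blocks and should validate in the testing tool in the same way.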
-
Content Cannibalism Question with example
Hi, Since I love writing and I write a lot, I always find myself worried about ruining things for myself with content cannibalism. Yesterday, while looking to learn about diamonds, I encountered a highly ranked website that has two pages ranking high on the first page simultaneously (4th and 5th) - I had never noticed that before with Google. The term I googled was "vvs diamonds" and the two pages were: http://bit.ly/1N51HpQ and http://bit.ly/1JefWYS Two questions: 1. Does that happen often with Google (presenting two listings from the same site on the first page)? 2. Would it be better practice for the writer to combine them, creating one more powerful page? Thanks
-
SEO question
Hi there! I'm the SEO manager for 5 Star Loans. I have two city pages running. We are running our business in two locations, Berkeley, CA and San Jose, CA, and for those offices we've created two Google listings with separate Gmail accounts. Berkeley (http://5starloans.com/berkeley/) ranks well in Berkeley in Google Maps and shows on the first page in organic results. However, the second city page, San Jose (http://5starloans.com/san-jose/), doesn't show in the Maps local pack results and also doesn't rank well in organic results. Both of them have authentic backlinks and reviews. It has been a year already and it's high time we knew the problem 🙂 Any comment would be helpful. Thanks a lot.
-
XML sitemaps questions
Hi All, My developer has asked me some questions that I do not know the answer to. We have both searched for an answer but can't find one, so I was hoping that the clever folk on Moz can help! Here are a couple of questions it would be nice to clarify: What is the actual address/file name for a news XML sitemap? Can XML sitemaps be generated on request? Consider the following scenario: a spider requests http://mypage.com/sitemap.xml, which permanently redirects to the extensionless MVC 4 page http://mypage.com/sitemapxml/, and that page generates the XML. Thank you, Amelia
-
Duplicate Content Question
Brief question - SEOmoz is telling me that I have duplicate content on the following two pages: http://www.passportsandvisas.com/visas/ and http://www.passportsandvisas.com/visas/index.asp. The default page for the /visas/ directory is index.asp, so it's effectively the same page - but apparently SEOmoz, and more importantly Google, treat these as two different URLs. I read about 301 redirects, but in this case there aren't two physical HTML pages - so how do I fix this?
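Because both URLs are served by the same physical file, the redirect has to happen at the server level rather than between two pages. Here's a sketch of a web.config rule - this assumes the site runs on IIS with the URL Rewrite module installed, which is an assumption rather than something stated in the question:

```xml
<!-- web.config sketch (assumes IIS + URL Rewrite module) -->
<system.webServer>
  <rewrite>
    <rules>
      <!-- 301 /visas/index.asp to the directory URL so only one version gets indexed -->
      <rule name="Canonicalise visas index" stopProcessing="true">
        <match url="^visas/index\.asp$" />
        <action type="Redirect" url="/visas/" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

If a server-level rule isn't practical, adding a link rel="canonical" tag pointing at http://www.passportsandvisas.com/visas/ to the head of index.asp is a simpler way to tell Google which of the two URLs to treat as the original.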
-
Site changes lead to big questions
I'm making some changes to my business that will cause me to move my blog to a new domain. The existing site will serve as a sales campaign for our full-service programs, and I want to keep visitors focused on that campaign. The old site will serve much like a mini site, with a sales letter and video sales letter. While planning the move of the blog content, I found a post from Rand from a few years ago: http://www.seomoz.org/blog/expectations-and-best-practices-for-moving-to-or-launching-a-new-domain. The way I wanted to approach this was to remove the content from the old site and then resubmit the sitemap to Google for indexing. Of course they'll notice that the blog pages are gone (probably a load of 404s). After perhaps a week, I'd repost the content (about 50 posts) on the new domain, which will be little more than a blog. I'd like some input on the way to approach this. Should I... a) Follow Rand's formula? b) Go with my idea (sort of the brute force model)? c) Consider an alternative method? It's probably worth mentioning that none of these posts have high search engine rankings. I appreciate your input, Mozzers!