To allow or to disavow, that is the question!
-
We're in the middle of a disavow process and we're having some difficulty deciding whether or not to disavow links from justia.com and prweb.com. Justia.com alone is giving us 23,000 links pointing at just 76 pages. So, to allow, or to disavow? That is the question!
What do you think, guys?
Thank you.
John.
-
Hey John
If you decide to take action, then being aggressive with the links is a good approach. Both Cyrus Shepard's great Moz blog post on the disavow tool and Google's own advice say that if you suspect an entire domain is spammy, you should go ahead and disavow all of it.
However, from my own perspective, I would only go through and create a disavow file if I knew for sure that I was suffering from a manual or algorithmic penalty. I have seen very little benefit in being proactive with that tool (e.g. rankings are good, you spot bad links in your link profile and disavow them just to be safe) and, in fact, I have seen a number of cases where a disavow was submitted "prematurely": a site was ranking fine, disavowed some links, and then saw its rankings fall.
If we want to look at it from a slightly skeptical point of view: if you're not suffering from a Google penalty, do you really want to inform Google that you have suspicious links in your profile?
However, that is a matter of preference based on my own experience. I would certainly take note of the links you think are bad (and perhaps put together a disavow file ready to go, just in case).

It's worth noting that prweb.com has made all of its links nofollow anyway; since they're not passing on link equity, it doesn't seem logical to disavow them, as they have no SEO benefit to begin with.

Also keep in mind that if you visit the page and the link is not there, and especially if you do a Google search for cache:http://www.example.com and see that the cached version contains no link, there's a very good chance the link has already been discounted anyway and so would not be flagged in a manual or algorithmic check. Seeing as you have so many links from those domains, that may well be occurring.
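If you do put a file together ready to go, the format Google expects is a plain .txt file with one entry per line. A minimal sketch (the domain is the one discussed above; the single URL is a placeholder):

```text
# Disavow file sketch - lines starting with "#" are comments.
# Disavow every link from an entire domain:
domain:justia.com

# Or disavow one specific linking URL:
http://spam.example.com/some-page.html
```

You upload the file through the disavow links tool in Webmaster Tools; it applies per site and can be replaced with an updated file later.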
Hope this helps
-
Has Google notified you of the need to disavow links in Webmaster Tools? Usually, there's a message about unnatural links on the Manual Actions page.
I've never preemptively disavowed links. Maybe that's wrong. But then again, no single site is giving us 23,000 links.
Related Questions
-
Question about structuring @id schema tags
We are using JSON-LD to apply schema. My colleague had a question about applying @id tags in the schema parent lists. While implementing schema, we've included @id as a parameter on the "list" child of "ListItem" in a "BreadcrumbList"; on the same schema, we've also added an @id parameter to mainContentOfPage, and both @id parameters are set to the page's URL. Having this @id in both places is giving schema checker results in which the child elements of "mainContentOfPage" appear under the "list" item. Questions: Is this good or bad? Where should @id be used? What should @id be set to? Thanks for the insight!
Intermediate & Advanced SEO | RosemaryB
-
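For what it's worth on the breadcrumb question above: in JSON-LD, @id is a node identifier rather than a page property, and two nodes given the same @id are treated as the same node, which is exactly the kind of merging the schema checker result describes. A minimal sketch of the common convention of giving each node a distinct fragment identifier (URLs and fragment names here are placeholders, not the asker's actual markup):

```json
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "WebPage",
      "@id": "https://www.example.com/widgets/#webpage",
      "url": "https://www.example.com/widgets/",
      "breadcrumb": { "@id": "https://www.example.com/widgets/#breadcrumb" }
    },
    {
      "@type": "BreadcrumbList",
      "@id": "https://www.example.com/widgets/#breadcrumb",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
        { "@type": "ListItem", "position": 2, "name": "Widgets", "item": "https://www.example.com/widgets/" }
      ]
    }
  ]
}
```

Because "#webpage" and "#breadcrumb" differ, the two nodes stay distinct while the breadcrumb property still links them together.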
Hypothetical SEO Question
I am running a website for a law firm. It has been running for many, many years and has plenty of backlinks and authority. I then create a standalone website for a specific type of case that the law firm is handling. On that website, I have a page that copies some of the attorney bio text from the main website. How much of a negative impact will this standalone website have on the main website as far as duplicate content issues are concerned? Please explain your answer in detail. Thank you in advance.
Intermediate & Advanced SEO | goldbergweismancairo
-
4 questions about a paragraph of SEO-friendly text in my e-com website's header.
Hi guys, I'm trying to understand the SEO behind our website's header: www.mountainjade.co.nz. As you can see, we have a paragraph of relevant, SEO-friendly introductory text in our header. What I would like some help with is understanding how Google views and assigns 'juice' to information like this in the header or footer of a website. Usually certain pages have content specific to a given topic, and Google ranks those pages accordingly. But a website's header/footer content appears on every page, as the header is always at the top and the footer at the bottom.
1. In what way does my website benefit from the paragraph of text in the header? E.g. at the domain level? Just the home page?
2. How does Google assign 'juice' to the paragraph of text? (Similar to question 1.)
3. How would my website be affected if I moved the text to the footer (an aesthetic change)?
4. When I 'inspect element' on the paragraph, it is labelled 'div id=site description'. Can someone please explain the relevance of a site's description to SEO for me?
This paragraph of text was in the website's header before I came onboard, and I've been too concerned to change or move it as I don't know enough about it. Any help would be appreciated! Thanks team, Jake
Intermediate & Advanced SEO | Jacobsheehan
-
Wrong Website Showing Up On Knowledge Graph - Car Dealer SEO Question
Hi Everyone, I have a client who has two website platforms: one is mandated by the manufacturer, and the other is the one we use and is linked up to our Google Plus/Maps/etc. accounts. The manufacturer-mandated one is showing up in the Google Knowledge Graph, and this is not ideal for us. Unfortunately, we cannot get rid of that site because it is mandated. So how do we go about fixing this issue? I had a few ideas, and I'd like to know if they would work. If you can think of something that's outside the box, I'd appreciate it.
1. Put a rel=canonical across the website.
2. Remove all keywords that might trigger it to show up in the Knowledge Graph from the URL of the non-ideal site.
3. Go for a .net or .us domain. Do these kinds of domains have less authority, and are they less likely to show up in a Google search?
Thanks!
Intermediate & Advanced SEO | oomdomarketing
-
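On idea 1 in the question above: a cross-domain rel=canonical is only a hint to Google rather than a directive, but it is the standard markup for pointing a duplicate page at a preferred URL. A sketch, with placeholder URLs, of what would go in the head of each page on the mandated site:

```html
<!-- In the <head> of a page on the manufacturer-mandated site,
     pointing at the equivalent page on the preferred site.
     Both URLs are placeholders. -->
<link rel="canonical" href="https://www.preferred-site.example.com/matching-page/" />
```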
Simple Link Question
Hi Guys, I would appreciate it if you could answer one small question: will our site benefit from that link?
Valuable website related to our business ---nofollow link--> PDF doc (on second site) ---link to our site--->
Kind regards,
webdeal
Intermediate & Advanced SEO | Webdeal
-
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community’s advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl that content to confirm it was indeed removed (as opposed to just recrawling the site and not finding the content anywhere).

This really made lots of sense to me and also struck a personal chord. Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the steps below:
1. We cut the pages.
2. We set up permanent 301 redirects for all of them immediately.
3. At the same time, we would always remove from our site all links pointing to these pages (to make sure users didn’t stumble upon the removed pages).

When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way. I see that this is basically the exact opposite of Dr. Pete's advice and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda.

So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions:
1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there’s a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?

Thank you in advance for your help,
Eric
Intermediate & Advanced SEO | Eric_R
-
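For reference, the 410 approach discussed in the question above can be served on Apache with standard directives; the paths below are placeholders, not Eric's actual URLs:

```apache
# Serve "410 Gone" for a removed page so crawlers see an explicit
# removal rather than a redirect. Requires mod_alias.
Redirect gone /old-content-page.html

# The mod_rewrite equivalent, useful for whole removed sections:
RewriteEngine On
RewriteRule ^removed-section/ - [G,L]
```

Either form returns the 410 status directly on recrawl, which is the visibility Kerry22's process relies on.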
A few questions regarding WordPress and indexing/nofollow.
I'm using Yoast's WordPress SEO plugin on my WordPress site, which allows you to quickly set up nofollow/noindex on specific taxonomies. I wanted to see what you guys thought was the best practice in setting up my various taxonomies. Would you noindex (but follow) all of these, none of these, or just some of these: categories, tags, media, author archives, and date archives? (My blog is mainly a single-author blog (me), but my wife does sometimes write posts, so I didn't know how this affected everything. I could simply make the blog a single-user blog and just have her posts be guest posts, but I'd rather leave her as a user.) The example I read online only no-indexes the date archives. Just curious what you guys thought. Thanks.
Intermediate & Advanced SEO | NoahsDad
-
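Mechanically, the Yoast taxonomy settings described in the question above just add a robots meta tag to the affected archive pages; "noindex, follow" keeps a page out of the index while still letting the links on it be crawled. What the plugin emits looks roughly like this:

```html
<!-- Rendered in the <head> of, e.g., a date-archive page when
     "noindex, follow" is set for that archive type. -->
<meta name="robots" content="noindex,follow" />
```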
Questions about 301 Redirects
I have about 10-15 URLs that are redirecting to http://www.domainname.comwww.domainname.com/ (which is an invalid URL). The website is on a Joomla platform. Does anyone know how I can fix this? I can't figure out where the problem is coming from.
Intermediate & Advanced SEO | JohnParker2792
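A doubled host like www.domainname.comwww.domainname.com is often caused by a redirect target written without its http:// scheme, so the "target" gets treated as a relative path and appended to the current URL instead of replacing it. A hedged sketch of the pattern to look for in the site's .htaccess (the rule and paths here are hypothetical, and Joomla's own redirect component is worth checking for the same mistake):

```apache
# Broken: no scheme on the target, so it is handled as a relative
# path and glued onto the current URL rather than used as-is.
# RewriteRule ^old-page$ www.domainname.com/new-page [R=301,L]

# Fixed: always give redirect targets a full absolute URL.
RewriteEngine On
RewriteRule ^old-page$ http://www.domainname.com/new-page [R=301,L]
```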