Clean-up of links: what should we get rid of?
-
We have been cleaning up our back office and preparing our .com domain to take all our future traffic, and we have gotten into a debate about how far to go in cleaning up old links.
We have never had a penalty on the site as far as we know, but the site was once taken offline by Google back in March this year because they thought it was a malware site. They put it straight back up and running within 5 hours, but it was very strange, as it is an Amazon Webstore retail site.
We are not sure why Google thought this, so just in case, we have been combing through our historical links and have now started to disavow any links we cannot get removed manually. So far it has been just a couple of sites that have no relevance to our retail business.
However, the debate we have been having is around directory listings: should we get rid of these too? Our gut reaction is yes, based on the need for quality, relevant links for the end user, but some directories are passing proper links to relevant sections of our site, albeit in a directory format. DMOZ comes to mind.
Any thoughts?
Bruce.
-
Good points.
Following on from Matt Cutts's recent comments about directories, we get loads of links which may or may not really be from directories. As these are not top links, we have removed some that we can see are not related to our business at all, but some are in a grey area: the directory has good trust and authority, but the links are somewhat generic.
We have been working on removing links for a month or so, and we have seen our global rank drop like a stone, as unlinking is not a good sign wherever it comes from, but we know this will recover.
The balance in this debate is how far you go: strip back to purely relevant links, or allow a few diluted ones to remain.
-
Since you have brought up the question, I would say that when deciding what is best for your site, always think about how it benefits your visitors. The focus should always be on content. Is the backlink in question relevant at all to what you are doing as a business? If the answer is yes, ask yourself whether the site linking to you is of any value: how are its Trust Flow and PageRank? I hope this helps.
Malcom
Related Questions
-
Link reclamation: What happens when backlinks point to a page other than the most relevant page? Any risks?
Hi all, We have started the link reclamation process, as we failed to redirect our old website links to the newly created pages. Unfortunately, most of the backlinks point to a page which already has lots of backlinks. I'm wondering if I can redirect the old pages to pages other than the actual related pages they should point to, so that one page doesn't take all the backlinks. And what happens if Google finds that a backlink points to a different page than the actual related page? Thanks
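For anyone in a similar situation, the usual approach is a per-URL 301 redirect that sends each old URL to its closest new equivalent, rather than funnelling everything to one page. A minimal sketch in Apache .htaccess form, with purely illustrative paths:

```apache
# Map each old URL to its closest new equivalent (paths are hypothetical)
Redirect 301 /old-widgets.html /products/widgets/
Redirect 301 /old-blue-widgets.html /products/widgets/blue/

# Avoid sending everything to the homepage or one hub page just to
# concentrate backlinks, e.g.:
# Redirect 301 /old-blue-widgets.html /
```

Redirecting to the most relevant page keeps the user experience intact; a redirect chosen only to spread link equity risks being treated as irrelevant (a soft 404) rather than passing value.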
-
Meta robots on every page rather than robots.txt for blocking crawlers? How do pages get indexed if we block crawlers?
Hi all, The suggestion to use the meta robots tag rather than the robots.txt file is to make sure pages do not get indexed if hyperlinks to them are available anywhere on the internet. I don't understand how the pages would be indexed if the entire site is blocked. Even though links to the pages are available, will Google really index those pages? One of our sites is blocked via the robots.txt file, but internal links have been available on the internet for years and have not been indexed. So technically the robots.txt file is quite enough, right? Please clarify and guide me if I'm wrong. Thanks
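To illustrate the distinction this question turns on: robots.txt blocks crawling, while the meta robots tag blocks indexing. A URL blocked in robots.txt but linked from elsewhere can still show up in results as a bare URL listing, because Google never fetches the page and so never sees a noindex directive. Illustrative snippets:

```
# robots.txt — stops compliant crawlers from fetching these pages,
# but an externally linked URL can still be indexed (without a snippet)
User-agent: *
Disallow: /private/
```

```html
<!-- meta robots — the page must be crawlable for this to take effect;
     once Google fetches it, the page is kept out of the index -->
<meta name="robots" content="noindex, follow">
```

This is why the two should not be combined for the same page: blocking a page in robots.txt prevents Google from ever seeing its noindex tag.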
-
38% of SEOs never disavow links: Are you one of them, or among the other 62%?
Hi all, Link disavowal is such an advanced task in SEO, with a decent amount of risk involved. I thought most people wouldn't use this method, as Google has been saying that they try to ignore bad links, that there is no penalty for such links, and that negative SEO is really a rare case. But I was surprised to see that only 38% of SEOs have never used this method, while the other 62% disavow links monthly, quarterly, or yearly. I just wonder: do we still need to disavow links? It's very easy to say to disavow a link which is not good, but difficult to conclude whether such links are hurting already, or whether we will get hurt once they have been disavowed. Thanks
-
Excessive internal links. Should I remove the footer links?
Hi guys, I have an ecommerce site selling eco-friendly items online. I ran some on-page optimisation reports from SEOmoz PRO and discovered that I have at least 120 internal links per page. 32 of these are in the footer, designed in part to aid user navigation but perhaps also to have a positive impact on SERPs and SEO in general for the ecommerce site. Will removing these links be beneficial to my search engine rankings, since I will then have fewer than 100 internal links per page? Or is it a major change which may be dangerous for my site's rankings? Please help, as I'm not sure about this! I've attached an image of the footer links below. I won't be removing the Facebook/Twitter links, just the 3 columns on the left. Thank you, Pravin
-
Why am I getting different Google SERP results for the same keywords?
Hi Mozzers, I have noticed recently that Google (.com.au) has been serving up different SERP results for the same keywords. For example, one of our main keywords is "car loan". One result will show our site ranking #5 organically out of 242,000,000 results. A refresh of this search will then show our site not ranking at all, out of 133,000,000 results. We have only noticed this happening in the last few days, and, more frustratingly, Google is serving the SERP from 133,000,000 results more frequently. Would anyone know why this is occurring? And what can we do, if anything, to ensure we are shown regardless of how many results Google returns? Is it from a recent algorithm update, and will it settle down over time? Any help would be greatly appreciated. (Just to add: I'm not logged in to Google when completing this test, and I regularly clear cookies etc., so I don't believe it's a personalised search issue.)
-
Infographic links could get discounted in the future
Hey guys, I read this article this morning on SEL, and I'm not sure what to think about it. Matt did have a point that a lot of infographics are of bad quality (sometimes even with wrong information), and hence don't deserve to gain links. But how could Google possibly know whether an infographic itself is of high quality or not? http://searchengineland.com/cutts-infographic-links-might-get-discounted-in-the-future-127192
-
Are you guys finding more nofollows being counted as links?
Wondering if anyone is finding more and more nofollow links actually being counted as links.
-
How to get bullet snippets in SERPs
http://insidesearch.blogspot.com/2011/08/new-snippets-for-list-pages.html I read this post and have been seeing a lot of results with this feature, but I can't figure out exactly why some results get them and others don't. Of the results I've seen, many have information in lists or tables (as the article suggests), but some simply have their information listed in separate divs. Does anyone have any further insight on this? The above article is the only one I can find on the subject.
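The Google post linked above ties these snippets to structured list and table markup. A minimal sketch of the kind of semantic markup that tends to qualify, with illustrative content (as opposed to items sitting in unrelated divs, which give the parser no machine-readable list structure):

```html
<!-- A heading immediately followed by a real list element gives Google
     both a label ("Top 5 ...") and a countable set of items -->
<h2>Top 5 Eco-Friendly Materials</h2>
<ol>
  <li>Bamboo</li>
  <li>Recycled glass</li>
  <li>Organic cotton</li>
  <li>Cork</li>
  <li>Hemp</li>
</ol>
```

Semantic `<ol>`, `<ul>`, and `<table>` elements declare the list structure explicitly; a series of styled `<div>`s may look identical to a reader but leaves the structure ambiguous to a crawler, which may explain why some visually similar pages don't get the snippet.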