Unnatural Links Removal - are GWMT links enough?
-
Hi,
When working on an unnatural links penalty, is removing and disavowing the links shown in GWMT enough, or should the list be broadened to include OSE, Majestic, etc.?
Thanks
-
Hi BeytzNet,
The answer to that question really depends on another question:
Are you looking for a short-term solution that may or may not get your current penalty lifted, or are you genuinely interested in dealing with links that really shouldn't be there?
If you're after the band-aid solution then you can try going with the arrogant suggestion from some Googlers that only links which offend Google at this point in time need to be dealt with. (Given Google's current propensity for adding to its list of what is "unnatural", their attitude borders on sadistic.)
If you really want to get some control over your backlink profile and future-proof your site in the face of changing spam targets, impending Penguin updates, and whatever else may be coming down the line, you might find it useful to try this little exercise:
Download backlink data from as many of the following as possible (free download limits for the tools you don't subscribe to will give you enough of a sample):
- Google Webmaster Tools
- Bing Webmaster Tools
- Open Site Explorer
- Majestic SEO
- ahrefs
- Raven Tools (pulls in data from Open Site Explorer & Majestic SEO)
Open each CSV, select all, and change the text color so that the data from each list is a different color.
Copy and paste the content of each into one Excel spreadsheet so that all of the URLs are in one list.
Deduplicate the list.
Check out the different-colored URLs left in your list... the takeaway is that every tool will bring you different link data. If you want a true picture of your backlink profile, you are now much closer to having it.
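If you'd rather script the merge than do it by hand in Excel, here is a rough sketch in Python/pandas. The file names and URL-column headers below are placeholders (each tool labels its export a little differently, so check the real headers first), and a source_tool column stands in for the text-color trick:

```python
import pandas as pd

# Map each export file to the column that holds the linking URL.
# File names and column headers are placeholders: each tool labels
# its CSV differently, so check the real headers before running this.
exports = {
    "gwt_latest_links.csv": "Links",
    "bing_inbound_links.csv": "Source Url",
    "ose_inbound_links.csv": "URL",
    "majestic_backlinks.csv": "SourceURL",
    "ahrefs_backlinks.csv": "Referring Page URL",
}

frames = []
for filename, url_column in exports.items():
    df = pd.read_csv(filename)
    frames.append(pd.DataFrame({
        "link_url": df[url_column].astype(str).str.strip(),
        "source_tool": filename,  # stands in for the text-color trick
    }))

combined = pd.concat(frames, ignore_index=True)

# Deduplicate on the linking URL, keeping the first tool that reported it.
deduped = combined.drop_duplicates(subset="link_url")

print(f"{len(combined)} rows before dedupe, {len(deduped)} unique linking URLs")
print(deduped["source_tool"].value_counts())  # unique links contributed per tool

deduped.to_csv("combined_backlinks.csv", index=False)
```

The per-tool counts at the end make the same point as the colors: each source contributes links the others miss.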
Incidentally, Google is not the only search engine to apply manual penalties. Others just don't talk about it as much as Google does. You might also find it helpful to read this post from Ryan Kent about identifying the source of your link penalty.
Hope that helps,
Sha
-
I don't usually worry about removing/disavowing those links. Google is concerned about the links that you (or an SEO working on your behalf) have personally made in order to increase PageRank.
It's pretty common to have a lot of them.
-
Thanks Marie,
Question -
Going through my link profile, I have encountered dozens of links from different SEO sites that analyzed my domain, whether on its own or showing it as a competitor to another site in the same niche. Weirdly, these are dofollow links (dozens!).
Should I disavow them?
Obviously these are not requested links of any kind. These sites are kind of aggregation sites that show practically any site worth mentioning.
-
Thanks Ben. This is the article I remember seeing.
-
That's great information and a great process.
-
Thanks Ben for that article. A few days ago I was searching for that and couldn't find it!
The vast majority of SEOs will tell you that you need to include links from as many sources as possible. However, John Mueller (a Google employee) recently said that in the majority of cases, focusing on the links in your WMT is enough. I could not find the thread where he said this, so I asked in WMF if someone could find it. Here is the thread.
In the past I have used a combo of links from WMT and also from ahrefs. However, for the current sites that I am working on I am just using WMT. If for some reason we do not get reconsidered then I will go back and add links from other sources.
I think the reason people say to get links from all sources is that historically WMT has only given you a sample of your links. But in the last few months, the "Download latest links" section has been giving a much larger number. Don't be fooled by the fact that it says "Latest links"; I have seen sites where this list included thousands of links going back as far as 2008.
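As a side note, whichever sources end up in your list, the file you upload to the disavow tool is just a plain UTF-8 text file with one entry per line. A minimal example of the format (all domains and URLs below are made up):

```text
# Lines starting with "#" are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example.com
domain:article-network.example.net

# Or disavow individual URLs:
http://blog.example.org/spun-widget-article/
http://forum.example.info/profile.php?id=12345
```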
-
According to Google Search Quality engineer Uli Lutz, you only need to include the links in GWMT. Here is an article with more information on that.