Should I disavow links from pages that don't exist anymore?
-
Hi. I'm doing a backlink audit of two sites, one with 48k backlinks and the other with 2M. Both are very old sites, and both have tons of backlinks from old pages and websites that no longer exist, but these backlinks still show up in the Majestic Historic index. I cleaned up the obviously useless links and ran the rest through Screaming Frog to check whether those old pages/sites even exist.
Tons of link-sending pages return a 0, 301, 302, 307, 404, etc. status. Should I consider all of these pages bad backlinks and add them to the disavow file?
Just a clarification: I'm not talking about 301-ing a backlink to a new target page. I'm talking about the origin page returning an error when pinged, e.g. originpage.com/page-gone links to mysite.com/product1; Screaming Frog pings originpage.com/page-gone and gets back a status error. Do I add originpage.com/page-gone to the disavow file or not?
Hope I'm making sense.
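A minimal sketch of the triage step described above, assuming you've exported the Screaming Frog crawl as a mapping of origin URL to status code. The bucket names and example URLs are placeholders of mine, not a Google or Screaming Frog convention:

```python
# Triage backlink origin pages by the HTTP status their URL returned.
# Bucket names ("review"/"follow"/"dead") are illustrative, not standard.

def triage(status: int) -> str:
    """Bucket a backlink origin page by its crawl status code."""
    if status == 200:
        return "review"        # page still exists - judge its quality by hand
    if status in (301, 302, 307):
        return "follow"        # redirect - re-check the final destination
    return "dead"              # 0 (no response), 404, 410, 5xx, ...

# Example crawl results as {origin_url: status} - placeholder data
crawl = {
    "http://originpage.com/page-gone": 404,
    "http://jennifers.tempdomainname.com/post": 0,
    "http://directory.example/listing": 301,
    "http://client-partner.example/review": 200,
}

dead_origins = sorted(u for u, s in crawl.items() if triage(s) == "dead")
print(dead_origins)
```

The "dead" bucket is the set of pages this thread is asking about; whether they belong in the disavow file is the judgment call discussed below.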
-
Sounds like a plan. Thanks for your help bud, much appreciated.
-
My take: I'd just go ahead and start doing other things to improve its current rankings. I could assign someone to go over the links if another team member is available.
If I see improvements within the next month, that's already a good sign that you should continue and not worry about the dead links.
It takes Google a long time to actually forget about those links pointing to your site. So if they are dead AND you didn't notice any increases or drops in analytics, then they are pretty much ineffective and shouldn't be a major obstacle. I think someone coined a term for it, ghost links or something. LOL.
-
Hi. I did go through GA several years back, as far back as 2011, but didn't really see dramatic changes in traffic, just a general trend of low organic traffic throughout. Keep in mind that it's an engineering site, so no thousands of visits per day... the keywords that matter for the site get below 1,000 searches per month (data from the days when the Google Keyword Tool shared this info with us mortals).
That said, I do notice in roughly 60% of the links absolutely no regard for anchors: some are www.domain.com/index.php, some are the company name, some are "Visit Site", some are "Website", etc. Some anchors are entire generic sentences like "your company provided great service, your entire team should be commended blah blah blah". And there are tons of backlinks from http://jennifers.tempdomainname.com... a domain that's a weird animal, as there's not much data on who they are, what they do, or what the deal is with the domain name itself. Weird.
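To put rough numbers on that anchor breakdown, one could bucket the anchors mechanically. This is a sketch under my own assumptions: the generic-phrase list and the six-word threshold for "sentence" anchors are arbitrary choices, not an SEO standard:

```python
# Bucket anchor texts into the rough categories described above.
# GENERIC list and the 6-word "sentence" cutoff are assumptions.
from collections import Counter

GENERIC = {"visit site", "website", "click here", "company name", "here"}

def anchor_bucket(anchor: str) -> str:
    a = anchor.strip().lower()
    if a in GENERIC:
        return "generic-phrase"
    if a.startswith(("http://", "https://", "www.")):
        return "naked-url"
    if len(a.split()) > 6:
        return "sentence"      # e.g. "your company provided great service..."
    return "other"

anchors = [
    "www.domain.com/index.php",
    "Visit Site",
    "Website",
    "your company provided great service, your entire team should be commended",
]
print(Counter(anchor_bucket(a) for a in anchors))
```

Running this over the full Majestic export would show whether the "no regard for anchors" pattern really covers ~60% of the profile.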
In all honesty, nothing in WMT or GA suggests the site got hit by either Penguin or Panda... BUT, with a ton of links originating from non-existent pages, pages with no thematic proximity to the client's site, and anchors as generic as "Great Service"... is it better to err on the side of caution and get them disavowed, or to wait for a reason from Google and then do the link hygiene?
-
Hi Igor,
Seeing EzineArticles in there is definitely a red flag that tells you the profile probably has web directories, article networks, blog networks, Pligg sites, guestbooks, and other links from that era.
Maybe you can dig up some old analytics data and check when the traffic dropped.
If you didn't see any heavy anchor-text usage, then the site probably escaped a sitewide penalty; I'd assume it's just a few (or many, but not all) of the keywords that got hit. Either way, you'll need to clean up -> disavow the links if they are indeed like that. That's probably a reason for its low organic rankings.
That, and since it's old, it might have been affected by Panda too.
-
Thanks for your response. I'm about done cleaning up the link list in very broad strokes, eliminating obvious poor-quality links, so in a few hours I could have a big list for disavowing.
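For the final step, the disavow file itself has a simple documented format: one URL or `domain:` rule per line, with `#` starting a comment. A sketch of emitting it from the cleaned-up lists (the domains and URLs below are placeholders from this thread, not recommendations):

```python
# Write a disavow file in the format Google's disavow tool accepts:
# "#" comments, "domain:example.com" rules, and bare URLs, one per line.

def write_disavow(path, domains, urls):
    with open(path, "w", encoding="utf-8") as f:
        f.write("# Disavow list generated from Majestic Historic audit\n")
        for d in sorted(domains):
            f.write(f"domain:{d}\n")     # disavow every link from this domain
        for u in sorted(urls):
            f.write(f"{u}\n")            # disavow a single origin page

write_disavow(
    "disavow.txt",
    domains={"jennifers.tempdomainname.com"},
    urls={"http://originpage.com/page-gone"},
)
print(open("disavow.txt").read())
```

Using `domain:` rules for wholly spammy domains keeps the file short even with a 48k-link profile.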
The site is very specific, a mechanical engineering thing, and they sell technology and consulting to GM, GE, Intel, NASA... so backlinks from sites about rental properties and resorts do look shady... even if they return a 200 status.
But... how vigilant is Google now, with all the Penguin updates, about backlinks from non-related sites, given that my client's site has tons of them? And if Majestic reports them as having zero Trust Flow, is there any benefit to keeping them at all?
Thanks.
-
Hi. Thanks for responding. WMT actually shows just a fraction of the links: about a few thousand for the site that Majestic Historic reports at 48k. But I don't have any notifications of issues. I'm guessing that with all the Penguin updates, most sites won't get notifications, and it's up to us SEO guys to figure out why rankings are so low.
As for link quality, many do come from weird sites, and I've noticed EzineArticles too. The problem is that the 48k portfolio was built by non-SEO experts, and now, a few years after the fact, I'm stuck with a site that doesn't rank well and has no notifications in WMT. But can I take the lack of notifications as evidence that the site has no backlink problem, or should I read the problem into the poor organic rankings?
-
If I were in a similar situation I wouldn't really worry about it, but if it didn't take too much of my time, I would include all of these in the disavow file too.
But if the page is not returning a 200 status, this shouldn't really be a problem.
Hope this helps!
-
Hi Igor,
Do they still show up in Webmaster Tools? Do you have a penalty because of those links that used to point to the site? If not, I wouldn't really worry about it; just prioritize other things and make this a side task.
Are the majority of them on bad-looking domains? If you check the link URLs on archive.org and they were spammy links, then go ahead and include them in the disavow list.
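The archive.org check above can be semi-automated via the Wayback Machine's availability JSON API (`http://archive.org/wayback/available?url=...`). To keep this sketch self-contained, the live HTTP call is wrapped in a function and the parsing is demonstrated on a sample response of the documented shape; the snapshot values are placeholders:

```python
# Look up the closest archived snapshot of a dead origin page so its old
# content can be eyeballed for spam. Sample data below is a placeholder.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def wayback_lookup(url, timeout=10):
    """Return the closest archived snapshot URL for `url`, or None."""
    api = "http://archive.org/wayback/available?" + urlencode({"url": url})
    with urlopen(api, timeout=timeout) as resp:
        return extract_snapshot(json.load(resp))

def extract_snapshot(data):
    """Parse the availability API's JSON into a snapshot URL (or None)."""
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None

# Sample response in the API's documented shape (values are made up)
sample = {"archived_snapshots": {"closest": {
    "available": True,
    "timestamp": "20120101000000",
    "url": "http://web.archive.org/web/20120101000000/http://originpage.com/page-gone",
}}}
print(extract_snapshot(sample))
```

Pages with no snapshot at all can't be reviewed either way, which is another argument for simply disavowing anything that looks doubtful.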