Should I disavow links from pages that don't exist anymore?
-
Hi. I'm doing a backlink audit of two sites, one with 48k and the other with 2M backlinks. Both are very old sites, and both have tons of backlinks from old pages and websites that don't exist anymore, but these backlinks still appear in the Majestic Historic index. I cleaned up the obvious useless links and ran the rest through Screaming Frog to check whether those old pages/sites even exist.
There are tons of link-sending pages that return 0, 301, 302, 307, 404, etc. Should I treat all of these pages as bad backlinks and add them to the disavow file?
Just a clarification: I'm not talking about 301-ing a backlink to a new target page. I'm talking about the origin page returning an error when pinged, e.g. originpage.com/page-gone sends a link to mysite.com/product1; Screaming Frog pings originpage.com/page-gone and gets back an error status. Do I add originpage.com/page-gone to the disavow file or not?
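To be concrete, this is roughly the check I mean, as a minimal Python sketch (the URL and file names are placeholders; Screaming Frog is doing the real work for me):

```python
import requests

def origin_status(url: str) -> int:
    """HTTP status of a link-sending page; 0 if it can't be reached at all."""
    try:
        # allow_redirects=False so a 301/302 is reported as-is rather
        # than as the status of wherever the redirect ends up.
        return requests.head(url, allow_redirects=False, timeout=10).status_code
    except requests.RequestException:
        return 0

# Placeholder list; in practice this comes from the Majestic export.
origin_pages = ["http://originpage.com/page-gone"]
dead = [url for url in origin_pages if origin_status(url) != 200]

# Google's disavow file is plain text: one URL (or "domain:example.com")
# per line, with "#" for comments.
with open("disavow.txt", "w") as f:
    f.write("# pages that no longer resolve\n")
    f.writelines(url + "\n" for url in dead)
```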
Hope I'm making sense.
-
Sounds like a plan. Thanks for your help, bud, much appreciated.
-
My take: I'd just go ahead and start doing other things to improve its current rankings. I could assign someone to go over the links if another team member is available.
If I see improvements within the next month, then that's already a good sign that you should continue and not worry about the dead links.
It takes Google a long time to actually forget about those links pointing to your site. So if they are dead AND you didn't notice any increases or drops in analytics, then they are pretty much ineffective, so they shouldn't be a major obstacle. I think someone coined a term for it, "ghost links" or something. LOL.
-
Hi. I did go through GA from several years back, as far back as 2011, but didn't really see dramatic changes in traffic other than a general trend of low organic traffic throughout. Keep in mind that it's an engineering site, so there aren't thousands of visits per day... the keywords that are important for the site get below 1,000 searches per month (data from the days when the Google Keyword Tool shared this info with us mortals).
That said, in roughly 60% of the links I see absolutely no regard for anchors: some are www.domain.com/index.php, some are Company Name, some are Visit Site, some are Website, etc. Some anchors are entire generic sentences like "your company provided great service, your entire team should be commended blah blah blah". And there are tons of backlinks from http://jennifers.tempdomainname.com... a domain that's a weird animal, as there's not much data on who they are, what they do, and what the deal is with the domain name itself. Weird.
In all honesty, nothing in WMT or GA suggests that the site got hit by either Penguin or Panda... BUT, with a ton of links that originate from non-existent pages, pages with no thematic proximity to the client's site, and anchors as generic as "Great Service"... is it better to err on the side of caution and disavow them, or wait for a reason from Google and then do the link hygiene?
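For anyone curious, that anchor distribution is easy to eyeball with a quick tally like this (a Python sketch; the column name is a placeholder for whatever the Majestic export actually uses):

```python
import csv
from collections import Counter

anchors = Counter()
with open("majestic_backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # "AnchorText" is a placeholder; match it to the export's header row.
        anchors[row["AnchorText"].strip().lower()] += 1

# The generic ones ("visit site", "website", bare URLs) float to the top.
for anchor, count in anchors.most_common(20):
    print(f"{count:>6}  {anchor}")
```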
-
Hi Igor,
Seeing ezinearticles in there is definitely a red flag that tells you the site probably has web directories, article networks, blog networks, Pligg sites, guestbooks, and other links from that era.
Maybe you can dig up some old analytics data and check when the traffic dropped.
If you did not see any heavy anchor text usage, then the site must've gotten away without a sitewide penalty; I would assume it's just a few (or many, but not all) of the keywords that got hit. Either way, you'll need to clean up -> disavow the links if they are indeed like that. So that's probably a reason for its low organic rankings.
That, and since it's old, it might have been affected by Panda too.
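If you do dig up that old analytics data, cross-checking drop dates against the rollout dates can be made mechanical with something like this (a sketch only; dates as commonly reported, and you'd want to extend the dict with the Panda refreshes):

```python
from datetime import date

# Penguin rollout dates as commonly reported (Panda refreshed far more often).
UPDATES = {
    "Penguin 1.0": date(2012, 4, 24),
    "Penguin 1.1": date(2012, 5, 25),
    "Penguin #3":  date(2012, 10, 5),
    "Penguin 2.0": date(2013, 5, 22),
    "Penguin 2.1": date(2013, 10, 4),
}

def updates_near(drop: date, window_days: int = 14) -> list[str]:
    """Name any update that rolled out within window_days of a traffic drop."""
    return [name for name, rolled in UPDATES.items()
            if abs((drop - rolled).days) <= window_days]

print(updates_near(date(2012, 5, 1)))  # ['Penguin 1.0']
```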
-
Thanks for your response. I'm about done with cleaning up the link list in very broad strokes, eliminating the obviously poor-quality links, so in a few hours I could have a big list for disavowing.
The site is very specific, a mechanical engineering thing, and they sell technology and consulting to GM, GE, Intel, NASA... so backlinks from sites for rental properties and resorts do look shady... even if they do return a 200 status.
But... with all the Penguin updates, how vigilant is Google now about backlinks from unrelated sites? My client's site has tons of them. And if Majestic reports them as having zero Trust Flow, is there any benefit to having them at all?
Thanks.
-
Hi. Thanks for responding. WMT actually shows just a fraction of the links: about a few thousand for the site where Majestic Historic reports 48k. But I don't have any notifications of issues. I'm guessing that with all the Penguin updates most sites won't get any notifications, and it's up to us SEO guys to figure out why rankings are so low.
As for the quality of the links, many do come from weird sites, and I've noticed ezinearticles too. The problem is that the 48k portfolio was built by non-SEO experts, and now, a few years after the fact, I'm stuck with a site that doesn't rank well and has no notifications in WMT. But can I take the lack of notifications as evidence that the site has no backlink problem, or should I read the problem into the poor organic rankings?
-
If I were in a similar situation, I would not really worry about it, but if it didn't take too much of my time, I would include all of these in the disavow file too.
But if the page is not returning a 200 status, this shouldn't really be a problem anyway.
Hope this helps!
-
Hi Igor,
Do they still show up in Webmaster Tools? Do you have a penalty because of those links that used to point to the site? If not, then I wouldn't really worry about it; just prioritize other things and make this a side task.
Are the majority of them on bad-looking domains? If you check the link URLs on archive.org, were they spammy links? If so, go ahead and include them in the disavow list.
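If there are too many links to check by hand on archive.org, the Wayback Machine exposes a small availability endpoint you can script against. A rough sketch using the requests library:

```python
import requests

def wayback_snapshot(url: str) -> str | None:
    """Return the closest archived snapshot URL for a page, or None."""
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": url},
        timeout=10,
    )
    closest = resp.json().get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None

print(wayback_snapshot("http://jennifers.tempdomainname.com"))
```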
Related Questions
-
Best redirect destination for 18k highly-linked pages
Technical SEO question regarding redirects; I appreciate any insights on the best way to handle this.
Situation: We're decommissioning several major content sections on a website, comprising ~18k webpages. This is a well-established site (10+ years), and many of the pages within these sections have high-quality inbound links from .orgs and .edus.
Challenge: We're trying to determine the best place to redirect these 18k pages. For user experience, we believe the best option is the homepage, which has a statement about the changes to the site and links to the most important remaining sections. It's also the most important page on the site, so the boost from 301-redirected links doesn't seem bad. However, someone on our team is concerned that that many new redirected pages and links pointing to our homepage will trigger a negative SEO flag for the homepage, and recommends instead that they all go to our custom 404 page (which also includes links to important remaining sections).
What's the right approach here to preserve the remaining SEO value of these soon-to-be-redirected pages without triggering Google penalties?
-
What's the best way to test an Angular JS-heavy page for SEO?
Hi Moz community, Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes links, product descriptions, and things like breadcrumbs in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Universal, but they said the lift was too great, so we're testing to see if this works.
I've read a lot of the articles in this guide to all things SEO and JS and am fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed: https://sitebulb.com/resources/guides/javascript-seo-resources/
However, I'm not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl the pages using Screaming Frog, but that's generally regarded as what a crawler should be able to crawl, and not really what Googlebot will actually be able to crawl and index. Any thoughts on this; is this concern valid? Thanks!
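One option I'm considering is scripting the raw-versus-rendered comparison myself. Here's a rough sketch (Playwright is just my assumption for the rendering side, the URL is a placeholder, and the login step is omitted):

```python
import requests
from playwright.sync_api import sync_playwright

URL = "https://qa.example.com/product/123"  # placeholder QA URL

# What a non-rendering fetch sees.
raw_html = requests.get(URL, timeout=10).text

# What a rendering client sees after the Angular app boots.
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    rendered_links = page.eval_on_selector_all(
        "a[href]", "els => els.map(el => el.href)")
    browser.close()

# Crude substring check; relative hrefs in the raw HTML will show up
# as false positives, so treat this as a first pass only.
js_only = [link for link in rendered_links if link not in raw_html]
print(f"{len(js_only)} of {len(rendered_links)} links exist only after rendering")
```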
-
Hundreds of 404 errors are showing up for pages that never existed
For our site, Google is suddenly reporting hundreds of 404 errors, but the pages it is reporting never existed. The links Google shows are clearly spam-style, but the website hasn't been hacked. This happened a few weeks ago, and after a couple of days they disappeared from WMT. What's the deal?
-
301 Redirect non-existent pages
Hi, I have hundreds of URLs appearing in Search Console, for example: ?p=1_1. These go on up to 5_200, etc. I have tried .htaccess, and mod_rewrite is on, as I can redirect directories to the root, i.e. RewriteRule ^web_example(.*)$ /$1 [R=301,N,L]. However, I have tried all kinds of variations to redirect ?p=, and either it doesn't work at all or it crashes the website. Can anyone point me in the right direction to fix this?
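For reference, a sketch of the kind of rule this would need (assuming the parameter appears on the root URL; note that RewriteRule patterns never see the query string, so it has to be matched with RewriteCond):

```apache
# Match ?p=<digits>_<digits> on the root URL and 301 it away; the
# trailing "?" on the target strips the query string from the redirect.
RewriteCond %{QUERY_STRING} ^p=\d+_\d+$
RewriteRule ^$ /? [R=301,L]
```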
-
Remove page with PA of 69 and 300 root domain links?
Hi, We have a few pages within our website which were at one time a focus for us, but due to developing the other areas of the website they are now defunct (better content elsewhere) and in some ways slightly duplicated, so we're merging two areas into one.
We have removed the links to the main hub page from our navigation and were going to 301 this main page to the main hub page of the section which replaces it. However, I've just noticed the page due to be removed has a PA of 69 and 15,000 incoming links from 300 root domains. So not bad! It's actually stronger than the page we are 301'ing it to (but swapping them isn't really an option, as the URL structure would look messy).
With this in mind, is the strategy to redirect still the best, or should we keep the page and turn it into a landing page with links off to the other section? It just feels as though we would be doing this just for the sake of Google; I'm not sure how much decent content we could put on it, as we've already done that on the destination page. The incoming links to that page will still be relevant to the new section (they are both very similar, hence the merging). Any suggestions welcome, thanks.
-
Can I use a 410'd page again at a later time?
I have old pages on my site that I want to 410 so they are totally removed, but if, down the road, I want to use those URLs again, can I just remove the 410 status code, put new content on the page, and have it indexed again?
-
Too Many On-Page Links - caused by a drop-down menu
Many of the e-commerce sites we build for customers have drop-down menus to help users easily find products without having to click. Example: http://www.customandcommercial.com/. But this then causes the report to trigger the "too many on-page links" warning. We do have a sitemap and a Google sitemap. So should I put code in place not to follow the drop-down menu link items, or leave it in place?
-
Adding 'NoIndex Meta' to Prestashop Module & Search pages.
Hi, Looking for a fix for the PrestaShop platform. I'm looking for the definitive answer on how best to stop the indexing of PrestaShop modules such as "send to a friend" and "Best Sellers", as well as site search pages. We want to be able to add a meta noindex to pages ending in: /search?tag=ball&p=15 or /modules/sendtoafriend/sendtoafriend-form.php
We already have this in robots.txt:
Disallow: /search.php
Disallow: /modules/
(Google seems to ignore these.) But as a further tool we would like to include a noindex on all of these pages too, to stop duplicated pages. I assume this needs to go in either the head.tpl or the .php file of each PrestaShop module? Or is there a general site-wide fix to apply a noindex meta tag to certain files?
Current meta code here: Please reply with where to add code and what the code should be. Thanks in advance.