Dead links / URLs
-
What is the quickest way to get Google to clean up dead
links? I have 74,000 dead links reported back. I added a robots.txt
disallow and submitted removal requests through Google Webmaster Tools 4 months ago.
The same dead links also show in Open Site Explorer. Thanks
-
Thank you, the 410 was what I needed.
-
I had 3 inexperienced developers who did not know how to write
the URLs, so they created many different versions. It is hard to work out which is
the right version. I also have a provider like BT: after leaving their service 2
years ago, they still seem to index my old links even though my domain is no longer
with them. They all return 404 pages. -
Can you offer clarification as to what you view as a "dead link"?
If you mean a link to a page on your site which no longer exists, then your options are as follows:
-
take John's advice and 301 the link to the most relevant page on your site
-
allow the link to 404. 404 errors are a natural part of the internet. You should be sure to have the most helpful 404 page possible. Include your site's normal navigation, a search box, most popular articles, etc.
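If you're on Apache, a minimal sketch of wiring up a custom 404 page from .htaccess could look like the line below (the /errors/404.html path is a placeholder; point it at your own error page, which should carry your site's normal navigation and a search box):

    # Serve a custom, helpful 404 page (placeholder path)
    ErrorDocument 404 /errors/404.html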
Blocking pages in robots.txt is usually a bad idea and is more likely to add to the problem rather than fix it. If you do not block a page in robots.txt, Google will crawl it, see the error, and remove it from their index. Because you have blocked the pages, Google can no longer crawl them, so they will likely stay in the index longer.
Google normally removes web pages submitted with the URL Removal Tool promptly. The tool is designed to remove pages that could damage your site or others, for example if confidential information was accidentally published. It is not designed to remove 74k pages that you simply decided to take off your website.
If these dead links are simply pages you removed from your site, I advise you to remove the robots.txt block and then either 301 them or allow them to 404. Google should then clean up the links within 30-60 days.
If you want to speed up the process as much as possible, there is one other step you can take: set a 410 (Gone) status code for the pages. When Google receives a 404 response, they are unsure whether the page is just temporarily unavailable, so they keep it in the index. If they receive a 410 response, you are telling Google the page is GONE, and they can update their index faster.
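As a rough sketch, assuming an Apache server, a 410 can be set in .htaccess with either mod_alias or mod_rewrite; the /removed-page.html and old-section/ paths below are placeholders, not your actual URLs:

    # Return 410 Gone for a single removed page (mod_alias)
    Redirect gone /removed-page.html

    # Return 410 Gone for a whole removed section (mod_rewrite)
    RewriteEngine On
    RewriteRule ^old-section/ - [G,L]

Note that with mod_alias the "gone" status takes no target URL, while the [G] flag in mod_rewrite returns 410 for anything matching the pattern.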
-
Are the links 404 pages or just expired content?
-
The links are indexed. I need them really cleaned up, not
redirected, and users/customers find it frustrating -
You could try setting up a 301 redirect in .htaccess.
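As a minimal example, assuming Apache with mod_alias enabled, a single rule redirects an old URL to its replacement (both paths here are placeholders):

    # Permanently (301) redirect an old URL to the new one
    Redirect 301 /old-page.html /new-page.html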