Dead links / URLs
-
What is the quickest way to get Google to clean up dead
links? I have 74,000 dead links reported back. I added a robots.txt
disallow and submitted removal requests through Google Webmaster Tools 4 months ago.
The same dead links also show in Open Site Explorer. Thanks
-
Thank you, 410 was what I needed.
-
I had 3 inexperienced developers who did not know how to write
the URLs, so they created many different versions. It is hard to work out which is
the right version. I also have a provider like BT: after leaving their service 2
years ago, they still seem to index my old links even though my domain is no longer
with them. They are all 404 pages. -
Can you offer clarification as to what you view as a "dead link"?
If you mean a link to a page on your site which no longer exists, then your options are as follows:
-
take John's advice and 301 the link to the most relevant page on your site
-
allow the link to 404. 404 errors are a natural part of the internet. You should be sure to have the most helpful 404 page possible: include your site's normal navigation, a search box, your most popular articles, etc.
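For reference, on an Apache server a custom 404 page can be wired up with a one-line directive (the /404.html path here is just an example; adjust it to wherever your error page lives):

```apache
# Serve a custom, helpful error page whenever a URL 404s
ErrorDocument 404 /404.html
```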
Blocking pages in robots.txt is usually a bad idea and is more likely to add to the problem rather than fix it. If you do not block the page in robots.txt, Google will crawl the page, see the error, and remove it from their index. Because you blocked the page, Google can no longer crawl it, so they will likely leave it in the index for longer.
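To illustrate, as long as a robots.txt rule like this is in place (the /old-section/ path is hypothetical; yours will differ), Googlebot never re-crawls the dead URLs and so never sees the 404 or 410 response:

```
User-agent: *
Disallow: /old-section/
```

Removing the Disallow line lets Google re-crawl those URLs and confirm they are gone.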
Google normally promptly removes web pages submitted with the URL Removal Tool. The tool is designed to remove pages which can damage your site or others. For example, if confidential information was accidentally published. It is not designed to be used to remove 74k pages because you decided to remove them from your website.
If these dead links are simply pages you removed from your site, I advise you to remove the robots.txt block and then decide to 301 them or allow them to 404. Google should then clean up the links within 30-60 days.
If you want to speed up the process as much as possible, there is one other step you can take: set a 410 (Gone) code for the pages. When Google receives a 404 response, they are unsure whether the page is simply temporarily unavailable, so they keep it in the index. If they receive a 410 response, you are telling Google the page is GONE, and they can update their index faster.
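On Apache, for example, a 410 can be returned with mod_alias or mod_rewrite (the paths below are hypothetical placeholders for your own dead URLs):

```apache
# mod_alias: tell crawlers this URL is permanently gone (HTTP 410)
Redirect gone /old-page.html

# mod_rewrite: mark a whole removed section as gone with the [G] flag
RewriteEngine On
RewriteRule ^old-section/ - [G,L]
```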
-
-
Are the links 404 pages or just expired content?
-
The links are indexed. I need them properly cleaned up, not
redirected, and users/customers find it frustrating -
You could try setting up a 301 redirect in .htaccess.
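A minimal .htaccess sketch for Apache (the old and new paths here are made-up examples; swap in your own URLs):

```apache
# One-off 301 from a dead URL to its closest replacement (mod_alias)
Redirect 301 /old-page.html /new-page.html
```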