Dead links / URLs
-
What is the quickest way to get Google to clean up dead
links? I have 74,000 dead links reported back. Four months ago I added a robots.txt
disallow for them and submitted removal requests through Google Webmaster Tools.
The same dead links also show in Open Site Explorer. Thanks
-
Thank you, a 410 was what I needed.
-
I had three inexperienced developers who did not know how to write
the URLs, so they created many different versions. It is hard to work out which is
the right version. I also have a provider like BT: two years after leaving their
service, they still seem to index my old links, even though my domain is no longer
with them. They are all 404 pages. -
Can you offer clarification as to what you view as a "dead link"?
If you mean a link to a page on your site which no longer exists, then your options are as follows:
-
take John's advice and 301 the link to the most relevant page on your site
-
allow the link to 404. 404 errors are a natural part of the internet. You should be sure to have the most helpful 404 page possible: include your site's normal navigation, a search box, your most popular articles, etc.
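If the site runs on Apache, a helpful custom 404 page can be wired up with a single directive in .htaccess; the filename below is just a hypothetical example, not a path from the poster's site:

```apache
# Serve a custom, helpful 404 page (example path)
ErrorDocument 404 /helpful-404.html
```

The page at that path should be a normal HTML page on the site, so it inherits the site's navigation and search box.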
Blocking pages in robots.txt is usually a bad idea and is more likely to add to the problem than fix it. If you do not block the page in robots.txt, Google will crawl it, see the error, and remove it from their index. Because you have blocked the page, they can no longer see the error, so they will likely leave it in the index for longer.
Google normally removes pages submitted with the URL Removal Tool promptly, but the tool is designed to remove pages which could damage your site or others, for example, confidential information that was accidentally published. It is not designed to remove 74,000 pages that you simply decided to take off your website.
If these dead links are simply pages you removed from your site, I advise you to remove the robots.txt block and then decide whether to 301 them or allow them to 404. Google should then clean up the links within 30-60 days.
If you want to speed up the process as much as possible, there is one other step you can take: set a 410 (Gone) status code for the pages. When Google receives a 404 response, they are unsure whether the page is just temporarily unavailable, so they keep it in the index. If they receive a 410 response, you are telling Google the page is gone for good, and they can update their index faster.
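As a rough sketch on Apache, a 410 can be returned from .htaccess using mod_alias; the paths below are made-up examples, not the poster's actual URLs:

```apache
# Return 410 Gone for an individual removed page (example path)
Redirect gone /old-page.html

# Return 410 Gone for an entire removed section, matched by pattern
RedirectMatch gone ^/discontinued/.*$
```

`Redirect gone` is shorthand for a redirect with status 410 and no destination URL.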
-
-
Are the links 404 pages or just expired content?
-
The links are indexed. I need them properly cleaned up, not
redirected, and users/customers find it frustrating. -
You could try setting up a 301 redirect in .htaccess.
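For example, a minimal .htaccess sketch (the URLs here are hypothetical placeholders):

```apache
# Permanently redirect a single old URL to its replacement
Redirect 301 /old-page.html /new-page.html

# Or redirect a whole old section with mod_rewrite
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

The `R=301` flag makes the rewrite a permanent redirect, and `L` stops further rule processing for that request.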