Dead links / URLs
-
What is the quickest way to get Google to clean up dead
links? I have 74,000 dead links reported back. I added a robots.txt to
disallow them and submitted a removal request in Google Webmaster Tools 4 months ago.
The same dead links also show in Open Site Explorer. Thanks
-
Thank you, a 410 was what I needed.
-
I had 3 inexperienced developers who did not know how to write
the URLs, so they created many different versions. It is hard to work out which is
the right version. I also have a provider like BT: after leaving their service 2
years ago, they still seem to index my old links even though my domain is no longer
with them. They all return 404 pages. -
Can you offer clarification as to what you view as a "dead link"?
If you mean a link to a page on your site which no longer exists, then your options are as follows:
-
take John's advice and 301 the link to the most relevant page on your site
-
allow the link to 404. 404 errors are a natural part of the internet. You should be sure to have the most helpful 404 page possible. Include your site's normal navigation, a search box, most popular articles, etc.
Blocking pages in robots.txt is usually a bad idea and is more likely to add to the problem rather than fix it. If you do not block a page in robots.txt, Google will crawl it, see the error, and remove it from their index. Because you blocked the pages, Google can no longer crawl them, so they will likely stay in the index longer.
Google normally promptly removes web pages submitted with the URL Removal Tool. The tool is designed to remove pages which can damage your site or others. For example, if confidential information was accidentally published. It is not designed to be used to remove 74k pages because you decided to remove them from your website.
If these dead links are simply pages you removed from your site, I advise you to remove the robots.txt block and then decide to 301 them or allow them to 404. Google should then clean up the links within 30-60 days.
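Whichever option you choose, it is worth confirming what status code the server actually returns for a few of the dead URLs before waiting on Google. A minimal sketch using only Python's standard library (the example URL is hypothetical; swap in your own). Note that urllib follows redirects by default, so a small handler override is needed to see the 301 itself rather than the page it leads to:

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Don't follow redirects: returning None makes urllib surface
    # the 3xx response as an HTTPError so we can read its code.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

def status_of(url):
    """Return the raw HTTP status code (200, 301, 404, 410, ...) for a URL."""
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        # urllib raises for 3xx (with NoRedirect) and 4xx/5xx;
        # the code on the exception is the status we want.
        return e.code

# Hypothetical example:
# print(status_of("https://www.example.com/removed-page"))
```

If a URL you expect to 404 reports 200 (a "soft 404"), Google will keep it indexed, so this check catches the most common misconfiguration.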
If you want to speed up the process as much as possible, there is one other step you can take. Set a 410 (Gone) code for the pages. When Google receives a 404 response, they are unsure if the page is simply temporarily unavailable so they keep it in the index. If they receive a 410 response, you are telling Google the page is GONE and they can update their index faster.
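One way to send the 410, assuming an Apache server with .htaccess enabled (the paths here are hypothetical; adjust them to your removed pages):

```apache
# Return 410 Gone for individual removed pages (hypothetical paths)
Redirect gone /discontinued-product.html
Redirect gone /old-category/

# Or mark a whole pattern of dead URLs as gone with mod_rewrite;
# the [G] flag answers 410 Gone
RewriteEngine On
RewriteRule ^old-news/.*$ - [G,L]
```

After adding the rules, request one of the URLs with `curl -I` and confirm the status line reads `410 Gone` before relying on it.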
-
-
Are the links 404 pages or just expired content?
-
The links are indexed. I need them really cleaned up, not
redirected, and users/customers find it frustrating. -
You could try setting up a 301 redirect in htaccess.
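For example, assuming an Apache server and hypothetical URLs, a 301 in .htaccess might look like:

```apache
# Permanently redirect a single moved page (hypothetical paths)
Redirect 301 /old-page.html /new-page.html

# Or redirect an entire renamed section to its replacement with mod_rewrite
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

Point each redirect at the single most relevant live page rather than the homepage, since bulk redirects to the homepage can themselves be treated as soft 404s.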