Dead links / URLs
-
What is the quickest way to get Google to clean up dead links? I have 74,000 dead links reported back. I added a robots.txt rule to disallow them and submitted a removal request in Google Webmaster Tools four months ago. The same dead links also show up in Open Site Explorer. Thanks
-
Thank you, the 410 was what I needed.
-
I had three inexperienced developers who did not know how to write the URLs, so they created many different versions. It is hard to work out which is the right version. I also have a provider like BT: two years after leaving their service, they still seem to index my old links even though my domain is no longer with them. They are all 404 pages. -
Can you offer clarification as to what you view as a "dead link"?
If you mean a link to a page on your site which no longer exists, then your options are as follows:
-
take John's advice and 301 the link to the most relevant page on your site
-
allow the link to 404. 404 errors are a natural part of the internet. You should be sure to have the most helpful 404 page possible. Include your site's normal navigation, a search box, most popular articles, etc.
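If the site runs on Apache, serving that custom 404 page is typically a one-line directive in .htaccess; the file path below is a hypothetical placeholder, not one from the asker's site:

    # Serve a custom, helpful error page whenever a URL returns 404
    ErrorDocument 404 /custom-404.html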
Blocking pages in robots.txt is usually a bad idea and is more likely to add to the problem rather than fix it. If you do not block the page in robots.txt, Google will crawl the page, see the error, and remove it from their index. Because you block the page, they can no longer view it, so they will likely leave it in the index for longer.
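For reference, a minimal sketch of the kind of robots.txt rule that causes this (the path is a hypothetical placeholder):

    # Blocking removed URLs like this stops Googlebot from re-crawling them,
    # so it never sees the 404/410 response and keeps them in the index longer
    User-agent: *
    Disallow: /removed-section/

Deleting that Disallow line is what lets Googlebot re-crawl the dead URLs and drop them from its index.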
Google normally removes web pages submitted with the URL Removal Tool promptly. The tool is designed to remove pages which can damage your site or others, for example if confidential information was accidentally published. It is not designed to remove 74,000 pages simply because you decided to remove them from your website.
If these dead links are simply pages you removed from your site, I advise you to remove the robots.txt block and then decide whether to 301 them or allow them to 404. Google should then clean up the links within 30-60 days.
If you want to speed up the process as much as possible, there is one other step you can take. Set a 410 (Gone) code for the pages. When Google receives a 404 response, they are unsure if the page is simply temporarily unavailable so they keep it in the index. If they receive a 410 response, you are telling Google the page is GONE and they can update their index faster.
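On Apache you can return the 410 with a couple of lines in .htaccess; this is only a sketch, and the paths are hypothetical placeholders for your own removed URLs:

    # Return 410 Gone for individual removed pages (mod_alias)
    Redirect gone /old-page.html
    Redirect gone /discontinued-category/

    # Or mark an entire removed section as Gone (mod_rewrite)
    RewriteEngine On
    RewriteRule ^old-section/ - [G,L]

Either way, the next time Googlebot requests those URLs it gets an explicit 410 instead of a 404.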
-
-
Are the links 404 pages or just expired content?
-
The links are indexed; I need them properly cleaned up, not redirected, and users/customers find it frustrating. -
You could try setting up a 301 redirect in htaccess.
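On Apache, a 301 in .htaccess can be as simple as the following; the paths are hypothetical examples only:

    # Permanently redirect a single removed URL to its closest replacement (mod_alias)
    Redirect 301 /old-page.html /new-page.html

    # Or map an entire old directory onto a new one (mod_rewrite)
    RewriteEngine On
    RewriteRule ^old-directory/(.*)$ /new-directory/$1 [R=301,L]

Only redirect dead URLs to genuinely relevant pages; mass-redirecting thousands of them to the homepage is often treated by Google as a soft 404.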