Dealing with broken internal links/404s. What's best practice?
-
I've just started working on a website that has accumulated lots (hundreds) of broken internal links. Essentially, specific pages have been removed over time and nobody has been keeping track of which internal links were affected. Most of these are links embedded in content that wasn't updated after the pages were deleted.
What's my best way to approach fixing these broken links?
My plan is currently to redirect where appropriate (from a specific service page that no longer exists to the overall service category, maybe?), but there are lots of pages that don't have a similar or equivalent page. I presume I'll need to go through the content, removing the links or replacing them where possible.
For example: a specific staff member who no longer works there is linked to from a category page. Should I redirect from the old staff member's page and update the anchor text, or just replace the whole link to point to the right person?
In most cases, these pages don't rank and I can't think of many that have any external websites linking to them.
Am I overthinking all of this?
Please help!
-
Thank you, you answered all of my questions and some more I didn't ask...but should have!
The notable alumni page is a great idea, and not one I'd thought of.
It's going to be a lengthy process, but I'm now happy that I know I'm doing the right things.
Thank you again!
-
Those pages have been deleted for a reason, so rather than redirecting to a page that's not relevant, or just to the main category page, why not get rid of the link entirely? This will increase the link equity of the good links on that page, the ones that go to places users actually want to go.
Certainly, when someone leaves, you need to redirect 'Job title' to the new owner of that job title. That's what users will want. But sending people around in circles on your site is pointless and will frustrate them into leaving. If I'm trying to find more info and I end up going 'up a level' rather than to a deeper, more detailed page, that always makes me bounce, because I figure the website can't answer my question or give me the info I need.
Think about the user. PageRank sculpting is dead, but it's still important to make sure there is always a path for a user to follow to get the info they need. If there isn't one, delete the entire sentence along with the original link. That will only help strengthen the flow of link equity throughout your site.
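The triage described above (relink to a true equivalent, redirect to a category only when one is genuinely relevant, otherwise remove the link) can be sketched as a simple decision rule. This is a hypothetical sketch; the URL maps and page names are illustrative:

```python
def triage_broken_link(old_url, equivalent_map, category_map):
    """Decide what to do with a broken internal link.

    equivalent_map: old URL -> direct replacement page (e.g. a new staff member)
    category_map:   old URL -> relevant parent category, used only as a fallback
    Returns an (action, target) tuple.
    """
    if old_url in equivalent_map:
        # A true equivalent exists: update the link (and anchor text) to it.
        return ("relink", equivalent_map[old_url])
    if old_url in category_map:
        # No equivalent, but a genuinely relevant category: a 301 is reasonable.
        return ("redirect", category_map[old_url])
    # Nothing relevant to send users to: remove the link from the content.
    return ("remove", None)
```

Anything that falls through to `"remove"` is a candidate for deleting the whole sentence, per the advice above.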
Simplify, don't complicate. That would be my advice. And remember to request a crawl of the updated page and its direct links (e.g. via Google Search Console) to get a quick re-index and see whether it helps your pages rank.
You don't necessarily need links from external sources to internal pages; that's not the reason they aren't ranking. They aren't ranking because people aren't navigating to them (no implicit user-feedback signals) or because their SERP entries don't entice people to click. So look at updating the title tags and meta descriptions, and do what you suggested: where appropriate, change the anchor text to the right thing and link to the right place, or just delete the link.
I have a really useful 'notable alumni' page for great people who used to work at our practice but no longer do. You'll need their permission to keep them on your site, and it helps if they had a nice page with good DA and PA that links, with relevant anchor text, to (for example) a product or service.
But Google hates 404s, so get rid of them all as soon as you can and watch the remaining pages creep up the rankings.
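Before fixing anything, it helps to build a list of which pages link to each missing URL, so you can work through them systematically. A minimal sketch, assuming you already have a crawl export (e.g. from a site crawler) mapping each page to the internal URLs it links to:

```python
def find_broken_links(link_graph, live_urls):
    """Return {missing_url: [pages that link to it]} from a crawl export.

    link_graph: dict mapping each crawled page to the internal URLs it links to.
    live_urls:  set of URLs that resolve (non-404) on the site.
    """
    broken = {}
    for page, links in link_graph.items():
        for url in links:
            if url not in live_urls:
                # Record every page that still references the missing URL.
                broken.setdefault(url, []).append(page)
    return broken
```

Each entry in the result is one missing URL plus the list of pages whose content needs editing.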
-
If the 404'd link has an equivalent on the site, I'd update the link to point at that new equivalent. Even if the page isn't ranking, there's potential for a visitor to reach it, so why not send them to proper content if you have it?
There's also a Moz Blog post regarding 404 pages: https://moz.com/blog/are-404-pages-always-bad-for-seo
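With hundreds of links to update, the per-page edits can be scripted once you have an old-URL-to-new-URL map. A rough sketch, assuming href values in double quotes; a real CMS migration should use a proper HTML parser rather than a regex:

```python
import re

def rewrite_links(html, replacements):
    """Rewrite href values in stored content using an old->new URL map.

    replacements: dict of old URL -> new URL. URLs not in the map are untouched.
    """
    def swap(match):
        url = match.group(2)
        # Substitute the mapped URL if one exists, otherwise keep the original.
        return match.group(1) + replacements.get(url, url) + match.group(3)

    return re.sub(r'(href=")([^"]*)(")', swap, html)
```

Run it over each affected page's content, then spot-check the output before saving, since anchor text may also need updating by hand.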