What is the best way to handle links that lead to a 404 page
-
Hi Team Moz,
I am working through a site cutover to an entirely new URL structure and have a bunch of pages that could not, would not, or just plain don't redirect to new pages.
Steps I have taken:
- Submitted multiple new sitemaps with the new URLs, and the indexing looks solid
- Used Webmaster Tools to remove URLs that still had organic listings but did not redirect to new URLs
- Completely built out new PPC campaigns with the new URL structure
- Contacted a few major link partners
Now here is my question:
I have pages that produce 404s and are linked to from forums, Slickdeals, and similar sites, and these pages will not be redirected. Is disavowing these links the correct thing to do?
-
Hi,
Definitely don't use disavow unless you think the links are poor quality and could harm your site, or are actively harming it right now. That is what disavow is for; it is not for removing your 404 pages.
There is no harm in waiting for Google to drop the 404 pages on its own, especially if you have used its URL removal tool as well. If there are any good links in the backlink profiles of the 404ing pages, do attempt to contact the webmasters and have them changed; most people are more than happy to do this.
-
If the links are good ones, 301 redirect them to a relevant live page; you don't have to leave a blank page at that URL.
If they are bad links, just leave them. If they are 404ing, they can do you no harm.
The only 404s that can hurt you are the ones hit by your own internal links, because they mean you are leaking link juice. Fix any of those you have.
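If you want a quick way to see which of the old URLs still 404 and which already redirect, a short script will tell you. Here is a rough sketch in Python (assuming the third-party requests library, and a hypothetical old_urls.txt file with one old URL per line, e.g. exported from the old sitemap):

```python
import requests

# Hypothetical input file: one old (pre-cutover) URL per line.
with open("old_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # Don't follow redirects, so the 301/302 itself stays visible.
        resp = requests.head(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"{url} -> request failed ({exc})")
        continue

    if resp.status_code in (301, 302, 307, 308):
        print(f"{url} -> {resp.status_code} to {resp.headers.get('Location')}")
    else:
        # 404s show up here; anything else (200, 410, 500...) is worth a look too.
        print(f"{url} -> {resp.status_code}")
```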
-
Edit the backlinks you were getting to the 404 pages and point them at the new pages. Another option is to host a placeholder page (with header and footer) at the old URL and 301 redirect it to the new page; the PageRank / link profile will be passed on to the new page.
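Most people set these 301s up in the web server config, but if you end up handling them at the application level, the idea looks something like this rough Python/Flask sketch (the mappings and route here are made up, purely for illustration):

```python
from flask import Flask, abort, redirect

app = Flask(__name__)

# Hypothetical map of retired URLs to their closest live equivalents.
REDIRECT_MAP = {
    "/old-structure/widget-123": "/products/widget",
    "/old-structure/gadget-456": "/products/gadget",
}

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    target = REDIRECT_MAP.get("/" + old_path)
    if target:
        # A 301 passes the old URL's link equity on to the new page.
        return redirect(target, code=301)
    # No sensible equivalent: serve a real 404 rather than redirecting everything.
    abort(404)

if __name__ == "__main__":
    app.run()
```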
-
Well, the correct / best thing to do would be to try to get all of those links edited and pointed at live pages. That said, if you don't know who posted the links, or have no way to get in touch with those who do, it can be very awkward to achieve. Still, link reclamation can be a great way to help with new links, seeing as they are already pointing to your site.
-Andy
-
If you feel the links are harming you or your SEO efforts in any way, you can go ahead and disavow them. However, the disavow tool does not remove the links, so it does not help with 404 errors; it simply tells Google to ignore those links when it comes to your rankings.
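For reference, if you do decide some of those links are toxic, the disavow file itself is just a plain text file you upload in Webmaster Tools; a hypothetical example (the domains and URLs below are made up):

```
# Forum threads where we could not get the links edited or removed
http://some-forum.example.com/thread/12345
# Disavow an entire low-quality domain
domain:spammy-links.example.com
```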
Hope this helps!
Related Questions
-
Google Search Console Showing 404 errors for product pages not in sitemap?
We have some products with URL changes over the past several months. Google is showing these as having 404 errors even though they are not in the sitemap (the sitemap shows the correct new URLs). Is this expected? Will these errors eventually go away / stop being monitored by Google?
Technical SEO | woshea
What is the best way to handle Product URLs which prepopulate options?
We are currently building a new site which has the ability to pre-populate product options based on parameters in the URL. We have done this so that we can send individual product URLs to Google Shopping. I don't want to create lots of duplicate pages, so I was wondering what you thought was the best way to handle this? My current thoughts are:
1. Sessions and parameters: On-site product page filters populate using sessions, so no parameters are required on-site, but options can still be pre-populated via parameters (product?colour=blue&size=100cm) if the user reaches the site via Google Shopping. We could also add "noindex, follow" to the pages with parameters and a canonical tag pointing to the page without parameters.
2. Text-based parameters: Make the parameters into text-based URLs (product/blue/100cm/), still use the "noindex, follow" meta tag, and add a canonical tag pointing to the page without parameters. I believe this is possibly the best solution, as it still allows users to link to and share pre-populated pages, but they won't get indexed and the link juice would still pass to the main product page.
3. Standard parameters: After thinking more today, I am considering that the best way may be the simplest: simply use standard parameters (product?colour=blue&size=100cm) so that I can tell Google what they do in Webmaster Tools, and also add "noindex, follow" to the pages with parameters along with the canonical tag to the page without parameters.
What do you think the best way to handle this would be?
Technical SEO | moturner
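(For illustration only: a minimal sketch of the "noindex, follow" plus canonical approach described in options 1 and 3 above, assuming a Python/Flask stack; the route, template, and parameter names are hypothetical.)

```python
from flask import Flask, request, render_template_string

app = Flask(__name__)

# Hypothetical template: the canonical always points at the clean /product URL,
# and pre-populated variants (any request with a query string) get "noindex, follow".
PRODUCT_TEMPLATE = """<!doctype html>
<html>
<head>
  <link rel="canonical" href="{{ canonical }}">
  {% if noindex %}<meta name="robots" content="noindex, follow">{% endif %}
  <title>Example product</title>
</head>
<body>Colour: {{ colour }} / Size: {{ size }}</body>
</html>"""

@app.route("/product")
def product():
    return render_template_string(
        PRODUCT_TEMPLATE,
        canonical=request.base_url,      # the URL without ?colour=...&size=...
        noindex=bool(request.args),      # only the parameterised variants
        colour=request.args.get("colour", "default"),
        size=request.args.get("size", "default"),
    )
```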
Find all 404 links in my site that are indexed
Hi All, I need to find all 404 links on my site that are indexed. We deleted a lot of URLs from the site, but now I don't have a record of everything we deleted. Is there any site/tool that can scan the index and give me the exact URLs, so I can use https://www.google.com/webmasters/tools/removals?hl=en&rlf=all ? Regards, Martin
Technical SEO | mtthompsons
Adding parameters in URLs and linking to a page
Hi, Here's a fairly technical question: We would like to implement a badge feature where websites linking with a badge would use URLs such as domain.com/page?state=texas&city=houston and domain.com/page?state=neveda&city=lasvegas. Important note: the parameters will change the information and layout of the page at domain.com/page. Would those two URLs above, along with their extra parameters, be considered the same page as domain.com/page by Google's crawler? We're considering adding the parameters "state" and "city" to the Google WMT URL parameter tool to tell Google how to handle those parameters. Any feedback or comments are appreciated! Thanks in advance. Martin
Technical SEO | MartinH
Too Many On-Page Links on a Blog
I have a question about the number of on-page links on a page and the implications for how we're viewed by search engines. After SEOmoz crawls our website, we consistently get notifications that some of our pages have "Too Many On-Page Links." These are always limited to pages on our blog, and are largely a function of our tag cloud (~30 links) plus categories (10 links) plus popular posts (5 links). These all display in the sidebar on every blog post. How significant a problem is this? And, if you think it is a significant problem, what would you suggest to remedy it? Here's a link to our blog in case it helps: http://wiredimpact.com/blog/ The above page is currently listed as having 138 links. Any advice is much appreciated. Thanks so much. David
Technical SEO | WiredImpact
How to best remove old pages for SEO
I run an accommodation website, and each listing has its own page. When a property is removed, what is the best way to handle this for SEO, given that the URL will no longer be valid and there will be a blank page?
Technical SEO | JamieHibbert
When launching a site redesign, what is the best way to roll it out?
We are currently working on a site redesign and are getting close to launch. When launching the newly designed site, what is the best way to roll it out? 301 redirects with an under-construction page? Just upload the entire site at once? Or a little of both? Thank you
Technical SEO | ControlByWeb
Which version of pages should I build links to?
I'm working on the site www.qualityauditor.co.uk, which is built in Moonfruit. Moonfruit renders pages in Flash. Not ideal, I know, but it also automatically produces an HTML version of every page for those without Flash or Javascript, and for search engines. This HTML version is fairly well optimised for search engines, but sits on different URLs. For example, the page you're likely to see if browsing the site is at http://www.qualityauditor.co.uk/#/iso-9001-lead-auditor-course/4528742734. However, if you turn Javascript off you can see the HTML version of the page at http://www.qualityauditor.co.uk/page/4528742734. Mostly, it's the last version of the URL which appears in the Google search results for a relevant query, but not always. Plus, in Google Webmaster Tools, fetching as Googlebot only shows page content for the first version of the URL; for the second version it returns an HTTP status code and a 302 redirect to the first version. I have two questions, really: 1. Will these two versions of the page cause me duplicate content issues? I suspect not, as the first version renders only in Flash. But will Google think the 302 redirect for people is cloaking? 2. Which version of the URL should I be pointing new links to (bearing in mind the 302 redirect, which doesn't pass link juice)? The URLs which I see in my browser and which Google looks at when I 'fetch as Googlebot', or those Google shows in the search results? Thanks folks, much appreciated! Eamon
Technical SEO | driftnetmedia