What is the best way to handle links that lead to a 404 page?
-
Hi Team Moz,
I am working through a site cutover with an entirely new URL structure, and I have a bunch of pages that could not, would not, or just plain don't redirect to new pages.
Steps I have taken:
- Submitted multiple new sitemaps with the new URLs (minimal example below); the indexing looks solid
- Used the Webmaster Tools URL removal tool on old URLs that still appear in natural search results but do not redirect and now return 404s
- Completely built out new PPC campaigns with the new URL structure
- Contacted a few major link partners
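For reference, the new sitemaps are just the standard protocol format; a minimal entry looks like this, with example.com and the path standing in for the real structure:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per new URL; this path is a placeholder -->
  <url>
    <loc>https://www.example.com/new-category/new-product/</loc>
  </url>
</urlset>
```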
Now here is my question:
I have pages that produce 404s and are linked to from forums, Slickdeals, and the like, and they will not be redirected. Is disavowing these links the correct thing to do?
-
Hi,
Definitely don't use disavow unless you think that the links are poor quality and could harm your site, or are actively harming it right now. That is what disavow is for, not for removing your 404 pages.
There is no harm in waiting for Google to remove the 404 pages on its own, especially if you have used its URL removal tool as well. If there are any good links in the backlink profile of the 404ing pages, do attempt to contact the webmasters and have them changed; most people are more than happy to do this.
-
If the links are good ones, 301 redirect them to a good page; you don't have to have a blank page at that URL.
If they are bad links, just leave them. If they are 404ing, they can do you no harm.
The only 404s that can do you harm are ones from your own internal links, because they mean you have link juice leaks. Fix any if you have them.
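If you're on Apache, for example, each of those 301s is a single line in .htaccess; the paths here are placeholders:

```
# Send the old 404ing URL to the most relevant live page with a 301 (mod_alias)
Redirect 301 /old-category/old-page.html https://www.example.com/new-category/new-page/
```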
-
Edit the backlinks pointing to the 404 pages so they point to the new pages. Another option is to host a blank page (with header and footer) at the old URL and 301 redirect it to the new page; the PageRank/link profile will get passed to the new page.
-
Well, the correct / best thing to do would be to try and get all of those links edited and pointed to live pages. That said, if you don't know who posted the links or have no way to get in touch with those who do, it can be very awkward to achieve. Still, link reclamation can be a great way to help with new links, seeing as they are already pointing to your site.
-Andy
-
If you feel the links are harming you or your SEO efforts in any way, you can go ahead and disavow them. However, the disavow tool does not remove the links, so it does not help with 404 errors; it just tells Google to ignore them when it comes to your rankings.
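If you do decide some of the links are toxic, the disavow file itself is just a plain-text list uploaded to Google; the domains below are placeholders:

```
# Spammy forum profile link we could not get removed
http://spammy-forum.example.com/profile/12345
# Write off an entire low-quality domain
domain:low-quality-links.example.com
```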
Hope this helps!
Related Questions
-
Deleting old pages and passing on link strength?
We are a printing company thinking of bringing our products down to 2-3 rather than the 10+ we currently have. The pages we will be getting rid of are ones such as flyers, booklets, etc., so we can concentrate on banners and stickers. Would you suggest 301ing those pages to the home page, or picking pages for them to go to? Also, could we expect a decent rise for the pages we are left with? Thanks, Shaun
Technical SEO | BobAnderson
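A sketch of the per-page approach, assuming an Apache server: 301 each discontinued product page to its closest surviving page rather than dumping everything on the home page (all paths hypothetical):

```
# Map each discontinued product page to its nearest surviving page
Redirect 301 /products/flyers   https://www.example.com/products/banners/
Redirect 301 /products/booklets https://www.example.com/products/stickers/
Redirect 301 /products/leaflets https://www.example.com/products/banners/
```
-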
Redirect for Soft 404 or 404?
I have a client site that displays properties from the MLS. Once these properties sell, they're removed from the MLS and stop showing up on her site. This would result in a 404 error, but right now any property that's not found is being 301 redirected back to the property page. I see how this makes sense for a user, but Google is reporting an increase in Soft 404 errors, and I've read that this could negatively affect organic traffic. Should I keep the redirect for removed properties, or should I serve a 404 with a message that the house you're looking for may have sold, plus a link to the property page? Is it better to have Soft 404 errors or 404 errors?
Technical SEO | JaredDetroit
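If the site runs on Apache, one way to serve a genuine "gone" response for expired listings instead of the blanket 301 is a rewrite rule plus a friendly error page; the URL pattern and filename are hypothetical:

```
# Return 410 Gone for expired listing URLs (requires mod_rewrite)
RewriteEngine On
RewriteRule ^listings/sold/ - [G,L]

# Show a friendly "this property may have sold" page on 404s/410s
ErrorDocument 404 /property-not-found.html
ErrorDocument 410 /property-not-found.html
```
-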
Bigcommerce only allows us to have https for our store, not the other pages on our site, so we have a mix of https and http. How is this hurting us, and what's the best way to fix it?
So we aren't interested in paying a thousand dollars a month just to have https when we feel it's the only selling point of that package, so we have https for our store and the rest of the site, blogs and all, is http. I'm wondering if this would count as duplicate content or give us some other unforeseen penalty due to the halfway approach to implementing https. If this is hurting us, what would you recommend as a solution?
Technical SEO | Deacyde
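One common mitigation, assuming the page templates can be edited, is a canonical tag on each page naming the one version you want indexed, so the http and https copies aren't treated as duplicates (example.com is a placeholder):

```html
<!-- Served on both the http and https copies of the page -->
<link rel="canonical" href="http://www.example.com/blog/some-post/" />
```
-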
How do I handle soft 404s on category pages?
I have a site that provides a service where listings are displayed on site for 30 days, then they expire. These listings are categorized by type. On occasion, categories have no listings available, and Google Webmaster Tools is listing them as Soft 404 errors. It's not possible to remove these categories and 301 redirect to another page. Any suggestions on how to work around the soft 404s?
Technical SEO | ang
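One workaround, assuming the templates can detect an empty category: serve a meta robots noindex only while the category has zero listings, and drop it once listings return, so Google stops flagging the page as a soft 404:

```html
<!-- Rendered only while this category has no live listings -->
<meta name="robots" content="noindex, follow">
```
-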
Google Webmaster Tools doesn't allow me to send 'URL and all linked pages'
Hello! I made a lot of optimization changes on my site (SEO URLs and a lot more). I always use Google Webmaster Tools' Fetch as Googlebot to refresh my site, but now it doesn't allow me to 'Send URL and all linked pages'; check the attachment. Thank you
Technical SEO | matiw
-
Best Way To Clean Up Unruly Subdomains?
Hi, I have several subdomains that present no real SEO value, but are being indexed. They don't earn any backlinks either. What's the best way of cleaning them up? I was thinking the following:
1. Verify them all in Webmaster Tools.
2. Remove all URLs from the index via the Removal Tool in WMT.
3. Add a site-wide noindex, follow directive.
Also, to remove the URLs in WMT, you usually have to block the URLs via /robots.txt. If I'd like to keep Google crawling through the subdomains and remove their URLs, is there a way to do so?
Technical SEO | RocketZando
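There is a way, sketched below: leave robots.txt open so Google can keep crawling the subdomains, and send the noindex directive in an HTTP header instead (a robots.txt block would hide a noindex from Googlebot). On Apache this needs mod_headers:

```
# Site-wide noindex for the subdomain, sent as a response header;
# robots.txt stays open so Googlebot can crawl and see it
Header set X-Robots-Tag "noindex, follow"
```
-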
Adding parameters in URLs and linking to a page
Hi, Here's a fairly technical question: We would like to implement a badge feature where linking websites using a badge would use URLs such as: domain.com/page?state=texas&city=houston domain.com/page?state=nevada&city=lasvegas Important note: the parameters will change the information and layout of the page: domain.com/page Would those 2 URLs above, along with their extra parameters, be considered the same page as domain.com/page by Google's crawler? We're considering adding the parameters "state" and "city" to the Google WMT URL parameter tool to tell it how to handle those parameters. Any feedback or comments are appreciated! Thanks in advance. Martin
Technical SEO | MartinH
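Since the parameters change the content, each state/city combination is effectively its own page. A self-referencing canonical that keeps the parameters (a sketch; the URL is hypothetical) signals that the variants should not be folded into domain.com/page:

```html
<!-- On the Houston variant, canonicalize to itself, parameters included -->
<link rel="canonical" href="https://domain.com/page?state=texas&amp;city=houston" />
```
-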
Best way to handle indexed pages you don't want indexed
We've had a lot of pages indexed by Google which we didn't want indexed. They relate to an ajax category filter module that works OK for front-end customers, but under the bonnet Google has been following all of the links. I've put a rule in the robots.txt file to stop Google from following any dynamic pages (with a ?) and also any ajax pages, but the pages are still indexed on Google. At the moment there are over 5,000 indexed pages which I don't want on there, and I'm worried this is causing issues with my rankings. Would a redirect rule work, or could someone offer any advice? https://www.google.co.uk/search?q=site:outdoormegastore.co.uk+inurl:default&num=100&hl=en&safe=off&prmd=imvnsl&filter=0&biw=1600&bih=809#hl=en&safe=off&sclient=psy-ab&q=site:outdoormegastore.co.uk+inurl%3Aajax&oq=site:outdoormegastore.co.uk+inurl%3Aajax&gs_l=serp.3...194108.194626.0.194891.4.4.0.0.0.0.100.305.3j1.4.0.les%3B..0.0...1c.1.SDhuslImrLY&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.&fp=ff301ef4d48490c5&biw=1920&bih=860
Technical SEO | gavinhoman
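The rules described would look something like this in robots.txt (the /ajax path is a guess at the module's URL pattern). Note, though, that Disallow only blocks crawling; pages already in the index generally need a noindex directive, which Googlebot must be able to crawl to see, or the WMT removal tool before they drop out:

```
User-agent: *
# Block crawling of any dynamic URL containing "?"
Disallow: /*?
# Block the ajax category-filter URLs
Disallow: /ajax
```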