How do I best deal with pages returning 404 errors when they have inbound links from other sites?
-
I have over 750 URLs returning 404 errors. The majority of these pages have backlinks from other sites; however, from what I can see the credibility of those linking pages is somewhat dubious, mainly forums and sites with low DA and PA.
It has been suggested that we place 301 redirects on these pages, a nice easy solution, but I am concerned that we could do more harm than good to our site's credibility and link building strategy going into 2013. I don't want to redirect these pages if it's going to cause a Panda/Penguin problem.
Could I request manual removal or something of this nature?
Thoughts appreciated.
-
Hi Craig,
Why not create an awesome, kick-butt 404 page with links that direct people to the pages they are looking for? This way, if your audience is interested, they can continue to navigate your site from your 404 page, and you keep all the link juice flowing to it. Your 404 page might be one of your most visited pages, so take this opportunity to include links that direct people to the right pages they might be looking for.
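One practical note on that approach: the custom 404 page should still return a real 404 status code (not a 200), or search engines may index it. As a minimal sketch, assuming an Apache server (the file path is a placeholder):

```apache
# Serve a helpful, link-rich custom page for missing URLs.
# Using a local path (not a full URL) keeps the response status as 404.
ErrorDocument 404 /custom-404.html
```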
One reason a 301 redirect may not be good: if someone clicks a backlink to your site and you redirect all your 404 traffic to your home page, it might turn people off to be sent to the home page when they thought the link was taking them to something they were interested in.
-
We do 301 redirects from URLs that return 404 errors for two reasons: 1) to not lose traffic, and 2) to not lose link juice. So in your case I would do the following. To not lose traffic: check the site's stats to see which of these pages still receive visits; if there are any, I would 301 redirect them to pages that can provide the information the visitor was looking for in the first place. To not lose link juice: for the remaining pages, I would check the value of the backlinks and see if they are worth it. If they are, I would 301 redirect to the home page. I personally don't jump too easily into using Google's disavow tool. As they say on their official blog:
"If you've been notified of a manual spam action based on 'unnatural links' pointing to your site, this tool can help you address the issue. If you haven't gotten this notification, this tool generally isn't something you need to worry about." I'm not talking from experience here, just from logic: if a bad link points to a page that doesn't exist on my site, I don't believe it can get me penalized.
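The traffic/link-juice triage described above can be sketched in code. This is a hypothetical sketch, not anyone's actual tooling: the URLs, visit counts, backlink records, and the authority threshold are all made-up inputs you would pull from your analytics and backlink tools.

```python
def plan_redirects(urls_404, visits, backlinks, min_authority=30):
    """Return {url: 'redirect' | 'ignore'} for each 404 URL.

    A URL earns a 301 if it still receives visits (traffic reason)
    or has at least one backlink above the authority threshold
    (link-juice reason); otherwise it can be left as a 404.
    """
    plan = {}
    for url in urls_404:
        has_traffic = visits.get(url, 0) > 0
        good_links = [b for b in backlinks.get(url, [])
                      if b["authority"] >= min_authority]
        plan[url] = "redirect" if (has_traffic or good_links) else "ignore"
    return plan

# Made-up example data:
urls = ["/old-product", "/forum-spam-target"]
visits = {"/old-product": 12}  # visits in the last 90 days
backlinks = {"/forum-spam-target": [{"domain": "low-da-forum.example",
                                     "authority": 5}]}

print(plan_redirects(urls, visits, backlinks))
# {'/old-product': 'redirect', '/forum-spam-target': 'ignore'}
```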
-
Hi, you can use the Google disavow tool for this: https://www.google.com/webmasters/tools/disavow-links-main?pli=1 as Matthew stated.
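For reference, the disavow tool takes a plain text file with one entry per line: lines starting with # are comments, a domain: prefix disavows every link from that domain, and a bare URL disavows a single page. The sites below are placeholders:

```text
# Low-quality forum linking to several of our 404 pages
domain:low-quality-forum.example.com
# A single dubious page linking to us
http://dubious-site.example.com/links-page.html
```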
-
Hi Matt,
Thanks for the advice, much appreciated. I'll do a bit of digging and see what turns up.
Thanks
Craig
-
Hi Craig,
You have a couple of options here. The first thing I would do is analyse all of the links going to these pages. If there are any pages whose links are actually of value, then set up a 301 redirect to another relevant page on the site.
Alternatively, for the rest you could ask Google to ignore the links pointing at those pages via the disavow tool. If there are only a couple of links going through to the old URLs, then I wouldn't worry too much. I would just spend a while analysing and prioritising which pages need to be redirected urgently, and then take a deeper look into their links.
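On the redirect side, here is a sketch of how the 301s could be wired up, assuming an Apache server with mod_alias; the paths and domain are placeholders for your own URLs:

```apache
# One-to-one 301 for a retired URL that still has valuable backlinks
Redirect 301 /old-page/ https://www.example.com/closest-relevant-page/
# Pattern-based 301 to collapse a whole retired section
RedirectMatch 301 ^/old-section/(.*)$ https://www.example.com/new-section/$1
```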
Hope this helps,
Matt.
Related Questions
-
Which is the best option for these pages?
Hi Guys, We have product pages on our site which have duplicate content, and the search volume for people searching for these products is very, very small. Also, if we add unique content, we could face keyword cannibalisation issues with category/sub-category pages. Based on proper SEO best practice, we should add rel=canonical tags from these product pages to the next most relevant page.
Pros: We can rank for product-oriented keywords, though search volume is very small. Any link equity passed to these pages via the rel=canonical tag would be very small, as these pages barely get any links.
Cons: Time and effort involved in adding rel=canonical tags. Even if we do add them, Google might ignore them if it doesn't deem them relevant, causing duplicate content issues. Time and effort involved in making all the content unique is not really worth it given the very small number of searchers, and if we do make it unique, we face keyword cannibalisation issues.
What do you think would be the optimal solution to this? I'm thinking of just implementing a: across all these product-based pages. Keen to hear thoughts? Cheers.
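For reference, the rel=canonical option discussed above is a single tag in the head of each thin product page; the target URL below is a placeholder for whichever category or sub-category page you choose:

```html
<!-- In the <head> of each thin product page -->
<link rel="canonical" href="https://www.example.com/category/widgets/">
```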
Intermediate & Advanced SEO | seowork214
-
Best way to do site seals for clients to have on their sites
I am about to help release a product which also gives people a site seal to place on their website, just like GeoTrust, Comodo, Symantec, RapidSSL, and other web security providers do.
Intermediate & Advanced SEO | ssltrustpaul
I have noticed that the site seals from these companies never have nofollow on the links back to their websites. So I am wondering what the best way to do this is. Should I put a nofollow on the site seal link back to our domain, or is it safe to leave the nofollow off?
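For illustration, the nofollowed version of such a seal link would look something like this; the provider domain and image path are hypothetical:

```html
<!-- Site seal: rel="nofollow" tells search engines not to pass link equity -->
<a href="https://www.example-sslprovider.com/" rel="nofollow">
  <img src="/images/site-seal.png" alt="Secured by Example SSL">
</a>
```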
It won't be doing any keyword stuffing or anything; it will probably just have our domain in the link and that is all. The problem is, we won't have any control over where customers place these site seals. From experience, I would say they will most likely be placed in the footer on every page of the client's website. I would like to hear any and all thoughts on this, as I can't get a proper answer anywhere I have asked.
-
Site Structure: How do I deal with a great user experience that's not the best for Google's spiders?
We have ~3,000 photos that have all been tagged. We have a wonderful AJAXy interface for users where they can toggle all of these tags to find the exact set of photos they're looking for very quickly. We've also optimized a site structure for Google's benefit that gives each category a page. Each category page links to applicable album pages. Each album page links to individual photo pages. All pages have a good chunk of unique text. Now, for Google, the domain.com/photos index page should be a directory of sorts that links to each category page. Alternatively, the user would probably prefer the AJAXy interface. What is the best way to execute this?
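One common way to get both is progressive enhancement: serve the /photos index as plain, crawlable category links and layer the AJAX tag-toggle interface on top with JavaScript. A hypothetical sketch (category names and script path are placeholders):

```html
<!-- /photos: plain links crawlers can follow; JS upgrades this into the tag-filter UI -->
<ul id="photo-categories">
  <li><a href="/photos/landscapes/">Landscapes</a></li>
  <li><a href="/photos/portraits/">Portraits</a></li>
</ul>
<script src="/js/tag-filter.js"></script>
```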
Intermediate & Advanced SEO | tatermarketing
-
Best way to fix 404 crawl errors caused by Private blog posts in WordPress?
Going over the Moz crawl error report and WMT's crawl errors for a new client site, I found 44 high-priority crawl errors = 404 Not Found. I found that those 44 blog pages were set to Private mode (WordPress theme), causing the 404 issue.
Intermediate & Advanced SEO | SEOEND
I was reviewing the blog content for those 44 pages to see why those 2010 blog posts were set to private mode. Well, I noticed that all 44 blog posts were pretty much copied from other external blog posts. So I'm thinking the previous agency placed those pages in private mode to avoid getting hit for duplicate content issues. All blog posts published after 2011 looked like unique, non-scraped content. So my question to all is: what is the best way to fix the issue caused by these 44 pages? A. Remove those 44 blog posts that used verbatim scraped content from other external blogs.
B. Update the content on each of those 44 blog posts, then set them to Public mode instead of Private.
C. ? (open to recommendations) I didn't find any external links pointing to any of those 44 blog pages, so I was considering removing the posts. However, I'm not sure if that will affect the site in any way. Open to recommendations before making a decision...
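One extra option if you go with A and delete the scraped posts: signal that the removal is permanent by returning 410 Gone instead of 404. A sketch for an Apache .htaccess, with a placeholder slug:

```apache
# Tell crawlers this removed scraped post is permanently gone (HTTP 410)
Redirect gone /2010/scraped-post-slug/
```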
Thanks
-
Unnatural Links From My Site Penalty - Where, exactly?
So I was just surprised to officially be one of the very few hit with the manual penalty from Google for "unnatural links from your site." We run a clean ship, or try to. Of all the possible penalties, this is by far the most unlikely to occur. Well, it explains some issues we've had that have been impossible to overcome. We don't have a link exchange. Our entire directory has been deindexed from Google for almost 2 years because of Panda/Penguin, just to be 100% sure this didn't happen. We removed even the links that went to my own personal websites, which were a literal handful. We have 3 partners, who have nofollow links and are listed on a single page. So I'm wondering: does anyone have any reason to understand why we'd have this penalty, and why it would linger for such a long period of time? If you want to see strange things, try to look up our page rank on virtually any page, especially in the /guide/ directory. Now the bizarre results of many months make sense. Hopefully one of my fellow SEOs with a fresh pair of eyes can take a look at this one. http://legal.nu/kc68
Intermediate & Advanced SEO | seoagnostic
-
What's the best way to manage content that is shared on two sites and keep both sites in search results?
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?
Intermediate & Advanced SEO | BostonWright
-
Can 404 Errors Be Affecting Rankings?
I have a client for whom we recently (3 months ago) designed, developed, and launched a new site at a "new" domain. We set up redirects from the old domain to the new domain and kept an eye on Google Webmaster Tools to make sure the redirects were working properly. Everything was going great; we maintained and improved the rankings for the first 2 months or so. In late January, I started noticing a great deal of 404 errors in Webmaster Tools for URLs from the new site. None of these URLs were actually on the current site, so I asked my client if he had previously used the domain. It just so happens that he used the domain a while back and none of the URLs were ever redirected or removed from the index. I've been setting up redirects for all of the 404s appearing in Webmaster Tools, but we took a pretty decent hit in rankings in February. Could those errors (72 in total) have been partially, if not completely, responsible for the hit in rankings? All other factors have been constant, so that led me to believe these errors were the culprits.
Intermediate & Advanced SEO | TheOceanAgency
-
How to increase the ranking for a keyword across an entire site
Sorry for my bad English. Is there any way to increase the ranking for a keyword across the entire site? I know that SEO is done on a per-page basis. My site contains thousands of posts and I can't get backlinks for each and every post, so I picked 4 keywords that are most often used when searching for my products. Is there any method to increase my ranking for those keywords, such as increasing domain authority? Example: if I want to increase my ranking for "buy laptop", then when any user searches Google for "buy laptop", I want my site, or any related page that matches the search query, to show up in front.
Intermediate & Advanced SEO | prakash.moturu