Do I need to redirect soft 404s that I got from Google Webmaster Tools?
-
Hi guys,
I got over 1,000 soft 404s from GWT. All of the soft 404s return a 200 HTTP status code, but the URLs look like the following:
http://www.example.com/search/house-for-rent
(query used: house for rent)
http://www.example.com/search/-----------rent
(query used:-------rent)
There are no listings that match these queries, and an advanced search form is visible on these pages.
Here are my questions:
1. Do I need to redirect each page to its appropriate landing page?
2. Do I need to add a user sitemap or a list of URLs where visitors can search for other properties?
Any suggestions would help.
-
Thanks guys for your input. By the way, this issue was already resolved last year. Thanks again!
-
It depends what you want to achieve. If the 404s are pages which no longer exist, then the fastest approach is to use the GWMT removal tool to remove the page pattern, block the path in robots.txt, and add a noindex meta tag to the pages themselves. In addition, obviously, return a 404.
A soft 404 is a case where content is not found but HTTP status 200 is returned - this needs to change if you currently serve non-existent pages with a 200.
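In code, the fix is simply to branch on the result count before writing the response. A minimal Python sketch (the function and its use are hypothetical, not the poster's actual stack):

```python
# Hypothetical sketch: choose the HTTP status for a search-results page
# so that empty searches stop being soft 404s.
def search_status(results):
    """Return the HTTP status code a search page should send."""
    if results:        # matching listings exist -> normal page
        return 200
    return 404         # no matches -> tell crawlers the page does not exist


print(search_status(["house-for-rent-123"]))  # 200
print(search_status([]))                      # 404
```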
We generally do the following:
- Content which we know no longer exists (e.g. a deleted product page or a deleted product category) is served with SC_GONE (410), and we provide cross-selling information (e.g. we display products from related categories). This works great, and we have seen a boost in indexed content.
- URLs which don't exist go through a standard 404 - this is intentional, as our monitoring will pick it up. If it is a legitimate 404 but of SEO value, we will do a redirect if it makes sense, or just let Google drop it over time (this sometimes takes up to 4 weeks).
You can have multiple versions of 404 pages, but this would need to be coded out - i.e. in your application server you would define a 404 page which then programmatically displays content depending on what you want to do.
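The "coded out" part can be as small as one error handler that picks both the status and the template. A hypothetical sketch (the paths and template names are illustrative):

```python
# Hypothetical error handler serving different "not found" variants:
# deleted content -> 410 plus a cross-selling page, unknown URL -> plain 404.
DELETED_PRODUCTS = {"/product/old-widget"}  # assumption: paths we know we removed


def not_found_response(path):
    """Return (status_code, template_name) for a missing URL."""
    if path in DELETED_PRODUCTS:
        return 410, "gone.html"        # content intentionally removed
    return 404, "not_found.html"       # URL never existed / mistyped link


print(not_found_response("/product/old-widget"))  # (410, 'gone.html')
print(not_found_response("/no-such-page"))        # (404, 'not_found.html')
```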
-
I know I am way late to the party, but MagicDude4Eva, have you had success just putting a noindex header on the soft 404 pages?
That sounds like the easiest way to deal with this problem, if it works - especially since a lot of sites use dynamic URLs for product search that you don't want de-indexed.
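For dynamic pages, the noindex usually goes out as an `X-Robots-Tag` response header rather than a meta tag. A sketch of that idea (the helper is hypothetical):

```python
# Hypothetical helper: add a noindex header to empty search-result responses
# while leaving the page itself available to users.
def with_robots_header(headers, has_results):
    """Return a copy of the response headers, adding noindex when empty."""
    headers = dict(headers)
    if not has_results:
        headers["X-Robots-Tag"] = "noindex"
    return headers


print(with_robots_header({"Content-Type": "text/html"}, has_results=False))
```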
Can you have multiple 404 pages? Otherwise, redirecting an empty search results page to your 404 page could be quite confusing.
-
Hi mate,
I already added the following syntax to my website's robots.txt:
User-agent: *
Disallow: /search/
I have checked the dynamic pages/URLs produced by the search box (e.g. http://www.domain.com/search/jhjehfjehfefe), but they are still showing in Google, and there are still 1000+ soft 404s in my Google Webmaster Tools account.
I appreciate your help.
Thanks man!
-
I think if it is done carefully it adds quite a lot of value. A proper site taxonomy is obviously always better and more predictable.
-
I would never index or let Google crawl search pages - very dangerous ground.
-
I would do the following:
- For valid searches, create a proper canonical URL (and then decide whether you want index,follow or noindex,follow on the result pages). You might not necessarily want to index search results, but rather a structure of items/pages on your site.
- I would generally not index search results (rather have your pages being crawled through category structures, sitemaps and RSS feeds)
- It does sound, though, as if the way you implemented the search is wrong - it should not result in a soft 404. It could be as easy as making the canonical for your search just "/search" (without any search terms) and, if no results are found, displaying search-refinement options to the user.
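The "/search" canonical suggestion can be sketched in a few lines (the function is illustrative, assuming the URL layout from the question):

```python
# Hypothetical sketch: point every /search/<terms> URL's canonical at the
# bare /search page, so term variations don't create indexable soft 404s.
from urllib.parse import urlsplit


def canonical_for(url):
    """Return the canonical URL to emit in <link rel="canonical">."""
    parts = urlsplit(url)
    if parts.path.startswith("/search"):
        return "{}://{}/search".format(parts.scheme, parts.netloc)
    return url  # non-search pages are their own canonical


print(canonical_for("http://www.example.com/search/house-for-rent"))
# http://www.example.com/search
```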
The only time I have seen soft 404s on our side is where we removed product pages and then displayed a generic "product not available" page with some upselling options. In that case we set a status of 410 (Gone), which resolved the soft 404 issue.
The advantage of the 410 is that your application makes the decision that a page is gone, whereas a 404 could really just be a wrongly linked URL.
-
Yes - customize a 404 page; whenever your database doesn't have search results for a user's query, you can serve them that page.
Have you considered blocking the "search" results directory in robots.txt? Those pages are dynamic - they are not actually physical pages - so it's better to block them.
-
What do you mean by a default page? Is it a customized 404 page?
Thanks a lot man! I appreciate it.
-
Hi,
As per your URL, I think the best solution is to block the "search" directory in robots.txt; then Google will not be able to access those pages, so no errors in GWT. Or you can create a default page for queries which don't have any results in the database.