Do I need to redirect soft 404s that I got from Google Webmaster Tools?
-
Hi guys,
I got more than 1,000 soft 404s from GWT. All of these soft-404 URLs return a 200 HTTP status code, and they look like the following:
http://www.example.com/search/house-for-rent
(query used: house for rent)
http://www.example.com/search/-----------rent
(query used:-------rent)
There are no listings that match these queries, and an advanced search form is visible on these pages.
Here are my questions:
1. Do I need to redirect each page to its appropriate landing page?
2. Do I need to add a user sitemap or a list of URLs where visitors can search for other properties?
Any suggestions would help.
-
Thanks guys for your inputs. By the way, this issue was already resolved last year. Thanks again!
-
It depends on what you want to achieve. If the 404s are pages which no longer exist, then the fastest route is to use the GWMT removal tool to remove the page pattern and also add a noindex robots meta tag (or X-Robots-Tag header) to those pages; robots.txt itself cannot apply noindex. In addition, obviously return a 404.
A soft 404 is a case where content is not found but HTTP status 200 is returned; this needs to change if you currently serve non-existent pages with a 200.
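The fix described above can be sketched as follows. This is a minimal, hypothetical handler (not the poster's actual stack): the point is simply that the "no results" page must ship with a 404 status rather than a 200.

```python
# Minimal sketch: return a real 404 status when a search yields no
# results, instead of rendering the "no results" page with a 200
# (which is exactly what produces a soft 404).

def handle_search(query, listings):
    """Return (status_code, body) for a search request."""
    results = [item for item in listings if query.lower() in item.lower()]
    if results:
        return 200, "\n".join(results)
    # No matches: serve the "not found" page WITH a 404 status,
    # so crawlers do not flag the URL as a soft 404.
    return 404, "No listings match your search. Try the advanced search."
```

The body can still be a friendly page with refinement options; only the status line changes.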
We generally do the following:
- Content which we know no longer exists (e.g. a deleted product page or a deleted product category) is served with SC_GONE (410), and we provide cross-selling information (e.g. we display products from related categories). This works well, and we have seen a boost in indexed content.
- URLs which never existed go through a standard 404; this is intentional, as our monitoring will pick it up. If it is a legitimate 404 but the URL has SEO value, we will do a redirect if it makes sense, or just let Google drop it over time (this sometimes takes up to 4 weeks).
You can have multiple versions of the 404 page, but this needs to be coded: in your application server you would define a single 404 page which then programmatically displays content depending on what you want to do.
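The "one error handler, multiple 404 pages" idea above can be sketched like this. The paths and template names are purely illustrative, not from the original site:

```python
# Sketch of a single error handler that picks status and template
# per URL pattern (hypothetical paths; adapt to your application server).

REMOVED_PRODUCTS = {"/product/old-widget"}  # content known to be deleted

def error_response(path):
    """Pick (status, template) for a URL with no content."""
    if path in REMOVED_PRODUCTS:
        # Content we know is gone forever: 410 (SC_GONE),
        # shown with cross-selling suggestions.
        return 410, "product-gone-with-related-items.html"
    if path.startswith("/search/"):
        # Empty search result: 404 plus search-refinement options.
        return 404, "search-no-results.html"
    # Anything else (mistyped or wrongly linked URL): plain 404,
    # so monitoring picks it up.
    return 404, "generic-404.html"
```

The key design choice is that the application decides 410 vs 404: only the application knows whether a page was deliberately removed or simply never existed.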
-
I know I am way late to the party, but MagicDude4Eva, have you had success just putting a noindex header on the soft 404 pages?
That sounds like the easiest way to deal with this problem, if it works, especially since a lot of sites use dynamic URLs for product search that you don't want to de-index.
Can you have multiple 404 pages? Otherwise, redirecting an empty search-results page to your 404 page could be quite confusing.
-
Hi mate,
I already added the following syntax to my website's robots.txt:
User-agent: *
Disallow: /search/
I have checked the dynamic pages or URLs produced by the search box (e.g. http://www.domain.com/search/jhjehfjehfefe), but they are still showing in Google, and there are still 1000+ soft 404s in my Google Webmaster Tools account.
I appreciate your help.
Thanks man!
-
I think if it is done carefully it adds quite a lot of value. A proper site taxonomy is obviously always better and more predictable.
-
I would never index search pages or let Google crawl them: very dangerous ground.
-
I would do the following:
- For valid searches, create a proper canonical URL (and then decide whether to use index,follow or noindex,follow on the result pages). You might not necessarily want to index search results, but rather a structure of items/pages on your site.
- I would generally not index search results (rather have your pages being crawled through category structures, sitemaps and RSS feeds)
- It does sound, though, as if the way you implemented the search is wrong; it should not result in a soft 404. The fix could be as easy as making the canonical for your search just "/search" (without any search terms) and, if no results are found, displaying search-refinement options to the user.
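The canonical/noindex combination suggested above can be sketched as a small helper. This is an illustrative, hypothetical function; the tag values are the part that matters:

```python
# Sketch: point every search URL's canonical at the bare /search page
# and mark result pages noindex,follow, so link equity flows through
# the page without the search results themselves being indexed.

def search_head_tags(base_url):
    """Build the <head> tags for a search-results page."""
    canonical = f'<link rel="canonical" href="{base_url}/search">'
    robots = '<meta name="robots" content="noindex,follow">'
    return canonical + "\n" + robots
```

Note that this only works if crawlers can actually fetch the pages: blocking /search/ in robots.txt would prevent Google from ever seeing the noindex tag.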
The only time we have seen soft 404s is in cases where we removed product pages and then displayed a generic "product not available" page with some upselling options. In this case we set a status of 410 (GONE), which resolved the soft-404 issue.
The advantage of the 410 is that your application makes the decision that a page is gone, whereas a 404 could really be just a wrongly linked URL.
-
Yes, a customized 404 page: whenever your database doesn't have search results for a user's query, you can send them to that page.
Have you considered blocking the "search" results directory in robots.txt? Those pages are dynamic, not actual physical pages, so it's better to block them.
-
What do you mean by default page? Is it a customized 404 page?
Thanks a lot man! I appreciate it.
-
Hi,
As per your URL, I think the best solution is to block the "search" directory in robots.txt; then Google will not be able to access those pages, so no more errors in GWT. Or you can also create a default page for queries which don't have any results in the database.