200 for Site Visitors, 404 for Google (but possibly 200?)
-
A 2nd question we have about another site we're working with...
Currently, if a visitor to their site accesses a page that has no content in a section, the site shows a message saying that no information is currently available. The page returns a 200 status code to the user but a 404 to Google.
They are asking us whether it would be better to serve 200s to Google as well, and what impact that might have, considering different pages would be displaying the same 'no information here' message.
-
Thanks Mike - yes, I believe this only happens on results pages on their site.
Good point on the cloaking - good thing to think about as well.
Sounds like disallowing them in robots.txt is the first thing they should do; then they can remove the pages returning 404s, which they can manage through Google Webmaster Tools.
-
Ah... it's a search results page. Generally speaking, best practice for internal search results pages is to disallow them in robots.txt, since Google considers it undesirable to have search results appear in search results. What I'd really worry about here is that it could accidentally be viewed as cloaking, since you're serving Google something completely different from what you're serving human visitors. (Though a manual reviewer should see that you aren't doing it with malicious intent.)
Does this only happen on search results pages?
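For illustration, a typical disallow rule might look like the following. This assumes the results pages live under a /search/ path; the actual path depends on the site's URL structure, so adjust accordingly:

```text
User-agent: *
Disallow: /search/
```

Any URL whose path begins with /search/ would then be off-limits to compliant crawlers, including Googlebot.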
-
If it were me, I would serve up the 200, but any time a "no-content" page was served up under a different URL I would use a canonical tag to point Google to a standard /no-content page.
This is an easy way to tell Google, "Hey, these are all really the same page and serve the same purpose as /no-content. Please treat them as one page in your index, and don't count them as spammy variants."
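As a sketch, the tag in the head of each empty results page might look like this (example.com is a placeholder for the site's actual domain, and /no-content for whatever standard URL they choose):

```html
<link rel="canonical" href="http://www.example.com/no-content" />
```

Every "no content" variant would carry the same tag, so Google consolidates them under the one canonical URL.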
-
Thank you Mike. I was leaning towards your hypothesis and it's good to see you're thinking the same thing.
Here is an example page with information from one of their site developers; hoping this might help, as it appears it is not a custom 404 page.
If you disable JavaScript and set your user agent to Googlebot, you will get a 404.
http://bit.ly/1aoroMu
Any other insight you have would be most appreciated - thx!
-
Have you checked the HTTP header status code shown to users, and are you sure that it's not just a custom 404 page? Could you give a specific URL as an example?
If the page doesn't exist and only offers a small amount of information like that, then serving a 200 across the site when Googlebot visits would likely cause Google to view it as thin duplicate content or a soft 404. So a real 404, if it is in fact a 404, is the correct thing to do.
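As a quick sanity check, you can reproduce the user-agent mismatch programmatically. This is a minimal sketch: the local stub server below stands in for the live site (an assumption, since we don't know how their backend keys its response), and with the real site you'd point status_for at the actual URL instead.

```python
# Sketch: confirm that a URL serves different status codes to browsers
# and to Googlebot. The DemoHandler is a hypothetical stand-in that
# mimics the behavior reported in this thread.
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class DemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Mimic the reported behavior: 404 for Googlebot, 200 for everyone else
        ua = self.headers.get("User-Agent", "")
        self.send_response(404 if "Googlebot" in ua else 200)
        self.end_headers()
        self.wfile.write(b"No information currently available.")

    def log_message(self, *args):
        pass  # keep the demo quiet

def status_for(url, user_agent):
    """Return the HTTP status code served for the given User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        return urllib.request.urlopen(req).status
    except urllib.error.HTTPError as err:
        return err.code

# Spin up the stand-in server on an ephemeral port
server = HTTPServer(("127.0.0.1", 0), DemoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/results"

visitor_status = status_for(url, "Mozilla/5.0")
googlebot_status = status_for(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")
print(visitor_status, googlebot_status)  # → 200 404
server.shutdown()
```

Run against the real page, a matching pair of numbers would rule out user-agent-based serving, while a 200/404 split confirms exactly the cloaking-shaped behavior discussed above.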