200 for Site Visitors, 404 for Google (but possibly 200?)
-
A second question we have about another site we're working with...
Currently, if a visitor to their site accesses a page that has no content in a section, it shows a message saying that there is no information currently available; the page returns a 200 to the user but a 404 to Google.
They are asking us whether it would be better to return 200s to Google as well, and what impact that might have, considering there would be different pages displaying the same 'no information here' message.
-
Thanks Mike - yes, I believe this only happens on results pages on their site.
Good point on the cloaking - good thing to think about as well.
Sounds like disallowing them in robots.txt is the first thing they should do; then they can remove the pages that result in 404s, which they can manage through GWM.
-
Ah... it's a search results page. Generally speaking, best practice for internal search results pages is to disallow them in robots.txt, as Google usually considers it undesirable to have search results appear in search results. What I'd really worry about here is that it could accidentally be viewed as cloaking, since you're serving Google something completely different from what you're serving human visitors. (Though a manual reviewer should see that you aren't doing it with malicious intent.)
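As a rough sketch, assuming the internal search results live under a path like /search/ (a hypothetical pattern; substitute the site's actual results URL), the robots.txt rule would look something like:

```
User-agent: *
Disallow: /search/
```

Googlebot also honors wildcards, so query-string-driven results pages can be blocked with a pattern like Disallow: /*?q=.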
Does this only happen on search results pages?
-
If it were me, I would serve up the 200, but any time a "no-content" page was served up under a different URL I would use a canonical tag to point Google to a standard /no-content page.
This is an easy way to tell Google: "hey, these are all really the same page, and serve the same purpose as /no-content. Please treat them as one page in your index, and do not count them as spammy variants."
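For instance, assuming the shared page lives at /no-content (hypothetical path and domain), each empty-results URL would include this in its head:

```html
<link rel="canonical" href="http://www.example.com/no-content" />
```

Keep in mind the canonical tag is a hint rather than a directive, but since these pages really are near-duplicates, Google will usually honor it.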
-
Thank you Mike. I was leaning towards your hypothesis and it's good to see you're thinking the same thing.
Here is an example page, with information from one of their site developers; hoping this might help, as it appears it is not a custom 404 page.
If you disable JavaScript and set your user-agent to Googlebot, you will get a 404.
http://bit.ly/1aoroMu
Any other insight you have would be most appreciated - thx!
-
Have you checked the HTTP header status code shown to users, and are you sure that it's not just a custom 404 page? Could you give a specific URL as an example?
If the page doesn't exist and only offers a small amount of info like that, then making it a 200 across the site when Googlebot sees it would likely cause Google to view it as thin duplicate content or a soft 404. So a real 404, if it is in fact a 404, is the correct thing to do.
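One quick way to verify what each user-agent actually receives: a minimal sketch in Python using only the standard library, with the URL and user-agent strings as placeholder assumptions.

```python
import urllib.error
import urllib.request

URL = "http://www.example.com/some-empty-page"  # hypothetical URL; use the page in question

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def status_for(url, user_agent):
    """Fetch url with the given User-Agent header and return the HTTP status code."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(request) as response:
            return response.status
    except urllib.error.HTTPError as error:
        # 4xx/5xx responses raise HTTPError; the code on the error is the status we want
        return error.code

for name, agent in USER_AGENTS.items():
    print(f"{name}: HTTP {status_for(URL, agent)}")
```

If the googlebot request comes back 404 while the browser request comes back 200, that confirms the user-agent-based split described above.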