200 for Site Visitors, 404 for Google (but possibly 200?)
-
A 2nd question we have about another site we're working with...
Currently, if a visitor to their site accesses a page that has no content in a section, it shows a message saying that there is no information currently available; the page returns a 200 for the user but a 404 for Google.
They are asking us whether it would be better to change the pages to 200s for Google, and what impact that might have, considering there would be different pages displaying the same 'no information here' message.
-
Thanks Mike - yes, I believe this only happens on results pages on their site.
Good point on the cloaking - good thing to think about as well.
Sounds like disallowing in robots.txt is the first thing they should do; then they can remove the pages resulting in 404s, which they can then manage through GWM.
-
Ah... it's a search results page. Generally speaking, the best practice for internal search results pages is to disallow them in robots.txt, as Google usually considers it undesirable to have search results appear in search results. What I'd really worry about here is that it could accidentally be viewed as cloaking, since you're serving Google something completely different from what you're serving human visitors. (Though a manual reviewer should see that you aren't doing it with malicious intent.)
Does this only happen on search results pages?
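For reference, a disallow rule like the one suggested above might look like this (the /search path here is just a placeholder; it would need to match the site's actual results URL pattern):

```
User-agent: *
Disallow: /search
```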
-
If it were me, I would serve up the 200, but any time a "no-content" page was served up under a different URL I would use a canonical tag to point Google to a standard /no-content page.
This is an easy way to tell Google: "hey, these are all really the same page and serve the same purpose as /no-content. Please treat them as one page in your index, and do not count them as spammy variants."
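For example, each empty-results URL could carry a tag like this in its head (the example.com/no-content URL is a placeholder for whatever standard page the site sets up):

```html
<link rel="canonical" href="http://www.example.com/no-content" />
```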
-
Thank you Mike. I was leaning towards your hypothesis and it's good to see you're thinking the same thing.
Here is an example page with information from one of their site developers - hoping this might help as it appears it is not a custom 404 page.
If you disable JavaScript and set your user agent to Googlebot, you will get a 404.
http://bit.ly/1aoroMu
Any other insight you have would be most appreciated - thx!
-
Have you checked the HTTP header status code shown to users, and are you sure that it's not just a custom 404 page? Could you give a specific URL as an example?
If the page doesn't exist and only offers a small amount of info like that, then serving a 200 across the site when Googlebot sees it would likely cause Google to view it as duplicate thin content or a soft 404. So a real 404, if it is in fact a 404, is the correct thing to do.
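To make the concern concrete, here is a minimal, hypothetical sketch (in Python) of the kind of server-side logic being described, where the same URL returns different status codes depending on the user agent; this split is what can look like cloaking:

```python
def status_for(user_agent: str, has_results: bool) -> int:
    """Hypothetical status-code logic for the site's results pages."""
    if has_results:
        return 200
    # Empty results: human visitors get a friendly 200 page,
    # but Googlebot is served a 404 for the same URL.
    if "Googlebot" in user_agent:
        return 404
    return 200

# Same URL, same (empty) content, different status per user agent:
print(status_for("Mozilla/5.0", has_results=False))                                      # 200
print(status_for("Googlebot/2.1 (+http://www.google.com/bot.html)", has_results=False))  # 404
```

The fix being discussed is to make the response consistent: either a real 404 for everyone, or a 200 for everyone with a canonical pointing at a single no-content page.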