200 for Site Visitors, 404 for Google (but possibly 200?)
-
A second question we have, about another site we're working with...
Currently, if a visitor to their site accesses a page that has no content in a section, it shows a message saying that there is no information currently available. The page returns a 200 status code for the user but a 404 for Google.
They are asking us if it would be better to change the pages to 200s for Google as well, and what impact that might have, considering there would be many different pages displaying the same 'no information here' message.
-
Thanks Mike - yes, I believe this only happens on results pages on their site.
Good point on the cloaking risk - definitely something to think about as well.
Sounds like disallowing these pages in robots.txt is the first thing they should do; then they can remove the pages resulting in 404s, which they can manage through Google Webmaster Tools.
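For reference, a minimal robots.txt sketch for blocking internal search results - the /search/ path is an assumption here and would need to match wherever their results pages actually live:

User-agent: *
Disallow: /search/

Note that robots.txt only stops crawling; pages already in the index may still need to be removed via the URL removal tool in Google Webmaster Tools.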
-
Ah... it's a search results page. Generally speaking, best practice for internal search results pages is to disallow them in robots.txt, as Google usually considers it undesirable to have search results appear in search results. What I'd really worry about here is that it could accidentally be viewed as cloaking, since you're serving Google something completely different from what you're serving human visitors. (Though a manual reviewer should see that you aren't doing it with malicious intent.)
Does this only happen on search results pages?
-
If it were me, I would serve up the 200, but any time a "no-content" page was served under a different URL, I would use a canonical tag to point Google to a standard /no-content page.
This is an easy way to tell Google, "Hey, these are all really the same page and serve the same purpose as /no-content. Please treat them as one page in your index, and do not count them as spammy variants."
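As a rough sketch, that canonical tag would sit in the <head> of every empty-results URL and point at one standard page - the domain and path here are placeholders, not the client's actual URLs:

<link rel="canonical" href="https://www.example.com/no-content" />

With that in place, Google consolidates the indexing signals from all of the variants onto the single /no-content URL.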
-
Thank you, Mike. I was leaning towards your hypothesis, and it's good to see you're thinking the same thing.
Here is an example page, with information from one of their site developers - hoping this might help, as it appears it is not a custom 404 page.
If you disable JavaScript and set your user agent to Googlebot, you will get a 404.
http://bit.ly/1aoroMu
Any other insight you have would be most appreciated - thx!
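If anyone wants to reproduce that check without fiddling with browser settings, here is a minimal sketch in Python using the third-party requests library - the URL is a placeholder for the page in question:

# pip install requests
import requests

URL = "https://www.example.com/results-page"  # placeholder - substitute the real page

# Request the same page as a normal browser and as Googlebot,
# then compare the HTTP status codes the server returns.
user_agents = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in user_agents.items():
    response = requests.get(URL, headers={"User-Agent": ua})
    print(f"{name}: HTTP {response.status_code}")

If the two status codes differ (e.g. 200 for the browser and 404 for Googlebot), the server is doing exactly the user-agent-based switching described above.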
-
Have you checked the HTTP header status code shown to users, and are you sure that it's not just a custom 404 page? Could you give a specific URL as an example?
If the page doesn't exist and only offers a small amount of information like that, then serving a 200 across the site when Googlebot visits would likely cause Google to view it as duplicate thin content or a soft 404. So a real 404, if it is in fact a 404, is the correct thing to do.
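For clarity on the distinction: a "custom 404 page" shows friendly content to the visitor while still returning a real 404 status code, which is what separates it from a soft 404 (a 200 with no real content). A minimal hypothetical sketch in Python/Flask - their site almost certainly runs on a different stack, so this is purely illustrative:

# pip install flask
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def custom_not_found(error):
    # Friendly message for human visitors, but the status code is still
    # a genuine 404, so crawlers correctly treat the page as missing.
    return "Sorry, there is no information currently available.", 404

-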
Related Questions
-
Google Detecting Real Page as Soft 404 Error
We migrated our site from HTTP to HTTPS in September 2017, but after the migration I noticed soft 404 errors gradually increasing. Example of a soft 404 page: https://bit.ly/2xBjy4J. These soft 404 pages are real pages, but Google still detects them as soft 404s. When I check the Google cache, it shows me a cached copy, but of the HTTP page. We've tried all the solutions we could think of but are unable to figure out why Google is still indexing the HTTP pages and detecting the HTTPS pages as soft 404 errors. Can someone please suggest a solution or possible cause for this issue, or has anyone had the same issue in the past?
Intermediate & Advanced SEO | bheard
-
Completely redesigned website - set up a new site in Google Webmaster Tools, or keep the existing one?
Hi - our company just completely redesigned our website and went from a static HTML site to a PHP-based site, so every single URL has changed (around 1,500 pages). I put the same verification code into the new site and re-verified, but now Google is listing tons and tons of 404s. Some of them are really old pages that haven't existed in a long time; it would literally be impossible to create all the redirects for the 404s it's pulling. Question - when completely changing a site like this, should I have created a whole new Search Console property? Or did I do the right thing by using the existing one?
Intermediate & Advanced SEO | Jenny1
-
International Site Migration
Hi guys, we're in the process of launching an international ecommerce site (Magento CMS) for two different countries (Australia and the US), and will later expand to other countries like the UK, Canada, etc. The plan is for each country to have its own sub-folder, e.g. www.domain.com/us, www.domain.com.au/au, www.domain.com.au/uk. A lot of the content between these English-based countries is the same, e.g. the same product descriptions. So in order to prevent duplication, from what I've read, we will need to add hreflang tags to every single page on the site - one set of tags for the Australian pages and one for the United States pages. Just wanted to make sure this is the correct strategy (will hreflang prevent duplicate content issues?), and is there anything else I should be considering? Thank you, Chris
Intermediate & Advanced SEO | jayoliverwright
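For what it's worth, a minimal sketch of what those hreflang annotations typically look like in the <head> of each page - the URLs below are placeholders based on the sub-folder structure described above, not actual tags from the site:

<link rel="alternate" hreflang="en-au" href="https://www.domain.com/au/some-product" />
<link rel="alternate" hreflang="en-us" href="https://www.domain.com/us/some-product" />

Both tags go on both versions of the page (each page references itself and all of its alternates), which tells Google the pages are regional variants rather than duplicates.
-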
Why is this site not indexed by Google?
Hi all, and thanks for your help in advance. I've been asked to take a look at a site, http://www.yourdairygold.ie, as it currently does not appear for its brand name, Your Dairygold, on Google Ireland, even though it's been live for a few months now. I've checked all the usual issues, such as robots.txt (it doesn't have one) and robots meta tags (it doesn't have them). The even stranger thing is that the site does rank on Yahoo! and Bing. Google Webmaster Tools shows that Googlebot is crawling around 150 pages a day, but the total number of pages indexed is zero. It does appear if you carry out a site: search on Google, however. The site is very poorly optimised in terms of title tags, unnecessary redirects, etc., which I'm working on now, but I wondered if you guys had any further insights. Thanks again for your help.
Intermediate & Advanced SEO | iProspect-Ireland
-
Better SEO Option: 1 Site with 3 Subdomains, or 4 Separate Sites?
Hey Mozzers, I'm working with a client who wants to redo their web presence. They have a main website for the umbrella brand, and then 3 divisions that each have their own website as well. My question is: is it better to have the main site on the main domain and then have the 3 separate sites as subdomains? Or 4 different domains with a linking structure to tie them all together? To my understanding, option 1 would concentrate traffic on 1 domain, while option 2 would build Page Authority by having 4 different sites linking to each other. My guess would be option 2, but only if all 4 sites start getting enough relevant authority to make the links valuable; right out of the gate, option 1 might be more beneficial. A little advice/clarification would be great!
Intermediate & Advanced SEO | MonsterWeb28
-
So What On My Site Is Breaking The Google Guidelines?
I have a site that I'm trying to rank for the keyword "jigsaw puzzles". I was originally ranked around #60 or somewhere around there, and then all of a sudden my site stopped ranking for that keyword (my other keyword rankings stayed). I contacted Google via a reconsideration request and got the generic response... So I went through and deleted as many links as I could find that I thought Google may not have liked - heck, I even removed links that I don't think I should have, JUST so I could get this fixed. I responded with a list of all the links I removed, as well as any links that I tried to remove but couldn't for whatever reason. They are STILL saying my website is breaking the Google guidelines, mainly around links. Can anyone take a peek at my site and see if there's anything on it that may be breaking the guidelines? (Because I can't.) Website in question: http://www.yourjigsawpuzzles.co.uk UPDATE: Just to let everyone know that after multiple reconsideration requests, this penalty has been removed. They stated it was a manual penalty. I tried removing numerous different types of links, but they kept saying no, it was still breaking the rules. It wasn't until I removed some website directory links that they lifted the manual penalty. Thought it would be interesting for some of you guys.
Intermediate & Advanced SEO | RichardTaylor
-
Is Google taking longer to rank new sites?
We run a lot of "niche blogs" and websites focused on fairly non-competitive keywords. At the start of the year, we used to be able to put up websites and achieve almost instant rankings. Recently, however, it seems to be taking a lot longer for these sites to rank. It also seems to be taking longer for Google to index links. Is this a recent change at Google to protect against spam and help filter out lower-quality sites? Has anyone else noticed this, or is it just me?
Intermediate & Advanced SEO | ukss1984