200 for Site Visitors, 404 for Google (but possibly 200?)
-
A 2nd question we have about another site we're working with...
Currently, if a visitor to their site accesses a page that has no content in a section, it shows a message saying that no information is currently available. The page returns a 200 for the user, but a 404 for Google.
They are asking us whether it would be better to serve 200s to Google as well, and what impact that might have, considering different pages would be displaying the same 'no information here' message.
-
Thanks Mike - yes, I believe this only happens on results pages on their site.
Good point on the cloaking - good thing to think about as well.
Sounds like disallowing them in robots.txt is the first thing they should do; then they can remove the pages and manage the resulting 404s through GWM.
-
Ah... it's a search results page. Generally speaking, best practice for internal search results pages is to disallow them in robots.txt, as Google considers it undesirable to have search results appear in search results. What I'd really worry about here is that it could accidentally be viewed as cloaking, since you're serving Google something completely different from what you're serving human visitors. (Though a manual reviewer should see that you aren't doing it with malicious intent.)
Does this only happen on search results pages?
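For reference, a disallow rule for internal search results might look like the sketch below. The /search path here is hypothetical - adjust it to match the site's actual results URL pattern:

```
# robots.txt
User-agent: *
Disallow: /search
```

Keep in mind robots.txt blocks crawling, not indexing: URLs that are already indexed may linger until removed via Webmaster Tools.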
-
If it were me, I would serve up the 200, but any time a "no-content" page was served up under a different URL I would use a canonical tag to point Google to a standard /no-content page.
This is an easy way to tell Google, "Hey, these are all really the same page and serve the same purpose as /no-content. Please treat them as one page in your index, and do not count them as spammy variants."
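As a sketch, each "no-content" variant would then carry a tag like this in its head section (example.com and the /no-content URL are placeholders for the site's real pages):

```html
<!-- Served on every URL that shows the "no information here" message -->
<link rel="canonical" href="http://www.example.com/no-content" />
```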
-
Thank you Mike. I was leaning towards your hypothesis and it's good to see you're thinking the same thing.
Here is an example page with information from one of their site developers - hoping this might help, as it appears it is not a custom 404 page.
If you disable JavaScript and set your user agent to Googlebot, you will get a 404.
http://bit.ly/1aoroMu
Any other insight you have would be most appreciated - thx!
-
Have you checked the HTTP header status code shown to users, and are you sure that it's not just a custom 404 page? Could you give a specific URL as an example?
If the page doesn't exist and only offers a small amount of info like that, then returning a 200 across the site when Googlebot sees it would likely cause Google to view it as thin duplicate content or a soft 404. So a real 404, if it is in fact a 404, is the correct thing to do.
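One way to check the status code each user agent actually receives is to compare responses directly. Below is a minimal sketch using only the Python standard library; the example URL and user-agent strings are placeholders:

```python
import urllib.error
import urllib.request

def fetch_status(url: str, user_agent: str = "Mozilla/5.0") -> int:
    """Return the HTTP status code the server sends for this User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # urlopen raises on 4xx/5xx; the code is still what we want to inspect.
        return err.code

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

# Compare what a browser and Googlebot each receive for the same URL:
# fetch_status("http://www.example.com/results?q=widgets")                # browser UA
# fetch_status("http://www.example.com/results?q=widgets", GOOGLEBOT_UA)  # crawler UA
```

If the two calls return different codes (e.g. 200 vs. 404), you have confirmed the user-agent-dependent behaviour described above.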
Related Questions
-
Google WMT/Search Console: thousands of "links to your site" despite only one backlink from a website
Hi, I can see in my Search Console that a website is giving thousands of links to my site, when there is hardly one backlink from one of their pages to our page. Why is this happening? Here is a screenshot: http://imgur.com/a/VleUf
Intermediate & Advanced SEO | vtmoz
-
Site Migration of 4 sites into 1?
Hi Guys, I have a massive project involving a migration of 4 sites into 1. The 4 sites are: www.MainSite.com, www.E-commerce.com, www.Membership.com, www.ResearchStudy.com. The goal of this project is to have sites 1-4 regrouped into the Main Site. I will be following the best practices from this post, https://moz.com/blog/web-site-migration-guide-tips-for-seos, which has an awesome checklist. I am actually about to start Phase 3: URL redirect mapping. Because all of these sites have hundreds of duplicates, I figured I should first resolve the Main Site duplicate issues before creating the URL redirect mapping, but what about the other domains (2, 3, 4)? Should I first resolve the duplicate issues on those as well, or is that not necessary since they will be pointing to the Main Site's new domain? I want to make sure I don't overwork the programming team and myself. Thanks for sharing your expertise and any tips on how I should move forward with this.
Intermediate & Advanced SEO | Ideas-Money-Art
-
Is it a problem to use a 301 redirect to a 404 error page, instead of serving a 404 page directly?
We are building URLs dynamically with Apache rewrite. When we detect that a URL matches one of several valid patterns, we serve a script which may then detect that the combination of parameters in the URL does not exist. If this happens, we produce a 301 redirect to another URL which serves a 404 error page. So my doubt is the following: do I have to worry about not serving a 404 directly, but instead redirecting (301) to a 404 page? Will this lead to the erroneous original URL staying in the Google index longer than if I served a 404 directly? Some context: it is a site with about 200,000 web pages, and we currently have 90,000 404 errors reported in Webmaster Tools (even though only 600 were detected last month).
Intermediate & Advanced SEO | lcourse
-
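Regarding the 301-to-404 question above: the usual recommendation is to let the script return the 404 status on the requested URL itself, with no intermediate redirect. A minimal WSGI sketch of the direct approach, where the page_exists check and paths are hypothetical stand-ins for the real parameter lookup:

```python
def page_exists(path: str) -> bool:
    # Hypothetical lookup - in practice this would query the set of
    # valid URL parameter combinations.
    return path in {"/", "/products", "/about"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if not page_exists(path):
        # Serve the 404 directly on the requested URL, rather than
        # 301-redirecting to a separate error page first.
        start_response("404 Not Found",
                       [("Content-Type", "text/html; charset=utf-8")])
        return [b"<h1>Page not found</h1>"]
    start_response("200 OK",
                   [("Content-Type", "text/html; charset=utf-8")])
    return [b"<h1>Welcome</h1>"]
```

This way the erroneous URL itself reports 404 to Googlebot, so it can drop out of the index without the extra redirect hop.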
Homepage not ranking in Google AU, but ranking in Google UK?
Hey everyone, My homepage has not been ranking for its primary keyword in Google Australia for many months now. Yesterday, when I was using a UK proxy and searching via Google UK, I found my homepage/primary keyword ranked on page 8 in the UK. In Australia my website ranks on page 6, but for other pages on my website (and it always changes from page to page). Previously my page was popping up at the bottom of page 1 and page 2. I've been trying things and waiting weeks to see if they had any impact for over 4 months, but I'm pretty lost for ideas now, especially after what I saw yesterday in Google UK. I'd be very grateful if someone has had the same experience or has suggestions on what I should try. I did a small audit on my page, and because the site is focused on one product and features the primary keyword, I took the following steps to try and fix the issue: I noticed the developer had added H1 tags in many places on the homepage, so I removed them all to make sure I wasn't getting an over-optimization penalty. Cleaned up some of my links because I was not sure if this was the issue (I've never had a warning within Google Webmaster Tools). Changed the title tags/H tags on secondary pages not to feature the primary keyword as much. Made some pages 'noindex' to see if this would take away the emphasis on the secondary pages. Resubmitted my XML sitemaps to Google. Just recently claimed a local listing in Google (still need to verify) and fixed up citations of my address/phone numbers, etc. (however, it's not a local business - it sells Australia-wide). Added some new backlinks from AU sites (only a handful though). The only other option I can think of is to replace the name of the product on secondary pages with a different abbreviation, to make sure the keyword isn't featured there.
Some other notes on the site: When I do a 'site:url' search, my homepage comes up at the top. The site sometimes ranked for a secondary keyword on the front page in specific locations in Australia (but it goes to a localised city page); I've noindexed these as a test to see if something with localisation is messing it around. I have links from AU, but I also have links from .com and elsewhere. Any tips or advice would be fantastic. Thanks
Intermediate & Advanced SEO | AdaptDigital
-
Site revamp for neglected site - modifying site structure, URLs and content - is there an optimal approach?
A site I'm involved with, www.organicguide.com, was at one stage (long ago) performing reasonably well in the search engines, ranking highly for several keywords. The site has been neglected for a considerable period of time. A new group of people is interested in revamping the site: updating content, removing some of the existing content, and generally refreshing the site entirely. In order to go forward with the site, significant changes need to be made, which will likely involve moving the entire site across to WordPress. The directory software currently being used (edirectory.com) was not designed with SEO in mind, and as a result numerous similar pages of directory listings (all with similar titles and descriptions) are in Google's results, albeit with very weak PA. After reading many of the articles/blog posts here, I realize that a significant revamp and some serious SEO work are needed. So, I've joined this community to learn from those more experienced. Apart from doing 301 redirects for pages that we need to retain, is there any optimal way of removing/repairing the current URL structure as the site gets updated? Also, is it better to make changes all at once, or is an iterative approach preferred? Many thanks in advance for any responses/advice offered. Cheers MacRobbo
Intermediate & Advanced SEO | macrobbo
-
Mission Possible? You have 3 hours to do local SEO. Which top 5 sites do you use for social bookmarking, local search engine submission, and directory listings?
Mission Possible? Here is a test. Suppose you had 3 hours (okay, 7) to go and submit links, etc., to social bookmarking sites, local search engines, and directories - which top 5 or more of each would you do? (Assuming your on-page is already sweetened.) I just got 2 more clients and I need to get started on a few things for each. Thankful for all your advice.
Intermediate & Advanced SEO | greenhornet77
-
Why are our site's top landing pages URLs that no longer exist and return 404 errors?
Digging through analytics today, I noticed that our site's top landing pages are pages that were part of the old www.towelsrus.co.uk website, taken down almost 12 months ago. All these pages had 301 redirects, which were removed a few months back, but they still have not dropped out of Google's crawl error logs. I can't understand why this is happening, but the bounce rate on these pages (100%) almost certainly means we are losing potential conversions. How can I identify what keywords and links people are using to land on these pages?
Intermediate & Advanced SEO | Towelsrus
-
Press Release Sites
Ok, I am getting a lot of conflicting information about press release sites. I have been doing press releases for a while (mostly manually), and I have also tried a few companies that claim to do it well (they never do). After the Panda update, the PR sites I have been using are just not as effective. Does anyone else have this problem, or are there better PR sites that can be recommended?
Intermediate & Advanced SEO | TomBarker82