Do I need to disallow the dynamic pages in robots.txt?
-
Do I need to disallow the dynamic pages that show when people use our site's search box? Some of these pages are ranking well in SERPs. Thanks!
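For context, disallowing those dynamic search pages would be a one-rule robots.txt change. Here's a minimal sketch that verifies such a rule with Python's standard-library robotparser; the /search path and example.com URLs are hypothetical stand-ins for your site's actual search URL pattern:

```python
from urllib import robotparser

# Hypothetical rule set: assumes internal search results live under /search.
rules = """\
User-agent: *
Disallow: /search
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Search-result URLs would be blocked; product pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/search?q=red+widgets"))  # -> False
print(rp.can_fetch("*", "https://example.com/products/red-widget"))   # -> True
```

Note that a Disallow rule only stops crawling; pages already ranking can stay in the index, which is part of why blocking may not be the right fix here.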
-
The pages that produce soft 404 errors don't show any products at all, because visitors are searching for products we don't carry.
-
Yes, done that.
-
Just had a quick look at what Google says about them:
Here’s a list of steps to correct soft 404s to help both Google and your users:
- Check whether you have soft 404s listed in Webmaster Tools
- For the soft 404s, determine whether the URL:
  - Contains the correct content and properly returns a 200 response (not actually a soft 404)
  - Should 301 redirect to a more accurate URL
  - Doesn't exist and should return a 404 or 410 response
- Confirm that you’ve configured the proper HTTP Response by using Fetch as Googlebot in Webmaster Tools
- If you now return 404s, you may want to customize your 404 page to aid your users. Our custom 404 widget can help.
Have you followed these steps?
Andy
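The triage in those steps can be sketched as a small function, assuming you already know each URL's actual HTTP status (e.g. via Fetch as Googlebot) and whether it shows real content; the better_url parameter is a hypothetical input for the redirect case:

```python
def triage_soft_404(status_code, has_real_content, better_url=None):
    """Sketch of Google's soft-404 triage: map a URL's state to an action.

    status_code      -- what the server actually returns for the URL
    has_real_content -- whether the page shows genuine results/products
    better_url       -- a more accurate URL to 301 to, if one exists
    """
    if has_real_content and status_code == 200:
        return "keep: correct content, proper 200 (not actually a soft 404)"
    if better_url:
        return "301 redirect to " + better_url
    return "return a real 404 (or 410) status instead of 200"

# An empty search-results page currently served with a 200:
print(triage_soft_404(200, has_real_content=False))
# -> return a real 404 (or 410) status instead of 200
```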
-
These soft 404s return a 200 status code. We've already improved the pages shown when someone searches for a product that isn't in our catalogue, but Google Webmaster Tools still flags these dynamic pages as soft 404s.
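One way to stop serving 200s for empty results is to set the status code at response time. A minimal WSGI sketch of the idea; PRODUCTS and the q query parameter are hypothetical stand-ins for the real catalogue and search handler:

```python
from urllib.parse import parse_qs

PRODUCTS = {"red widget", "blue widget", "gadget"}  # hypothetical catalogue

def search_app(environ, start_response):
    """Return 200 with results, but a real 404 when the search matches nothing."""
    query = parse_qs(environ.get("QUERY_STRING", "")).get("q", [""])[0].lower()
    hits = sorted(p for p in PRODUCTS if query and query in p)
    if hits:
        start_response("200 OK", [("Content-Type", "text/plain")])
        return ["\n".join(hits).encode()]
    # No matching products: tell crawlers honestly instead of a soft 404.
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"No matching products."]
```

With the 404 in place, Google stops treating the empty result pages as soft 404s, while populated search pages keep their 200 and their rankings.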
-
Well, I would try to fix why they are returning soft 404s, as it would be a shame to block all results. Is this something you can do? Or is there a reason why just blocking is preferred?
-
Yeah, some of them produce soft 404s since there's no content at all, but some dynamic pages that rank well do show content.
Thanks,
JC
-
OK, so when you search, you get back dynamic pages that are producing soft 404s, but you still see those pages in the SERPs?
Just want to make sure I have this right.
-
I agree with Andy. Many of our search result pages rank well (and actually convert quite well). I don't think you need to disallow them unless the content doesn't exist, and even then you may still want to keep them up, since you may offer complementary products, etc.
-
The reason we want to block those pages is that they produce soft 404 errors. What should we do? Thanks, Andy.
-
If they are ranking well, what is the reason for wanting to block them?
Andy
Related Questions
-
Do I need to do 301s in this situation?
We have an e-commerce site built on Magento 2. We launched a few months ago and had about 2K categories. The categories got indexed in Google for the most part. Shortly after launch, we decided to go with SLI for search and navigation because the native search/navigation was too slow given our database. The navigation pages are now hosted by SLI, meaning the URLs have changed. I have done 301s for the most popular categories, but I didn't do 301s for all categories, as we have to go through each category one by one and map it to the correct navigation page. Our new category sitemap only lists the new SLI category URLs. Will the fact that we have not 301'd all of our former categories hurt us as far as SEO? Do I have to do 301 redirects for all former category pages?
Intermediate & Advanced SEO | kevin_h
-
Application & understanding of robots.txt
Hello Moz World! I have been reading up on robots.txt files, and I understand the basics. I am looking for a deeper understanding of when to deploy particular tags, and when a page should be disallowed because it will affect SEO. I have been working with a software company that has a News & Events page which I don't think should be indexed. It changes every week and is only relevant to potential customers who want to book a demo or attend an event, not so much to search engines. My initial thinking was that I should use a noindex/follow tag on that page, so the page would not be indexed but all the links would be crawled. I decided to look at some of our competitors' robots.txt files: Smartbear (https://smartbear.com/robots.txt), b2wsoftware (http://www.b2wsoftware.com/robots.txt) & labtech (http://www.labtechsoftware.com/robots.txt). I am still confused about what type of tags I should use, and how to gauge which set of tags is best for certain pages. I figured a static page is pretty much always good to index and follow, as long as it's public, and I should always include a sitemap file. But what about a dynamic page? What about pages that are out of date? Will this help with soft 404s? This is a long one, but I appreciate all of the expert insight. Thanks ahead of time for all of the awesome responses. Best Regards, Will H.
Intermediate & Advanced SEO | MarketingChimp10
-
Putting "noindex" on a page that's in an iframe... what will that mean for the parent page?
If I've got a page that is being called in an iframe, on my homepage, and I don't want that called page to be indexed.... so I put a noindex tag on the called page (but not on the homepage) what might that mean for the homepage? Nothing? Will Google, Bing, Yahoo, or anyone else, potentially see that as a noindex tag on my homepage?
Intermediate & Advanced SEO | Philip-DiPatrizio
-
2 pages lost PageRank and are not showing any backlinks in Google
Hi, we have a business/service-related website. Two of our main pages dropped in PageRank from 3 to 0 and are not showing any backlinks in Google. What could be the possible reason? Please guide me.
Intermediate & Advanced SEO | Tech_Ahead
-
Robots.txt error
I currently have this in my robots.txt file:
User-agent: *
Disallow: /authenticated/
Disallow: /css/
Disallow: /images/
Disallow: /js/
Disallow: /PayPal/
Disallow: /Reporting/
Disallow: /RegistrationComplete.aspx
WebMatrix 2.0
On Webmaster Tools > Health Check > Blocked URLs, I copy and paste the code above, click Test, and everything looks OK. But when I log out and log back in, I see the code below under Blocked URLs:
User-agent: *
Disallow: /
WebMatrix 2.0
Currently, Google doesn't index my domain and I don't understand why this is happening. Any ideas? Thanks, Seda
Intermediate & Advanced SEO | Rubix
-
Help needed for a 53 Page Internal Website Structure & Internal Linking
Hey all... I'm designing the structure for a website that has 53 pages. Can you take a look at the attached diagram and see if the website structure is OK? On the attached diagram I have numbered the pages from 1 to 53, with 1 being the most important home page; 2,3,4,5 being the next 4 important pages; 6,7,8...15,16,17 being the 3rd set of important pages; and 18,19,20...51,52,53 being the last set of pages, which are the easiest to rank. I have two questions: Is the website structure for this correct? I have made sure that all pages on the website are reachable. Considering the home page and pages 2,3,4,5 are the most important pages, I am linking out to these pages from the last set of pages (18,19,20...51,52,53). There are 36 pages in the last set, and out of these 36, from 24 of them I am linking back to the home page and pages 2,3,4,5. The remaining 8 pages of the 36 will link back to pages 6,7,8...15,16,17. In total, the most important pages will have the following number of internal incoming links: Home Page: 25; Pages 2,3,4,5: 25; Pages 6,7,8...15,16,17: 4; Pages 18,19,20...51,52,53: 1. Is this OK considering the home page and pages 2,3,4,5 are the most important? Or do you think I should divide and give more internal links to the other pages also? If you can share any inputs or suggestions on how I can improve this, it will greatly help me. Also, if you know any good guides to internal linking for websites with more than 50 pages, please share them in the answers. Thank you all! Regards, P.S - The URL for the image is at http://imgur.com/XqaK4
Intermediate & Advanced SEO | arjun.rajkumar81
-
Google is displaying my pages' path instead of URLs (page names)
Does anyone know why Google is displaying my pages' path instead of the URL in the search results? I discovered this while searching for one of my keywords; I copied the link http://www.smarttouch.me/services-saudi/web-services/web-design and found all related results are the same. Could anyone tell me why that is, and does it really make a difference? Is the URL display more important than the path display for SEO?
Intermediate & Advanced SEO | ali881
-
Category pages in forums
I would like to hear feedback on the best SEO practice for forum category pages. An example would be a forum about cars. You could have a Chevrolet category which contains forums for every Chevy model. Often this category page is simply a list of all the forums. If I noindex, follow the page, am I missing an opportunity? I am thinking of Google sitemaps, for example, where this page can be used for a category link. If I noindex the page, there probably isn't another great place for a sitemap to link to. I could fill out the page with wiki-like generic Chevy information. Please share any thoughts or best practices.
Intermediate & Advanced SEO | RyanKent