Do I need to disallow the dynamic pages in robots.txt?
-
Do I need to disallow the dynamic pages that show when people use our site's search box? Some of these pages are ranking well in SERPs. Thanks!
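For reference, "disallowing" those search result pages would mean a robots.txt rule along these lines (the /search/ path here is only an assumption — substitute whatever URL pattern your site's search box actually generates):

```
User-agent: *
Disallow: /search/
```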
-
The pages that produce soft 404 errors don't show any products at all, because people are searching for products that are not available on our site.
-
Yes, done that.
-
Just had a quick look at what Google says about them:
Here’s a list of steps to correct soft 404s to help both Google and your users:
- Check whether you have soft 404s listed in Webmaster Tools
- For the soft 404s, determine whether the URL:
  - Contains the correct content and properly returns a 200 response (not actually a soft 404)
  - Should 301 redirect to a more accurate URL
  - Doesn’t exist and should return a 404 or 410 response
- Confirm that you’ve configured the proper HTTP Response by using Fetch as Googlebot in Webmaster Tools
- If you now return 404s, you may want to customize your 404 page to aid your users. Our custom 404 widget can help.
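As a rough illustration of the "should return a 404" case, here is a minimal sketch (plain WSGI, stdlib only — the product list and URL scheme are made up) of a search page that returns a real 404 status when nothing matches, instead of a 200 with "no results" content:

```python
# Sketch: a search results page that sends a real 404 when no
# products match, avoiding the 200-with-empty-content soft 404.
# PRODUCTS and the query-string scheme are hypothetical.

from urllib.parse import parse_qs

PRODUCTS = {"blue widget", "red widget"}

def search_app(environ, start_response):
    # Pull the "q" parameter out of the query string.
    query = parse_qs(environ.get("QUERY_STRING", "")).get("q", [""])[0]
    matches = [p for p in PRODUCTS if query.lower() in p]
    if matches:
        body = "\n".join(matches).encode()
        start_response("200 OK", [("Content-Type", "text/plain")])
    else:
        # Still serve a helpful page for humans, but with a 404
        # status so crawlers don't treat it as a soft 404.
        body = b"Sorry, no products matched your search."
        start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [body]
```

The key point is simply that the status line, not the page copy, is what tells Googlebot the result doesn't exist.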
Have you followed these steps?
Andy
-
These soft 404s return a 200 status code. We've already improved the pages people see when they search for a product that isn't on our list, but Google Webmaster Tools still flags these dynamic pages as soft 404s.
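A quick way to double-check what status code those pages actually return (essentially what Fetch as Googlebot verifies) is to request them and read the status line. Here's a stdlib-only sketch, with a throwaway local server standing in for the real site:

```python
# Check the status code a URL actually returns. The local server
# below simulates a soft 404: "not found" copy, but a 200 status.

import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class SoftNotFoundHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # "No results" content served with a 200 status: a soft 404.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Sorry, no products matched your search.")

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), SoftNotFoundHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/search?q=missing"
with urllib.request.urlopen(url) as resp:
    print(resp.status)  # 200, even though the page says "not found"

server.shutdown()
```

Against a live site you'd point the same `urlopen` call (or curl) at the real search URL instead of the local stand-in.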
-
Well, I would try to fix whatever is causing them to be flagged as soft 404s, as it would be a shame to block all the results. Is this something you can do? Or is there a reason why just blocking is preferred?
-
Yeah, some of them produce soft 404s since there's no content at all, but some of the dynamic pages that rank well do show content.
Thanks,
JC
-
OK, so when you search, you get back dynamic pages that are flagged as soft 404s, yet you still see those pages in the SERPs?
Just want to make sure I have this right.
-
I agree with Andy. Many of our search result pages rank well (and actually convert quite well). I don't think you need to disallow them unless the content doesn't exist, and even then you may want to keep them up because you may offer complementary products, etc.
-
The reason we want to block those pages is that they produce soft 404 errors. What should we do? Thanks, Andy.
-
If they are ranking well, what is the reason for wanting to block them?
Andy