Do I need to disallow the dynamic pages in robots.txt?
-
Do I need to disallow the dynamic pages that are generated when people use our site's search box? Some of these pages are ranking well in the SERPs. Thanks!
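(For reference, the kind of robots.txt rule we're considering would look something like the sketch below; the /search path and the q parameter are placeholders, not our actual URL structure.)

```
# Illustrative only: blocking internal search result pages in robots.txt.
# Adjust the path and parameter name to match your own search URLs.
User-agent: *
Disallow: /search
Disallow: /*?q=
```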
-
The pages that produce soft 404 errors don't show any products at all, because visitors are searching for products that we don't have available.
-
Yes, done that.
-
Just having a quick look at what Google says about them:
Here’s a list of steps to correct soft 404s to help both Google and your users:
- Check whether you have soft 404s listed in Webmaster Tools
- For the soft 404s, determine whether the URL:
  - Contains the correct content and properly returns a 200 response (not actually a soft 404)
  - Should 301 redirect to a more accurate URL
  - Doesn’t exist and should return a 404 or 410 response (see the sketch after this list)
- Confirm that you’ve configured the proper HTTP response by using Fetch as Googlebot in Webmaster Tools
- If you now return 404s, you may want to customize your 404 page to aid your users. Our custom 404 widget can help.
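For the “doesn’t exist” case, here is a minimal sketch of how a dynamic search page can return a real 404 when there are no results (assuming a Python/Flask-style handler; the route, the find_products() lookup, and the template names are all illustrative, not your actual code):

```python
# Minimal sketch: return a genuine 404 status when a search has no results.
# find_products() and the template names are hypothetical placeholders.
from flask import Flask, render_template, request

app = Flask(__name__)

@app.route("/search")
def search():
    query = request.args.get("q", "")
    results = find_products(query)  # hypothetical product lookup
    if not results:
        # A real 404 status code tells Google the page has no content,
        # so it should stop being reported as a soft 404.
        return render_template("no_results.html", query=query), 404
    return render_template("results.html", results=results, query=query)
```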
Have you followed these steps?
Andy
-
These soft 404s return a 200 status code. We've already improved the pages shown when someone searches for a product that isn't on our list, but these dynamic pages are still flagged as soft 404s by Google Webmaster Tools.
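(A quick way to confirm the status code a page actually returns from the command line; the URL here is just a placeholder:)

```
curl -s -o /dev/null -w "%{http_code}\n" "https://www.example.com/search?q=unavailable-product"
```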
-
Well, I would try to fix why they are returning soft 404s, as it would be a shame to block all of those results. Is this something you can do? Or is there a reason why just blocking them is preferred?
-
Yeah, some of them produce soft 404s since there's no content at all, but some of the dynamic pages that rank well do show content.
Thanks,
JC
-
OK, so when you search, you get back dynamic pages that are producing soft 404s, but you still see those pages in the SERPs?
Just want to make sure I have this right.
-
I agree with Andy. Many of our search result pages rank well (and actually convert quite well). I don't think you need to disallow them unless it's for content that doesn't exist. Even then, you may still want to keep them up, since you may offer complementary products, etc.
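If you do want users to keep reaching the zero-result pages while keeping them out of the index, one alternative to blocking is a conditional noindex: emit a robots meta tag only when the search returns nothing. A sketch, purely illustrative:

```html
<!-- Output in the <head> only when the search returns zero results (illustrative). -->
<meta name="robots" content="noindex, follow">
```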
-
The reason we want to block those pages is that they produce soft 404 errors. What should we do? Thanks, Andy.
-
If they are ranking well, what is the reason for wanting to block them?
Andy