Do I need to disallow the dynamic pages in robots.txt?
-
Do I need to disallow the dynamic pages that show when people use our site's search box? Some of these pages are ranking well in SERPs. Thanks!
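For context, blocking them would mean adding rules along these lines to robots.txt (just a sketch; the /search path and the q parameter are placeholders for however our search result URLs are actually built):
```
# robots.txt sketch -- path and parameter names are placeholders, not our real URLs
User-agent: *
Disallow: /search
Disallow: /*?q=
```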
-
The pages that produce soft 404 errors don't show any products at all, because they appear when people search for products that aren't available.
-
Yes, done that.
-
Just had a quick look at what Google says about them:
Here’s a list of steps to correct soft 404s to help both Google and your users:
- Check whether you have soft 404s listed in Webmaster Tools
- For the soft 404s, determine whether the URL:
  - Contains the correct content and properly returns a 200 response (not actually a soft 404)
  - Should 301 redirect to a more accurate URL
  - Doesn’t exist and should return a 404 or 410 response
- Confirm that you’ve configured the proper HTTP Response by using Fetch as Googlebot in Webmaster Tools
- If you now return 404s, you may want to customize your 404 page to aid your users. Our custom 404 widget can help.
Have you followed these steps?
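If it helps, here's a quick way to spot-check what status code a URL actually returns, alongside Fetch as Googlebot (a minimal sketch using Python's requests library; the URLs are placeholders, not your real ones):
```python
# Spot-check the HTTP status codes that suspected soft-404 URLs really return.
# Requires: pip install requests
import requests

urls = [
    "http://www.example.com/search?q=discontinued-product",  # placeholder URLs
    "http://www.example.com/search?q=available-product",
]

for url in urls:
    # allow_redirects=False so a 301/302 is reported as itself, not as the final page
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", response.status_code)
```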
Andy
-
These soft 404s return a 200 status code. We've already improved the page people see when they search for a product that isn't on our list, but these dynamic pages are still flagged as soft 404s in Google Webmaster Tools.
-
Well, I would try to fix whatever is causing them to return soft 404s, as it would be a shame to block all of your search results. Is that something you can do? Or is there a reason why simply blocking them is preferred?
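The usual fix is to return a real 404 (or 410) whenever a search matches nothing, rather than a 200 with an empty template. A minimal sketch of the idea, assuming a Flask-style search handler (the route, catalogue, and helper below are invented purely for illustration):
```python
# Sketch: return a genuine 404 when an internal search matches no products,
# so the page stops being a "200 with no content" (a soft 404).
from flask import Flask, abort, request

app = Flask(__name__)

# Placeholder catalogue; in reality this would be a database query.
CATALOGUE = ["acoustic guitar", "electric guitar", "guitar strap"]

def find_products(query):
    return [p for p in CATALOGUE if query.lower() in p]

@app.route("/search")
def search():
    query = request.args.get("q", "")
    products = find_products(query)

    if not products:
        # No matching products: send a real 404 instead of an empty results page.
        abort(404)

    return ", ".join(products)

if __name__ == "__main__":
    app.run()
```
A 410 would work just as well if the missing products are gone for good; the point is simply that the status code should match what the page actually shows.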
-
Yeah, some of them produce soft 404s because there's no content at all, but other dynamic pages that rank well do show content.
Thanks,
JC
-
OK, so when you search, you get back dynamic pages that are producing soft 404s, but you still see those pages in the SERPs?
Just want to make sure I have this right.
-
I agree with Andy. Many of our search result pages rank well (and actually convert quite well). I don't think you need to disallow them unless the content genuinely doesn't exist, and even then you may still want to keep them up, because you may offer complementary products, etc.
-
The reason we want to block those pages is that they produce soft 404 errors. What should we do? Thanks, Andy.
-
If they are ranking well, what is the reason for wanting to block them?
Andy
Related Questions
-
What happens to crawled URLs subsequently blocked by robots.txt?
We have a very large store with 278,146 individual product pages. Since these are all various sizes and packaging quantities of fewer than 200 product categories, my feeling is that Google would be better off making sure our category pages are indexed. I would like to block all product pages via robots.txt until we are sure all category pages are indexed, then unblock them. Our product pages rarely change and have no ratings or product reviews, so there is little reason for a search engine to revisit a product page. The sales team is afraid that blocking a previously indexed product page will result in it being removed from the Google index, and would prefer to submit the categories by hand, 10 per day, via requested crawling. Which is the better practice?
Intermediate & Advanced SEO | AspenFasteners1 -
Is it good or bad to add noindex to empty pages that will get content dynamically after some days?
We have followers, following, friends, etc. pages for each user who creates an account on our website. When a new user signs up, he may have 0 followers, 0 following, and 0 friends, but over time those lists can grow. We have separate pages for followers, following, and friends, which Google is allowed to index. When a user doesn't have any followers/following/friends, those pages look empty and we get flagged for duplicate content and descriptions that are too short. So is it better to add noindex to those pages temporarily and remove the noindex tag once there are at least 2 or more people on them? What are the side effects of adding noindex when there is no data on those pages, and what are the benefits?
Intermediate & Advanced SEO | swapnil120 -
Robots.txt Allowed
Hello all. We want to block something that has the following at the end: http://www.domain.com/category/product/some+demo+-text-+example--writing+here So I was wondering whether /*example--writing+here would work?
Intermediate & Advanced SEO | ThomasHarvey0 -
How can I improve my website on-page and off-page?
My website is guitarcontrol.com, and I have very strong competition in my market. Please advise me on a list of improvements for my website regarding on-page SEO, link building, and social media. What can I do to improve my website's ranking?
Intermediate & Advanced SEO | zoe.wilson170 -
Should I eventually 301 a page on our site that "expires" to a related page that never expires, just to utilize the inbound link juice?
Our company gets inbound links from news websites that write stories about upcoming sporting events. The links we get point to our event/ticket inventory pages on our commerce site. Once the event has passed, that event page is basically a dead page that shows no ticket inventory and has no content. Also, each "event" page on our site has a unique URL, since it's an event that will eventually expire once the game gets played or the event has passed. Example of a URL that a news site would link to: mysite.com/tickets/soldier-field/t7493325/nfc-divisional-home-game-chicago-bears-vs-tbd-tickets.aspx Would there be any negative ramifications if I set up a 301 from the dead event page to another page on our site, one that is still somewhat related to the product in question, such as a landing page with content related to the team that just played or the venue they play in all season? For example, I would 301 to: mysite.com/venue/soldier-field-tickets.aspx (This would be a live page that never expires.) I don't know if that's manipulating things a bit too much.
Intermediate & Advanced SEO | Ticket_King1 -
Why does our business directions page rank above our business profile page?
Hi All, We are having an issue at the moment where our business directions page is ranking above the main business profile page. Our website is zodio.com, similar to Yelp but for South East Asia. An example of each page is below: Business Profile Page - http://www.zodio.com/business/detail/126037914/chowking Business Directions - http://www.zodio.com/business/direction/126037914 On many of our long-tail searches for particular businesses, the business directions rank above the business details. Does anyone have any idea why this would happen? I have researched Yelp and they do not have this issue. A few search examples in Google are as follows (one is in Thai): agonos dental clinic เวิลด์ชาร์มมิ่ง kawanku elektrik I have been racking my brain and searching for answers but cannot find anything. The community's help would be much appreciated. Many Thanks, Neil W
Intermediate & Advanced SEO | zodiothailand0 -
Why Is This Page Not Ranking?
Hi Mozzers, I can't rank (the page is nowhere on the Google grid that I can find) and I've not been able to move the needle at all on it. The page is http://www.lumber2.com/Western-Saddle-Pads-s/98.htm for the keyword "western saddle pads." I'm inclined to think I'm cannibalizing the category with the products, so I removed the word saddle from the majority of the product names on the page. However, saddle pad or saddle pads is in the meta title for most if not all of the products. Do you think I'm cannibalizing with the product titles, or is there something else going on? Thanks for any help.
Intermediate & Advanced SEO | AWCthreads0 -
Not sure why Home page is outranked by less optimized internal pages.
We launched our website just three weeks ago, and one of our primary keyword phrases is "e-business consultants". Here's what I don't get. Our home page is the page most optimized around this search phrase. Using the SEOmoz On-Page Optimization tool, the home page scores an "A". And yet it doesn't rank in the top 50 on Google Canada, although two other INTERNAL pages - www.ebusinessconsultants.ca/about/consulting-team/ & www.ebusinessconsultants.ca/about/consulting-approach/ - rank 5 & 6 on Google Canada, even though they only score a grade "C" for on-page optimization for this keyword phrase. I've always understood that the home page is the most powerful page. Why are these others outranking it? I checked the crawl and Google Webmaster, and there is no obvious problem on the home page. Is this because the site is so new? It goes against all previous experience I've had in similar situations. Any guidance/insight would be highly appreciated!
Intermediate & Advanced SEO | axelk0