Blocking Dynamic Search Result Pages From Google
-
Hi Mozzers,
I have a quick question that probably won't have just one solution.
Most of the pages that Moz flagged for duplicate content were dynamic search result pages on my site. Could this be a simple fix of just blocking these pages from Google altogether? Or would Moz then report these pages as critical crawl errors instead of content errors?
Ultimately, I considered whether I wanted to rank for these pages, but I don't think it's worth it given that I have multiple product pages that already rank well. In my case, the best option is probably to exclude these search pages, since they have a net negative impact on my site and generate more content errors than I'd like. So would blocking these pages from search engines and Moz be a good idea?
Maybe a second opinion would help: what do you think I should do? Is there another way to go about this and would blocking these pages do anything to reduce the number of content errors on my site?
I appreciate any feedback! Thanks!
Andrew
-
Hi,
So the main issue with dynamic search results is that they won't have unique content and will quite often duplicate other pages, as you've discovered. E.g. with "Products under £10", "Products under £20", and "Products under £30", the £20 page would include everything on the £10 page, and the £30 page would include the content of both.
The usual answer is to just noindex all of them, particularly if individual product pages are your focus and ranking well. Unless you specifically want to rank for 'x products under £10' then there's no issue with removing them from search results.
You have a couple of options for doing this: block crawling of the dynamic search URLs via robots.txt, or add a noindex,follow meta robots tag, which still allows crawling but keeps the pages out of search results. Note that these aren't interchangeable - robots.txt prevents crawling but doesn't guarantee removal from the index, and if a page is blocked from crawling, Google can't see a noindex tag on it. In general, I'd say noindexing the dynamic search results (matching them with a URL pattern in your templates) is the easiest and most effective route.
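To make the two options concrete, here's a minimal sketch of each. The `/search` path and `s=` parameter are assumptions - substitute whatever URL pattern your site's dynamic search pages actually use. A robots.txt rule blocking crawling of search URLs might look like:

```
# Block crawling of internal search result pages (path is an example)
User-agent: *
Disallow: /search
Disallow: /*?s=
```

The noindex,follow alternative goes in the `<head>` of each search result page template instead:

```
<!-- Keep search pages out of the index, but let link equity flow -->
<meta name="robots" content="noindex, follow">
```

Remember to use one or the other for a given page, not both: if robots.txt blocks the URL, crawlers never fetch the page and so never see the meta tag.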
If you did have a search page you wanted to rank, it'd be a case of setting up that specific result page with some unique content describing the category, etc.
Dan