How do I stop crawls of product review pages? (Volusion site)
-
Hi guys, I have a new Volusion website. The template we are using has its own product review page for EVERY product I sell (1,500+). When a customer purchases a product, a week later they receive a link back to review it. This link sends them to my site, but to its own individual page strictly for reviewing the product (as opposed to a site like Amazon, where you review the product on the same page as the actual listing).
**This is creating countless "duplicate content" and missing "title" errors. What is the most effective way to block a bot from crawling all these pages? Via robots.txt? A meta tag?**
Here's the catch: I do not have access to every individual review page, so I think it will need to be blocked by a robots.txt file? What code will I need to implement? Do I need to do this on the admin side of my site? Do I also have to do something on the Google Analytics side to tell Google about the crawl block?
Note: the individual URLs for these pages end with: *****.com/ReviewNew.asp?ProductCode=458VB
Can I create a block for all URLs that end with /ReviewNew.asp, etc.?
Thanks! Pardon my ignorance. Learning slowly, loving the Moz community.
-
No, you should be fine.
-
Thanks. When you say "update on 4/21", you're talking about Google's update requiring more mobile-friendly sites? My Volusion template has its own mobile version. It is not a responsive template. So I should not be affected, correct?
-
Parameters are good for pages that are the result of a search or sort. I guess it isn't really necessary; I am just a little OCD about that kind of stuff. The parameters in WMT basically tell Google that these things might appear in the URL, and then you can tell the bot to ignore them or let Googlebot decide how to read the URL. For example, for one of your review URLs ending in ReviewNew.asp?ProductCode=458VB, the parameter you would list is ProductCode.
The mobile site is not the same as a responsive design, and that is one of the main reasons I left Volusion. The mobile site will get you through the update on 4/21, but if possible you should ask them for a responsive site. Just call the support number or your account manager and ask.
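To give you a rough idea of the difference (this is just a generic illustration, not anything Volusion-specific): a mobile site is a separate set of pages, often on its own URLs, while a responsive template is one page whose layout adapts to the screen width with CSS, along the lines of:
<meta name="viewport" content="width=device-width, initial-scale=1">
/* in the stylesheet: collapse the product grid to one column on small screens */
@media (max-width: 480px) { .product-grid { width: 100%; float: none; } }
Same URL and same content for every device, which is why Google prefers the responsive approach.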
-
I've had the following in my robots.txt file. Do I need to add the asterisk like you have posted above?
Currently in my robots.txt:
User-agent: *
Disallow: /reviews.asp/
User-agent: *
Disallow: /reviewnew.asp/
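I'm guessing it would need to look more like this, with the wildcard (just my guess at the syntax based on what was posted above, so correct me if I'm wrong):
User-agent: *
Disallow: /*reviews.asp
Disallow: /*ReviewNew.asp
(I also notice my file has reviewnew.asp in lowercase while the actual URLs use ReviewNew.asp – not sure if the capitalization matters?)
-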
Thanks Monica, can you elaborate a bit more on the Webmaster Tools parameter? What specifically does adding a parameter like that do? Did you do that as a backup in case the robots.txt file was not working? We do have a mobile version enabled, which came with our template. I'll keep an eye out for the 404s. Where do I check for a responsive template? Ours is one of their premium templates, so is it possible we are already on a responsive one? Can you clarify what a responsive template means?
Thanks.
-
I did that in my Volusion store. I also added ReviewNew.asp?ProductCode= as a parameter in Google Webmaster Tools. Do you have a mobile site enabled as well? If you do, there are several 404 errors that you will start to see from there, so make sure you are adding parameters accordingly. I am not sure if Volusion has started offering their responsive templates yet, but if they have, I would see if you can implement that in place of the mobile site.
-
Hi,
Yes, you can block those URLs by adding the rule below to your robots.txt file:
User-agent: *
Disallow: /*ReviewNew.asp
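One caveat worth mentioning (general robots.txt behaviour, nothing Volusion-specific): Disallow stops Google from crawling those pages going forward, but review URLs that are already indexed can take a while to drop out. If your template lets you edit ReviewNew.asp itself, a meta robots tag in that template is the more direct way to get them deindexed:
<meta name="robots" content="noindex, follow">
(If you go that route, don't also block the pages in robots.txt, or Googlebot won't be able to see the tag.) Either way, you can check the Disallow pattern against one of your real review URLs with the robots.txt Tester in Webmaster Tools before relying on it.
Thanks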