How do I stop crawls of product review pages on a Volusion site?
-
Hi guys, I have a new Volusion website. The template we are using has its own product review page for EVERY product I sell (1,500+). A week after a customer purchases a product, they receive a link back to review it. This link sends them to my site, to an individual page strictly for reviewing the product (as opposed to a site like Amazon, where you review the product on the same page as the actual listing).
**This is creating countless "duplicate content" and missing "title" errors. What is the most effective way to block a bot from crawling all these pages? Via robots.txt? A meta tag?**
Here's the catch: I do not have access to every individual review page, so I think it will need to be blocked by a robots.txt file. What code will I need to implement? Do I do this on the admin side of the site? Do I also have to do something on the Google Analytics side to tell Google about the crawl block?
Note: the individual URLs for these pages end with: *****.com/ReviewNew.asp?ProductCode=458VB
Can I create a block for all URLs that end with /ReviewNew.asp, etc.?
Thanks! Pardon my ignorance. Learning slowly and loving the Moz community.
-
No, you should be fine.
-
Thanks. When you say "update on 4/21," are you talking about Google's update requiring more mobile-friendly sites? My Volusion template has its own mobile version; it is not a responsive template. So I should not be affected, correct?
-
Parameters are good for pages that are the result of a search or sort. I guess it isn't strictly necessary; I am just a little OCD about that kind of stuff. The parameters in WMT basically tell Google that these things might appear in the URL, and then you can tell the bot to ignore them or let Googlebot decide how to read the URL.
The mobile site is not the same as a responsive design, and that is one of the main reasons I left Volusion. The mobile site will get you through the update on 4/21, but if possible you should ask them for a responsive site. Just call the support number or your account manager and ask.
-
I've had the following in my robots.txt file. Do I need to add the asterisk like you have posted above?

Currently in my robots.txt:

```
User-agent: *
Disallow: /reviews.asp/

User-agent: *
Disallow: /reviewnew.asp/
```
-
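One thing worth checking about rules like these: robots.txt path matching is case-sensitive, and a trailing slash means the path must continue past the file name. So a lowercase `Disallow: /reviewnew.asp/` would not actually block `/ReviewNew.asp?ProductCode=458VB`. A quick sketch with Python's standard-library parser illustrates this (the `example.com` hostname is a placeholder):

```python
from urllib import robotparser

# Parse the rules as currently written in the robots.txt above.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /reviews.asp/",
    "Disallow: /reviewnew.asp/",
])

# Matching is a case-sensitive prefix test, so the lowercase rule with a
# trailing slash does NOT cover the real review URL:
print(rp.can_fetch("*", "https://example.com/ReviewNew.asp?ProductCode=458VB"))
# -> True (i.e. the review page is still crawlable)
```

In other words, the existing rules would not block the review pages as intended; the case and trailing slash need to match how Volusion actually serves the URLs.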
Thanks Monica, can you elaborate a bit more on the Webmaster Tools parameter? What specifically does adding a parameter like that do? Did you do that as a backup in case the robots.txt file wasn't working? We do have a mobile version enabled, which came with our template. I'll keep an eye out for the 404s. Where do I check for a responsive template? Ours is one of their premium templates, so it's possible we are already on a responsive one? Can you clarify what a responsive template means?
Thanks.
-
I did that in my Volusion store. I also added ReviewNew.asp?ProductCode= as a parameter in Google Webmaster Tools. Do you have a mobile site enabled as well? If you do, there are several 404 errors that you will start to see from there, so make sure you are adding parameters accordingly. I am not sure if Volusion has started offering their responsive templates yet, but if they have, I would see if you can implement that instead of the mobile site.
-
Hi,
Yes, you can block such URLs by adding the code below to your robots.txt file:

```
User-agent: *
Disallow: /*ReviewNew.asp
```

Thanks
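For anyone who wants to sanity-check a rule like this before deploying it, Python's standard-library `urllib.robotparser` can evaluate a robots.txt against sample URLs. One caveat: it only does plain prefix matching and does not understand Google's `*` wildcard, so this sketch uses the prefix form `Disallow: /ReviewNew.asp`, which is equivalent here because Volusion serves these pages from the site root (the `example.com` hostname is a placeholder):

```python
from urllib import robotparser

# Evaluate a candidate robots.txt without deploying it.  Note that
# urllib.robotparser only does prefix matching (no Googlebot-style "*"
# wildcards), so the rule is written in plain prefix form.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /ReviewNew.asp",
])

# The review pages are blocked for all crawlers...
print(rp.can_fetch("*", "https://example.com/ReviewNew.asp?ProductCode=458VB"))
# -> False

# ...while normal product pages stay crawlable.
print(rp.can_fetch("*", "https://example.com/ProductDetails.asp?ProductCode=458VB"))
# -> True
```

Keep in mind this only checks crawl blocking; pages already in the index may also need a noindex signal or removal request to drop out of search results.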