Hi. I have a question regarding a surge in 302 temporary redirect errors following reactivation of my site reviews.
-
Hi. I submitted the following question to the support team at Volusion regarding a large number of 302 temporary redirects in the SEOmoz crawl that followed the reactivation of product reviews on my website. Here are the question and their response. Any opinions? Thanks.
TRANSCRIPT:
Hi Howard,
Looking at the link you referenced, this page cannot be accessed unless you are logged into your site, so SEO isn't applicable for the page.
We would not be able to change the 302 redirect for this page.
Please don't hesitate to contact us if you have any further questions.
Thanks,
Meredith G.
Here for you 24x7x365, Volusion Support
Check out our online support: http://support.volusion.com
4 days ago (102 hrs ago) [8/19/2012 10:06:00 AM] by Customer
Hi. I had a report of 302 temporary redirects when SEOmoz did a simulated site crawl.
Here is an example of the address of one of the many 302 temporary redirect errors, which I think consume link juice or have some other negative effect on SEO:
http://www.mrgrabbar.com/reviewhelpful.asp?ProductCode=GT100&ID=562&yes=yes
Since it is a live link, I cannot seem to redirect it with a 301 redirect in the redirect manager. Is there anything I can do to handle these temporary redirects? They have only appeared since I reactivated my reviews, and I am concerned about the SEO effect this may soon have.
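For reference, one way to confirm what the crawler reported is to request the URL without following redirects; a minimal sketch, assuming Python's requests library:

```python
import requests

url = "http://www.mrgrabbar.com/reviewhelpful.asp?ProductCode=GT100&ID=562&yes=yes"
resp = requests.get(url, allow_redirects=False, timeout=10)
print(resp.status_code)               # a 302 here confirms the crawler's report
print(resp.headers.get("Location"))   # where it redirects (likely the login page)
```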
-
Hi Ben, where do you add the rel="nofollow"? I can't seem to find it.
-
I've got the same problem with the 302 redirects and the "review helpful?" issue in Volusion. Did you figure out how to make these links nofollow in Volusion?
-
Well, the nofollow tag doesn't consume any link juice, but it will save your "crawl allowance"; the point is to cut back on Google crawling unnecessary links and pages. And since all those "review" links point to a login page, that page can be seen as low-value.
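If you want to see which of these review links are still crawlable on a given page, here is a rough sketch (assuming Python with the requests and beautifulsoup4 libraries; the product-page URL is a placeholder) that lists review links carrying no nofollow:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder product-page URL; substitute one of your own pages.
page = "http://www.mrgrabbar.com/GT100.htm"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    rel = a.get("rel") or []              # bs4 returns rel as a list
    if "reviewhelpful.asp" in a["href"] and "nofollow" not in rel:
        print("crawlable review link:", a["href"])
```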
-
Thanks Cyrus. I thought I had read somewhere that the nofollow tag would consume link juice, in that the flow of link juice would be interrupted and not circulate back through the site, whereas noindex,follow would prevent indexing but conserve link juice.
I do not see a way to add tags to these particular pages in Volusion, and support at Volusion seems to think these pages will not be crawled anyway.
-
Without having much direct knowledge of Volusion, Ben appears to be correct. Since these URLs are generated by the "leave a review" links, you don't really want them followed anyway.
Where are you thinking of adding the "NOINDEX, FOLLOW" or canonical tags?
-
Thanks Ben. I am also looking into the possibility of using the "noindex,follow" tag or the canonical tag. What is your opinion on these, and would these meta tags be added in the meta tags override section? I actually do not see a way to add meta tags for the review helpful page within Volusion. I do see that a meta tag override can be added for "email a friend".
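In the meantime, here is a quick check (a sketch, assuming Python with the requests and beautifulsoup4 libraries) to see what the review helpful page currently emits for the robots meta tag and the canonical link, if anything:

```python
import requests
from bs4 import BeautifulSoup

url = "http://www.mrgrabbar.com/reviewhelpful.asp?ProductCode=GT100&ID=562&yes=yes"
# Note: this follows the 302, so it inspects whatever page finally renders.
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

robots = soup.find("meta", attrs={"name": "robots"})
canonical = soup.find("link", rel="canonical")
print("robots meta:", robots.get("content") if robots else "none")
print("canonical:", canonical.get("href") if canonical else "none")
```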
Thanks,
Howard
-
I have a client in Volusion as well. You need to go into the link on the page for leaving a review and add a rel="nofollow" attribute to the link. This will solve your 302s.
I did the same thing to the "email a friend" option on the product pages and it solved that issue.
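To illustrate the change (the markup below is a hypothetical stand-in for the actual Volusion template, and the sketch assumes Python with beautifulsoup4), this is the transformation you want on each review link:

```python
from bs4 import BeautifulSoup

# Hypothetical fragment standing in for the Volusion template markup;
# the real template will differ, but the transformation is the same.
html = '<a href="/reviewhelpful.asp?ProductCode=GT100&ID=562&yes=yes">Yes</a>'

soup = BeautifulSoup(html, "html.parser")
for a in soup.find_all("a", href=True):
    if "reviewhelpful.asp" in a["href"]:
        a["rel"] = "nofollow"   # adds rel="nofollow" to the link
print(soup)                      # <a href="..." rel="nofollow">Yes</a>
```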
Related Questions
-
Ecommerce site: are header tags compulsory through h6?
Hello experts! For the homepage, category, and product pages, are header tags required from h1 through h6? If not, the W3C validator will give me a warning and will never pass my site, so what should I do? Does Google say h1, h2, h3, etc. should appear in sequence only? As per my site design I cannot always put the h1 at the top; it may come after an h2 or h3, so is that a problem? Is only one h1 allowed per page, while multiple h2, h3, etc. on the same page are fine? On a listing page, is giving all 30 product names h3 tags fine, or is that not the correct way? For an ecommerce site, what are the best header tags you suggest? Is going down to h3 fine, or is h1 alone also fine? Do header tags help Google with crawling, or for ranking purposes? Or are header tags best for blogs rather than ecommerce sites? Thanks!
On-Page Optimization | Johny123450
-
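For a question like the one above, a quick way to inspect a page's actual heading outline is a short script; this sketch assumes Python with the requests and beautifulsoup4 libraries, and the URL is a placeholder:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL; point this at the page whose heading structure you want to see.
soup = BeautifulSoup(requests.get("http://example.com/", timeout=10).text, "html.parser")
for tag in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]):
    print(tag.name, tag.get_text(strip=True)[:60])   # level + first 60 chars of text
```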
Site is too slow? Seeing a new piece of code.
My site is too slow, and I am seeing a new piece of code more than 1,000 times on my home page. What should I do now? My site: http://a1stlucie.com/
On-Page Optimization | Beachflower0
-
Noindex, or noindex, nofollow?
Wondering if I could garner some views on this issue, please. I'm about to add an affiliate store to a website I own. The site has a couple of pages of unique content (blogs, articles, advice, etc. on home improvement, all written by my team). Obviously, the affiliate store will not be unique content; it will be built using the datafeeds from cj.com et al., so I don't want to get any duplicate-content-type penalties from Google for this store. Should I add a noindex to the pages and allow the bots to still crawl them, or should I add noindex and nofollow? Ideally I would like to get the affiliate store category pages indexed, as they will be a mixture of lots of different merchants and be fairly unique. Can Google still mark the site down for duplicate content if it can crawl it, even if it is noindexed? Thanks, Carl
On-Page Optimization | Grumpy_Carl0
-
Large Site - Advice on Subdomaining
I have a large news site with over 1 million pages (I have already deleted 1.5 million). Google buries many of our pages, and I'm ready to try subdomaining: http://bit.ly/dczF5y
There are two types of content: news from our contributors, and press releases. We have had contracts with the big press release companies going back to 2004/5. They push releases to us by FTP, or we pull from their server. These are then processed and published. It has taken me almost 18 months, but I have found and deleted or fixed all the duplicates I can find. There are now two duplicate-checking systems in place. One runs at the time the release comes in and handles most of them. The other runs every night after midnight and finds a few, which are then handled manually; this helps fine-tune the real-time checker. Businesses often link to their release on the site because they like us. Sometimes Google likes this, sometimes not.
The news we process is reviewed by one, two, or three editors before publishing. Some of the stories are 100% unique to us; some are from contributors who also contribute to other news sites.
Our search traffic is down by 80%. This has almost destroyed us, but I don't give up easily. As I said, I've done a lot of projects to try to fix this. Not one of them has done any good, so there is something Google doesn't like, and I haven't yet worked it out. A lot of people have looked and given me their ideas, and I've tried them, with zero effect.
Here is an interesting and possibly important piece of information: most of our pages are "buried" by Google. If I search, even for a headline, even if it is unique to us, quite often the page containing it will not appear in the SERP. The front page may show up, an index page may show up, another strong page may show up if that headline is in the top 10 stories for the day, but the page itself may not show up at all, until I go to the end of the results and redo the search with the "duplicates" included. Then it will usually show up on the front page, often in position #2 or #3.
According to Google, there are no manual actions against us. There are also no notices in WMT that say there is a problem we haven't fixed.
You may tell me to just delete all of the PRs, but those are there for business readers, as they always have been. Google supposedly wants us to build websites for readers, which we have always done. What they really mean is: build it the way we want you to, because we know best. What really peeves me is that there are other sites that they consistently rank above us, that have all the same content as us, and seem to be 100% aggregators, with ads, with nothing really redeeming them as being different. So this is (I think) inconsistent and confusing, and it doesn't help me work out what to do next.
Another thing we have is about 7,000+ US military stories, all the way back to 2005. We were one of the few news sites supporting the troops when it wasn't fashionable to do so. They were emailing the stories to us directly, most with photos. We published every one of them, and we still do. I'm not going to throw them under the bus, no matter what happens. There were some duplicates, some due to screwups because we had multiple editors who didn't see that a story was already published, and also, at one time, a system code race condition, entirely my fault; I am the programmer as well as the editor-in-chief. I believe I have fixed them all with redirects.
I haven't sent in a reconsideration for 14 months, since they said "No manual spam actions found". I don't see any point, unless you know something I don't.
So, having exhausted all of the things I can think of, I'm down to my last ideas:
1. Split all of the PRs off into subdomains (I'm ready to pull the trigger later this week).
2. Do what the other sites do, which I believe creates little value: show only a headline, a snippet, and some related info, and link back to the original page on the PR provider's website. (I really don't want to do this.)
3. Give up on the PRs and delete them all, losing another 50% of the income, which means releasing our remaining staff and upsetting all of the companies and people who linked to us (or finding them all and rewriting them as stories, tens of thousands of them), and also throwing all our alliances under the bus. (I really don't want to do this either.)
There is no guarantee this is the problem, but Google won't tell me, the Google forums are crap, and nobody else has given me an idea that has helped.
My thought is that splitting the PRs off into subdomains will have a number of effects:
1. Take most of the syndicated content off the main domain and onto subdomains.
2. Shake up the Domain Authority.
3. Create a million 301 redirects.
4. Make it obvious to the crawlers what is our news and what is PRs.
5. Make it easier for Google News to understand.
Here is what I plan to do:
1. Redirect all PRs to their own subdomain: pn.domain.com for PRNewswire releases, bw.domain.com for Businesswire releases, etc.
2. Fix all references so they use the new subdomain.
Here are my questions, and I hope you may see something I haven't considered:
1. Do you have any experience of doing this?
2. What was the result?
3. Any tips?
4. Should I put PR index pages on the subdomains too? I was originally planning to keep them on the main domain, with the individual page links pointing to the actual release on the subdomain. Obviously, I want them in only one place, but there are two types of these index pages: (a) all of the releases for a particular PR company, which could certainly be on the subdomain and not on the main domain; and (b) various category index pages (agriculture, supermarkets, mining, etc.), which would have to stay on the main domain because they are a mixture of different PR providers.
5. Is this a bad idea?
I'm almost out of ideas. Should I add a condensed list of everything I've done already? If you are still reading, thanks for hanging in.
On-Page Optimization | loopyal0
-
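For the subdomain split described in the question above, the 301 mapping could be sketched like this (the URL pattern and provider slugs are assumptions, not the poster's actual scheme):

```python
from urllib.parse import urlsplit, urlunsplit

# Assumed provider-to-subdomain slugs; the real scheme may differ.
PROVIDER_SUBDOMAINS = {"prnewswire": "pn", "businesswire": "bw"}

def new_location(old_url: str, provider: str) -> str:
    """Return the subdomain URL an old press-release URL should 301 to."""
    parts = urlsplit(old_url)
    sub = PROVIDER_SUBDOMAINS[provider]
    host = parts.netloc.replace("www.", "", 1)   # www.domain.com -> domain.com
    return urlunsplit((parts.scheme, f"{sub}.{host}", parts.path, parts.query, ""))

print(new_location("http://www.domain.com/pr/12345-example.html", "prnewswire"))
# -> http://pn.domain.com/pr/12345-example.html
```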
Page Review Please
Can you please tell me what I can do to improve the SEO of this page? I feel like I am seeing the forest and not the trees, or the other way around. I just want to learn from the errors of this page: http://azbestlistings.com/casa-grande-az-real-estate-homes-for-sale-in-casa-grande-az Thank you,
On-Page Optimization | sansonj0
-
Duplicate page content errors
My site was just crawled, and the report shows many duplicate pages but doesn't tell me which ones are duplicates of each other. For you experienced duplicate-page experts: do you have a subscription with Copyscape and pay $0.05 per test? What is the best way to clear these? Thanks in advance.
On-Page Optimization | joemas990
-
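For the duplicate-page question above, one cheap alternative to paying per Copyscape test, at least for finding which of your own pages duplicate each other, is to hash a normalized copy of each page's text; a sketch assuming Python with the requests and beautifulsoup4 libraries, with placeholder URLs:

```python
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

# Placeholder list; substitute the URLs flagged by your crawl report.
urls = ["http://example.com/a", "http://example.com/b"]

groups = defaultdict(list)
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    text = " ".join(soup.get_text().split()).lower()       # normalize whitespace/case
    groups[hashlib.md5(text.encode()).hexdigest()].append(url)

for digest, members in groups.items():
    if len(members) > 1:
        print("duplicates:", members)    # pages whose text hashes identically
```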
How to trace back a link from a 404 error?
How do I find the link that is still referring to a 404 error? Somewhere on my site there is a link pointing to an unusable page, and I can't find where that link is in order to fix the 404 error. I could just delete the page altogether, but I would like to find the problematic link first.
On-Page Optimization | AtoZion0
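For the 404-tracing question above, one common approach is to scan the web server's access log for hits on the dead URL and collect the Referer values; a sketch assuming an Apache-style combined log format, with a placeholder log path and URL:

```python
# Placeholder path of the page that returns the 404.
dead_path = "/old-page.html"

referrers = set()
with open("/var/log/apache2/access.log") as log:   # assumed log location
    for line in log:
        parts = line.split('"')
        # In the combined format, parts[1] is the request line
        # ("GET /path HTTP/1.1") and parts[3] is the Referer field.
        if len(parts) >= 4 and dead_path in parts[1]:
            referrers.add(parts[3])

print(referrers)   # pages (internal or external) still linking to the dead URL
```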