Nofollow and ecommerce cart/checkout pages
-
Hi!!
Another noob question:
Should I be nofollowing my site's cart and checkout pages? Or, since search engines can't get to the checkout pages without either logging in or completing a form, is it something I shouldn't worry about? I've read things saying both, so I'm not sure which is correct.
Thank you! Appreciate the help.
Lynn
-
Thank you James!! I really appreciate the insight and your patience.
Lynn
-
yes that's all correct.
-
On my site the only things accessible via HTTPS are the checkout pages and the my-account pages (or so I'm told - still testing). So for these I could mark them "noindex, nofollow", correct, since I don't really want Google to crawl these? And robots.txt could accomplish the same thing? (robots.txt may be easier for me, as it requires no dev time - I can't control the robots meta tag via the CMS.)
Thanks for the input!
Lynn
-
1. Yes.
2. Yes, robots.txt works too - there are numerous ways to get the same effect (one caveat: a URL blocked in robots.txt can't be crawled, but it can still show up in the index as a bare URL if other sites link to it, whereas noindex keeps the page out of the index itself). Personal preference comes into it, plus one may be easier than another in your site/CMS. The reason I use noindex is that any page on my site could be accessed over https, so I prefer to dynamically throw noindex into any page that is requested that way. See the sketch below for what the robots.txt route would look like.
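For illustration only (the real paths depend on your CMS - /cart/, /checkout/ and /my-account/ below are made-up examples), the robots.txt route is a block like this in the robots.txt file at the root of the site:

User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/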
-
Hello!
Thank you both for taking the time to answer. A couple of follow-up questions just so I understand:
1. "noindex, follow" will allow search engines to crawl a page but NOT put it in the index, correct?
2. Can't I also stop search engine access to certain directories/pages by putting an entry in robots.txt? This would stop crawling AND indexing, correct?
Why would one use one over the other? I just want to understand the idea behind it.
Thank you so much guys!!
Lynn
-
The safest route is to "noindex, follow" any page that is requested over https - this also squashes the duplicate content you'd otherwise get when a user reaches non-cart pages over https (see the tag below).
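Concretely (a sketch only - how the tag gets added depends on your CMS or server setup), the https version of a page would be served with this in its <head>:

<meta name="robots" content="noindex, follow">

while the http version is left untouched, so only one copy of each page stays indexable.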
-
Hey,
I'd 'noindex, nofollow' cart pages, as they're of no use to anyone searching and you're just going to dilute your authority across those extra pages.
DD
Related Questions
-
HTTPS for form pages?
I am creating a small business website for a friend in recruitment. It's very small and mainly just a shop window for the business. There's no login area on the website, but there are two areas where users can enter information: a general contact-us form (collecting an email address and phone number) and a job application form (attaching a resume). The forms use Ninja Forms, which I believe passes the information securely. But am I missing anything? Do I need to make these pages https at all? I'm quite new to building sites from scratch. Thanks for your help.
Technical SEO | joberts0
-
Spammy structured data for http://www.heritageprinting.com/ might be dropped from search results
We received the message above, which I see others may also have received. Before I go making hours of edits, can someone give me an opinion on what may need to be fixed? Here's a link to one of our products: http://heritageprinting.com/products/step-and-repeat.php. All products are uniquely marked up. It may be the $ dollar sign, but I'm not certain. Looking at WMT > Search Appearance > Structured Data, I see no errors for the schema markup. TY in advance :) KJr
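As a general point of reference (illustrative markup only, not taken from the site in question), a schema.org Offer normally carries the price as a bare number and puts the currency in its own property, rather than a "$" inside the price value, along these lines:

<div itemscope itemtype="http://schema.org/Offer">
  <meta itemprop="priceCurrency" content="USD" />
  <span itemprop="price">49.99</span>
</div>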
Technical SEO | KevnJr0
-
Page Speed or Size?
Hi everyone. I have a client who really wants to add a one-minute HTML5 video to the background of their homepage. I have managed to reduce the size of the video to 20 MB, and I have tested the page in Pingdom: it loads in 1.85 s and weighs in at 21.2 MB. My question is, does Google factor page load speed or page size into its ranking? I am also mindful of the negative effect this could have on bounce rate. Thanks.
Technical SEO | WillWatrous0
-
Does adding subcategory pages to an ecommerce site limit the link juice to the product pages?
I have a client who has an online outdoor gear company. He mostly sells high-end outdoor gear (ski jackets, vests, boots, etc.) at a deep discount. His store currently only resides on eBay, so we're building him an online store from scratch. I'm trying to determine the best site architecture and wonder if we should include subcategory pages. My issue is that subcategory pages might be good for the user experience, but they'd add an additional layer between the homepage and the product pages. A lot of users will probably be searching for a product name to see if they can find a better deal, and my client's site would be perfect for them, so I really want the product pages to rank well - but I'm nervous that the subcategory pages will limit the link juice reaching them.
Home --> Subcategory --> Product List --> Product Detail
Home --> Men's Ski Clothing --> Men's Ski Jackets --> North Face Mt Everest Jacket
Should I keep the subcategory page ("Men's Ski Clothing") if it helps usability? On a separate note, the subcategory pages would target some head keyword terms, but I don't think he could rank well for those terms anytime soon. However, they would be great pages/terms to rank for in the long term. Should this influence the decision?
Technical SEO | Santaur0
-
Can you use more than one Google local rich snippet on a single site / on a single page?
I am currently working on a website for a business that has multiple office locations. As I am trying to target all four locations, I was wondering if it is okay to have more than one local rich snippet on a single page. (For example, they list all four locations and addresses in their footer, and I was wondering if I could mark these up as local rich snippets.) What about having more than one on a single website? For example, if a company has offices in several different cities and has set up individual contact pages for those cities, can each page have its own local rich snippet? Will Google look at these multiple local rich snippets as spamming, or will it recognize the multiple locations and count them towards their local SEO?
Technical SEO | webdesignbarrie1
-
Page extensions
Hi guys, We're in the process of moving one of our sites to a newer version of the CMS. The new version doesn't support page extensions (.aspx), but we'll keep them for all existing pages (about 8,000) to avoid redirects. The technical team is wondering about the new pages - aside from usability, does it make any difference if they are created without extensions? Thanks!
Technical SEO | lgrozeva0
-
Ranked on Page 1, now between page 40-50... Please help!
My site, http://goo.gl/h0igI, was ranking on page one for many of our biggest keywords. All of a sudden, we completely fell off; I believe we're down somewhere between pages 40 and 50. I have no warning or error messages in Webmaster Tools. Can anyone please help me identify what the problem is? This is completely unexpected and I don't know how to fix it... Thanks in advance
Technical SEO | Prime850
-
Follow-up from http://www.seomoz.org/qa/discuss/52837/google-analytics
Ben, I have a follow-up question from our previous discussion at http://www.seomoz.org/qa/discuss/52837/google-analytics. To summarize, to implement what we need we have to do three things:
1. Add GA code to the Darden page:
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.darden.virginia.edu']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);
2. Change the "Apply Now" links on the Darden pages (e.g. on http://www.darden.virginia.edu/web/MBA-for-Executives/) from:
<a href="https://darden-admissions.symplicity.com/applicant">Apply Now</a>
to:
<a href="https://darden-admissions.symplicity.com/applicant" onclick="_gaq.push(['_link', 'https://darden-admissions.symplicity.com/applicant']); return false;">Apply Now</a>
3. Have Symplicity add this code:
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.symplicity.com']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);
Our CMS does not allow the user to add an onclick to the link, so we CANNOT implement part 2. What will be the result if we have only 1) and 3) implemented? Will the data still be fed to GA account 'UA-12345-1'? If not, how can we get cross-domain tracking if we cannot change the link code? Nick
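One possible workaround - a sketch only, assuming the classic async ga.js _gaq API shown above, that the CMS at least allows adding a standalone script block, and that the Apply Now anchors can be found by their href - is to attach the _link handler from a separate script instead of an inline onclick:

<script type="text/javascript">
// Bind the cross-domain _link call in JavaScript so the anchor markup in the CMS
// doesn't have to change. Run this after the links exist in the DOM
// (e.g. just before </body>).
(function () {
  var links = document.getElementsByTagName('a');
  for (var i = 0; i < links.length; i++) {
    if (links[i].href.indexOf('darden-admissions.symplicity.com') !== -1) {
      links[i].onclick = (function (url) {
        return function () {
          _gaq.push(['_link', url]); // appends the linker parameters and navigates
          return false;              // cancel default navigation so _link handles it
        };
      })(links[i].href);
    }
  }
})();
</script>

If even that isn't possible, then with only 1) and 3) in place the cookies typically won't carry across the two domains, so the Symplicity visit would usually start as a new session (often attributed as a referral from the Darden site) rather than being stitched to the original visit.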
Technical SEO | Darden0