Checkout on different domain
-
Is it a bad SEO move to have your checkout process on a separate domain instead of the main domain for an ecommerce site? There is no real content on the checkout pages, and they are completely new pages that are not indexed in the search engines. Due to the backend architecture, it is impossible for us to have them on the same domain.
An example is this page: http://www.printingforless.com/2/Brochure-Printing.html
One option we've discussed is to avoid passing PageRank to the checkout domain by iframing all of the links to it, as sketched below.
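For illustration, a rough sketch of that iframe approach, with placeholder URLs: the checkout link lives in a small page served from the checkout domain itself, so the product page carries no direct crawlable link that would pass PageRank.

<!-- Hypothetical embed on a product page; checkout.example.com is a placeholder.
     The anchor to checkout lives inside the framed document, not on this page. -->
<iframe src="https://checkout.example.com/cart-link?product=brochure-123"
        title="Checkout" width="220" height="60"></iframe>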
We could also move the checkout process to a subdomain instead of a new domain.
Please ignore the concerns with visitor security and conversion rate. Thanks!
-
In my opinion there isn't really any downside to this from a Google perspective; as you said, they shouldn't even be indexed anyway. Many vendors out there send their charge/fulfillment straight to PayPal, for example, and don't host any checkout-specific code (other than cart-building, account creation, etc.) on their site at all.
There's also the case where multiple microsites all use the same checkout on another domain to centralize checkouts. As far as I know these sites aren't punished either, and it definitely saves money on SSL certificates.
There is, however, another angle to consider: the human angle. Some people (who aren't savvy about ecommerce) might be alarmed that their secure checkout is occurring on a different domain than the one they've been browsing. This is a security/conversion-rate issue, though, so you may already be aware of it.
In my opinion I would leave it alone and not bother with the iframe tricks and so on. A subdomain might be more reassuring to the user (e.g. secure.printingforless.com instead of printingforless1.com), but I honestly can't see why the current setup would have Google implications, as long as your SSL/non-SSL pages are separate and canonicalized properly.
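For reference, canonicalizing properly here just means each page declares a single preferred URL, so the HTTP and HTTPS variants don't compete in the index. A minimal sketch, using the page above as a stand-in:

<!-- In the <head> of every variant (http://, https://, with/without www) of the page -->
<link rel="canonical" href="http://www.printingforless.com/2/Brochure-Printing.html">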
Related Questions
-
Domain Masking SEO Impact
I hope I am explaining this correctly; if I need to provide any clarity, please feel free to ask. We currently use a domain mask on an external platform that points back to our site. We are a non-profit, and the external site allows users to create peer-to-peer fundraisers that benefit our ministry. Currently we get many meta issues related to this site, as well as broken links when fundraisers expire, etc. We do not have a need to rank for the information from this site. Is there a way to keep these pages from being indexed, so that they are not a part of the search engine crawls as it relates to our site?
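The standard mechanism for that is a robots meta tag (or an equivalent X-Robots-Tag response header) on the masked pages; a minimal sketch, assuming you can edit the external platform's page templates:

<!-- In the <head> of each masked page that should stay out of the index -->
<meta name="robots" content="noindex, nofollow">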
Technical SEO | SamaritansPurse -
Images on subdomain fed from CDN
I have a client that uses a CDN to serve images from a subdomain (images.domain.com). We've made sure that the subdomain itself is not blocked. We've added a robots.txt file, we're creating an image sitemap file, and we've verified ownership of the domain within GWT. Yet any crawler that I use only sees the first page of the subdomain (which is .html) but none of the subsequent URLs, which are all .jpeg. Is there something simple I'm missing here?
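For what it's worth, crawlers generally only discover image URLs that are referenced from HTML they can fetch, which is where the image sitemap helps. A minimal entry, with placeholder URLs, using Google's image-sitemap extension:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.domain.com/gallery.html</loc>
    <image:image>
      <image:loc>http://images.domain.com/photo-1.jpeg</image:loc>
    </image:image>
  </url>
</urlset>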
Technical SEO | TammyWood -
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs on a domain. The paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can:
- Crawl, and list, all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file)
- Crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them)
Seems pretty simple, but we haven't been able to find something that isn't tailored toward management of a single domain or that can crawl a huge volume of content.
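Absent an off-the-shelf tool, a short script can cover the crawling half of this. Below is a minimal sketch in Python (standard library only, all names hypothetical) that follows every same-domain link from a start page and prints each URL it finds, including links to .pdf and .doc files. One caveat: a crawler can only list linked URLs; for genuinely indexed URLs you would need Google Search Console or a site: query.

# Minimal same-domain URL collector; a sketch, not a production crawler.
import urllib.request
from urllib.parse import urljoin, urlparse
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=5000):
    domain = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    while queue and len(seen) <= max_pages:
        url = queue.pop(0)
        # Record .pdf/.doc URLs but don't try to parse them as HTML.
        if url.lower().endswith((".pdf", ".doc", ".docx")):
            continue
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                if "text/html" not in resp.headers.get("Content-Type", ""):
                    continue
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip unreachable pages
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == domain and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return sorted(seen)

if __name__ == "__main__":
    for u in crawl("https://www.example.com/"):  # run once per domain
        print(u)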
Technical SEO | timfrick -
Moving my domain to Weebly
I am thinking of moving my HTML website to Weebly. They offer a 301 redirect for my domain name. Is that OK for SEO?
Technical SEO | bhsiao -
Handling Multiple Restaurants Under One Domain
We are working with a client that has 2 different restaurants. One has been established since 1938; the other was opened in late 2012. Currently, each site has its own domain name. From a marketing/branding perspective, we would like to make the customers [web visitors] of the established restaurant aware of the sister restaurant. To accomplish this, we are thinking about creating a landing page that links to each restaurant. To do this, we would need to purchase a brand new domain and then place each restaurant in a separate subfolder of the new domain. The other thought is to have each site accessed from the main new domain [within subfolders] and also point each existing URL to the appropriate subfolder for each restaurant. We know there are some branding and marketing hurdles with this approach that we need to think through/work out. But we are not sure how this would impact their SEO, and we assume it will not be good. Any thoughts on this topic would be greatly appreciated.
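For the second option, the usual mechanism is a 301 redirect from each old domain into its subfolder on the new one, preserving paths so existing links keep working. A sketch with entirely hypothetical names (Apache .htaccess on the old domain's hosting):

RewriteEngine On
# Send every URL on the 1938 restaurant's old domain to its
# subfolder on the new umbrella domain, keeping the path intact.
RewriteCond %{HTTP_HOST} ^(www\.)?oldrestaurant1938\.com$ [NC]
RewriteRule ^(.*)$ https://www.newumbrella.com/restaurant1938/$1 [R=301,L]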
Technical SEO | thinkcreativegroup -
Umbrella company and multiple domains
I'm really sorry for asking this question yet again. I have searched through previous answers but couldn't see anything exactly like this, I think. There is a website called example.com. It is a sort of umbrella company for 4 other separate domains within it: 4 separate companies. The home page of the "umbrella" company website is example.com. It is just an image with no content, except navigation on it to direct to the 4 company websites. The other pages of the website example.com are the 4 separate companies' domains. So on the navigation bar there is: Home page = example.com, company1page = company1domain.com, company2page = company2domain.com, etc. Clicking "home" will take you back to example.com (which is just an image). How bad or good is this structure for SEO? Would you recommend any changes to help them rank better? The "home" page has no authority or links, and neither do 3 out of the 4 other domains. The 4 companies' websites are independent in content (although the theme is the same). What's bringing them all together is this umbrella website, example.com. Thank you
Technical SEO | AL123al -
Any way around buying hosting for an old domain to 301 redirect to a new domain?
Howdy. I have just read this Q&A thread, so I think I have my answer, but I'm going to ask anyway! Basically, DomainA.com is being retired and DomainB.com is going to be launched. We're going to have to redirect numerous URLs from DomainA.com to DomainB.com. I think the way to go about this is to continue paying for hosting for DomainA.com, serving a .htaccess from that hosting account, and then hosting DomainB.com separately. Anybody know of a way to avoid paying for hosting a .htaccess file on DomainA.com? Thanks!
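For context, the .htaccess in question would look something like the sketch below (all paths hypothetical). Registrar-level domain forwarding is sometimes free, but it typically redirects the whole domain wholesale rather than remapping individual URLs, which is why the hosting account is usually kept.

# Sketch of DomainA.com's .htaccess; mod_alias processes these
# directives in order, so specific mappings go before the catch-all.
Redirect 301 /old-page.html https://www.domainb.com/new-page.html
Redirect 301 /products/ https://www.domainb.com/catalog/
# Anything not mapped above lands on the same path at the new domain:
RedirectMatch 301 ^(.*)$ https://www.domainb.com$1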
Technical SEO | SamTurri -
What is the best method to block a sub-domain, e.g. staging.domain.com, from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:

User-agent: *
Disallow: /

in fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9