Funnel tracking with one page check-out?
-
Hi Guys,
I'm creating a new website with a one-page checkout that follows these steps:
1. Check availability
2. Select product
3. Select additional products & add features
4. Provide personal information
5. Order & pay

I'm researching whether it is possible to track all these steps (and even sub-steps within them) with Google Analytics in order to analyse checkout abandonment. The problem is that my one-page checkout has only one URL (I want to keep it that way), so the steps can't be differentiated by URL in the Analytics funnel. The same button (in a floating cart) is also used to advance to the next step. The buttons to select/choose something within a single step are all different.
Do you guys know how I can set this up, and how detailed I can make it? For example, is it also possible to see at which field visitors leave when filling in their personal information?
Would be great if you could help me out!
-
The code Martijn suggested will account for this; essentially, it will act the same as if someone was going through a multi-URL checkout, including if/when they go back a step.
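A rough sketch of how that could look, assuming analytics.js is already loaded on the page (the path prefix and function name here are made up for illustration): fire the same virtual pageview on every step change, regardless of direction, so back-navigation is recorded too.

```javascript
// Hypothetical sketch: report every step change to GA as a virtual
// pageview, whether the visitor moves forward or backward.
var currentStep = 1;

function goToStep(step) {
  currentStep = step;
  var path = '/funnel/step-' + step;
  // Guard so the checkout still works if analytics.js failed to load.
  if (typeof ga === 'function') {
    ga('send', 'pageview', path);
  }
  return path; // returned so callers (or tests) can see what was sent
}
```

Because GA just sees ordinary pageview hits, a visitor stepping back from step 3 to step 2 simply shows up as another hit on /funnel/step-2, much like it would on a multi-URL checkout.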
-
Hi Martijn,
Thanks for your advice. The customer can also go back to a previous step and then forward again... Is it possible to integrate this as well?
-
'Same' thing: when a step has been completed successfully you could fire a new pageview like this: ga('send', 'pageview', '/funnel/step-1'); This pageview will be recorded in Google Analytics and can be used to your advantage.
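Building on that one-liner, a small step-to-path helper keeps the virtual URLs consistent across the funnel. This is only a sketch; the step names and the '/checkout/' prefix are assumptions, so name them however you like:

```javascript
// Map each checkout step to a virtual pageview path. The names and
// '/checkout/' prefix are illustrative, not required by GA.
var FUNNEL_STEPS = [
  'availability',
  'select-product',
  'additional-products',
  'personal-information',
  'order-and-pay'
];

function funnelPath(stepIndex) {
  // stepIndex is zero-based; GA just sees an ordinary page path.
  return '/checkout/step-' + (stepIndex + 1) + '-' + FUNNEL_STEPS[stepIndex];
}

function trackStep(stepIndex) {
  // Guard so the page still works if analytics.js failed to load.
  if (typeof ga === 'function') {
    ga('send', 'pageview', funnelPath(stepIndex));
  }
}

// Call trackStep(n) whenever the checkout advances to step n,
// e.g. trackStep(0) once the availability check succeeds.
```

These paths can then be entered as the steps of a Goal funnel in the GA admin, just as if they were real URLs.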
-
Hi Martijn,
We are using AJAX for the single URL.
-
If you are using JavaScript to hide the specific steps in the funnel, then you should also be able to fire a custom pageview for every step in the funnel, which you could use to set up funnel tracking in Google Analytics.
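For the field-level question, one possible approach is to send a GA event whenever a form field loses focus; the last field recorded before a visitor drops off is roughly where they stopped. The selector, category, and action names below are illustrative assumptions, not something from this thread:

```javascript
// Build the GA event payload for a field-abandonment event. Kept as a
// pure function so it is easy to reuse and test; the category/action
// labels are examples only.
function fieldEvent(fieldName) {
  return ['send', 'event', 'Checkout Form', 'field-completed', fieldName];
}

// Hypothetical wiring: attach a blur listener to every input in the
// personal-information step ('#personal-info input' is an example
// selector) and forward the event to GA via the supplied function.
function bindFieldTracking(doc, gaFn) {
  var fields = doc.querySelectorAll('#personal-info input');
  Array.prototype.forEach.call(fields, function (field) {
    field.addEventListener('blur', function () {
      gaFn.apply(null, fieldEvent(field.name));
    });
  });
}
```

In GA's Behavior reports you would then compare event counts per field name; a sharp drop between two fields suggests that is where visitors give up.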