Issue with GA tracking and Native AMP
-
Hi everyone,
We recently pushed a new version of our site (winefolly.com), which is now fully native AMP on WordPress (using the official AMP for WordPress plugin). As part of the same update, we also switched over to HTTPS. In hindsight, we probably should have rolled out the AMP conversion and the HTTPS migration as separate updates.
Since the update, traffic in GA has dropped significantly, even though the tracking code appears to be in place. I'm also having a hard time getting our previous GA views working again.
The three views are:
- Sitewide (shop.winefolly.com and winefolly.com)
- Content only (winefolly.com)
- Shop only (shop.winefolly.com)
The sitewide view seems to be working, though it's hard to know for sure, as traffic seems very low (around 10 active users at any given time), and I suspect it's mostly just picking up the shop traffic.
The content-only view shows maybe one or two users, and often none at all. I tried a number of different filters to restrict the view to the main site's content pages, but in one instance the filter would work, then half an hour later it would revert to showing no traffic. The filter is set to Custom > Exclude > Request URI with the following regex pattern:
^shop.winefolly.com$|^checkout.shopify.com$|/products/.|/account/.|/checkout/.|/collections/.|./orders/.|/cart|/account|/pages/.|/poll/.|/?mc_cid=.|/profile?.|/?u=.|/webstore/.
When I test the filter, it strips out anything not related to the main site's content, but when I save the filter and view the updated results, the changes aren't reflected. I did read that there is a delay before filters take effect, and that filter verification only runs against a subset of the available data, but I just want to be sure I'm adding the filters correctly.
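As a quick sanity check outside GA, here's a rough Python sketch that runs the exclude pattern against a few made-up sample request URIs (GA actually uses RE2, but these cases behave the same in Python's re). Two things stand out: a Request URI filter only ever sees the path, never the hostname, so the hostname alternatives at the start of the pattern can never match; and the unanchored, unescaped alternatives like /cart over-match:

```python
import re

# The exclude pattern from the GA view filter, copied verbatim.
pattern = re.compile(
    r"^shop.winefolly.com$|^checkout.shopify.com$|/products/.|/account/.|"
    r"/checkout/.|/collections/.|./orders/.|/cart|/account|/pages/.|"
    r"/poll/.|/?mc_cid=.|/profile?.|/?u=.|/webstore/."
)

# Hypothetical request URIs for illustration only. A GA Request URI
# filter sees just the path (e.g. "/cart"), never the hostname, so
# the ^shop.winefolly.com$ alternative never matches anything here.
samples = [
    "/",                    # homepage
    "/wine-basics/",        # content page
    "/products/wine-map",   # shop page
    "/cart",                # shop page
    "/cartoon-wine-maps",   # content page, but caught by the unanchored /cart
]
for uri in samples:
    verdict = "excluded" if pattern.search(uri) else "kept"
    print(f"{uri:24} {verdict}")
```

If the over-matching matters, the fix is to escape the literal dots and anchor the short alternatives (e.g. `/cart$` or `/cart/`).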
I also tried a predefined filter (Exclude > traffic to the hostname > that are equal to shop.winefolly.com), but that didn't work either.
The shop view seems to be working, but the tracking code is added via Shopify, so it makes sense that it would continue working as before.
The first thing I noticed when I checked the views is that they were still set to HTTP, so I updated the URLs to HTTPS. I then checked the GA tracking code (which is added as a JSON object in the Analytics settings of the WordPress plugin). Unfortunately, while GA does seem to be recording some traffic, none of the GA validators pick up the AMP tracking code (added using the amp-analytics tag), despite the plugin confirming the JSON as valid.
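For context, the config in the plugin follows the standard amp-analytics pattern for Google Analytics, roughly like this (the property ID below is a placeholder, not our real one):

```html
<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": {
      "account": "UA-XXXXXXX-Y"
    },
    "triggers": {
      "trackPageview": {
        "on": "visible",
        "request": "pageview"
      }
    }
  }
  </script>
</amp-analytics>
```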
This morning I decided to try a different approach and added the tracking code via Google Tag Manager, as well as adding the new HTTPS domain to Google Search Console, but alas, no change.
I spent the whole day yesterday reading every post I could find on the topic, but wasn't able to find a solution, so I'm really hoping someone on Moz will be able to shed some light on what I'm doing wrong.
Any suggestions or input would be very much appreciated.
Cheers,
Chris (on behalf of WineFolly.com) -
Lots going on here, so here's a laundry list of follow-up questions and thoughts for you...
Are you seeing AMP results showing up in the Search Console? Are you seeing them indexed as intended?
If you're doing native AMP, you won't be able to identify AMP pages by an /amp URL pattern. It might be worth firing off an event, or setting a custom dimension in GA, for something like AMP = Yes/No.
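As a sketch of how that custom dimension could look in amp-analytics (this assumes you've created a hit-scoped custom dimension at index 1 in GA Admin, which maps to the cd1 parameter; the property ID is a placeholder):

```html
<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": { "account": "UA-XXXXXXX-Y" },
    "extraUrlParams": {
      "cd1": "amp"
    },
    "triggers": {
      "trackPageview": { "on": "visible", "request": "pageview" }
    }
  }
  </script>
</amp-analytics>
```

extraUrlParams appends the parameter to every hit from AMP pages, so you could then segment AMP vs. non-AMP traffic in GA reports.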
For the sitewide view, have you tested loading pages in a private browser and an incognito mobile browser, and checking whether they show up in GA Real-Time in each of the three views when they're supposed to?
It looks like you might be using Cloudflare - I haven't dealt with an AMP site that uses it, but have you checked whether there are compatibility issues or anything you need to activate?
Are any Google Tag Manager tags set to fire on HTTPS only?
Are any GA filters in place that specify HTTP/HTTPS that need to be broadened?
Your amp-analytics code seems to match the one on a site that is functioning as intended, so I don't think it's a formatting issue.
For the GA view filter - it seems like you should be able to simply include/exclude traffic to shop.winefolly.com - why the added complexity beyond that?