Issue with GA tracking and Native AMP
-
Hi everyone,
We recently pushed a new version of our site (winefolly.com), which is fully native AMP on WordPress (using the official AMP for WordPress plugin). As part of the update, we also switched over to HTTPS. In hindsight, we probably should have rolled out the native AMP and HTTPS changes as separate updates.
Since the update, traffic in GA has dropped significantly, even though the tracking code appears to be installed correctly. I'm also having a hard time getting our existing GA views working properly.
The three views are:
- Sitewide (shop.winefolly.com and winefolly.com)
- Content only (winefolly.com)
- Shop only (shop.winefolly.com)
The sitewide view seems to be working, though it's hard to know for sure: traffic is quite low (around 10 users at any given time), and I suspect it's mostly just picking up the shop traffic.
The content-only view shows maybe one or two users, and often none at all. I tried a number of different filters to track only the main site's content views, but in one instance the filter would work and then, half an hour later, it would revert to showing no traffic. The filter is set to Custom > Exclude > Request URI with the following regex pattern:
^shop.winefolly.com$|^checkout.shopify.com$|/products/.|/account/.|/checkout/.|/collections/.|./orders/.|/cart|/account|/pages/.|/poll/.|/?mc_cid=.|/profile?.|/?u=.|/webstore/.
When I verify the filter, it strips out everything not related to the main site's content, but when I save the filter and view the updated results, the changes aren't reflected. I did read that there's a delay before filters are applied and that filter verification only runs against a subset of the available data, but I just want to be sure I'm adding the filters correctly.
I also tried a predefined filter set to exclude traffic to the hostname equal to shop.winefolly.com, but that didn't work either.
The shop view seems to be working, but the tracking code is added via Shopify, so it makes sense that it would continue working as before.
The first thing I noticed when I checked the views is that they were still set to HTTP, so I updated the URLs to HTTPS. I then checked the GA tracking code (which is added as a JSON object in the Analytics setting of the AMP WordPress plugin). Unfortunately, while GA seems to be recording some traffic, none of the GA validators seem to pick up the AMP tracking code (added via the amp-analytics tag), despite the plugin confirming the JSON is valid.
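For reference, the config in question follows the standard amp-analytics pattern for Google Analytics, which (with a placeholder property ID) looks roughly like this. As far as I can tell, the plugin setting takes just the JSON portion and the surrounding tag is what ends up in the rendered page:

<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": {
      "account": "UA-XXXXXXX-Y"
    },
    "triggers": {
      "trackPageview": {
        "on": "visible",
        "request": "pageview"
      }
    }
  }
  </script>
</amp-analytics>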
This morning I decided to try a different approach and added the tracking code via Google Tag Manager, as well as adding the new HTTPS domain to Google Search Console, but alas, no change.
I spent the whole day yesterday reading every post I could find on the topic, but wasn't able to find a solution, so I'm really hoping someone on Moz can shed some light on what I'm doing wrong.
Any suggestions or input would be very much appreciated.
Cheers,
Chris (on behalf of WineFolly.com) -
Lots going on here, so here's a laundry list of follow-up questions and thoughts for you:
Are you seeing AMP results showing up in the Search Console? Are you seeing them indexed as intended?
If you're running native AMP, you won't be able to identify AMP pages by an /amp URL pattern. It might be worth firing an event, or setting a custom dimension in GA, for AMP = yes/no or something along those lines (a rough sketch of one way to do that is below this list).
For the sitewide view, have you tested loading pages in a private desktop browser and an incognito mobile browser, and checked whether they show up in GA Real-Time in each of the three views when they're supposed to?
It looks like you might be using Cloudflare - I haven't dealt with an AMP site that uses it, but have you checked whether there are compatibility issues or anything you need to activate?
Are any Google Tag Manager tags set to fire on HTTPS pages only?
Are any GA filters in place that specify HTTP/HTTPS that need to be broadened?
Your amp-analytics code seems to match what's on a site that is functioning as intended, so I don't think it's a formatting issue.
For the GA view filter, it seems like you should be able to simply include or exclude traffic to the shop.winefolly.com hostname; why the added complexity beyond that?
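For example, on the content-only view a single custom exclude filter on the Hostname field should be enough to drop the shop traffic. A sketch of what I mean (not tested against your property, so treat it as a starting point):

Filter Type: Custom > Exclude
Filter Field: Hostname
Filter Pattern: ^shop\.winefolly\.com$

The shop-only view would just invert that to an include filter on the same pattern, and the sitewide view shouldn't need a hostname filter at all.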
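And on the AMP = yes/no idea above: amp-analytics supports an extraUrlParams block that appends extra parameters to every hit it sends. Assuming you've created a custom dimension at index 1 in the GA property (the index here is just a placeholder), adding something like this to the JSON config would tag every AMP hit:

"extraUrlParams": {
  "cd1": "amp"
}

Any non-AMP pages (the Shopify shop, in your case) would set the same dimension through their own tracker, which then lets you segment AMP vs. non-AMP traffic in GA.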