
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Reporting & Analytics

Discuss the best ways to report on performance and communicate results to stakeholders.


  • Hi Moz community! I'll try to make this question as easy to understand as possible, but please excuse me if it isn't clear. I joined a new team a few months ago and found out that on some of our most popular pages we use "custom URLs" to track page metrics within Google Analytics. (I say "custom URLs" because that is the best way I can describe them.)
    As an example, this page exists for our users: http://usnews.rankingsandreviews.com/cars-trucks/Ram_HD/2012/photos-interior/ But this is the URL we have coded on the page: cars-trucks/used-cars/reviews/2012-Ram-HD/photos-interior/ (within the custom variable script labeled "var l_tracker="). It is this custom URL that we use within GA to look up metrics for this page, and it is just one example of many across our site set up the same way. A second example: the page available to the user is http://usnews.rankingsandreviews.com/cars-trucks/Cadillac_ATS/2015/ and the custom "var l_tracker=" value is /cars-trucks/2015-Cadillac-ATS/overview/
    NOTE: There is some fear that the above method was implemented years ago as a workaround for a poorly structured URL architecture. That hasn't been validated, but the question has come up.
    Main questions: Is the above implementation a normal, commonly used method to track pages in GA? (Coming from an Omniture company before, this is not how we handled page-level tracking.) Team members at my current company are divided on this method: some believe it is not a proper implementation and are concerned that trying to hide these URLs from Google will raise red flags (i.e. fake URLs in general = bad), and I cannot find any reference to this method anywhere on the InterWebs. If the method is not normal, any recommendations on a solution to address this? Potential problems? Google Search Console is currently cataloguing these tracking URLs in the Crawl Errors report - any concerns about this?
    The team wants to hide the URLs in the robots.txt file, but some team members are concerned this may raise a red flag with Google and hurt us more than help us. Thank you in advance for any insight and/or advice. Chris

    | usnseomoz
    0

  • Using custom dimensions, our site tracks logged-in vs. not-logged-in users on our shop site (shop.example.com). A user can only log in on shop.example.com. The problem is that when a logged-in user visits our main domain (example.com), we lose the custom dimension and the user becomes anonymous. At that point, if the user goes back to shop.example.com, the logged-in custom dimension comes back. How do I preserve the logged-in user dimension across the main domain? Example of snippet:

    | Evan34
    0

  • For some pre-reading, Groupon experimented with deindexing part of their site back in 2014 and found that they had a big drop in direct traffic: http://searchengineland.com/60-direct-traffic-actually-seo-195415 Recently I've conducted a similar test on a very large site and have found that Direct dropped somewhere between 50-80% for the pages deindexed, depending on the day (It has been running for ~4 days). Has anyone else tried this experiment before? If so what were your results?
    What I'm most interested in now is to find out why this traffic is classed as Direct - there appears to be no reason why the site or server would interfere, so I can only imagine that Google is sending traffic without referrer data. Does anyone have any thoughts as to why they would do that?

    | Leads.Bz
    0

  • Hi There, Wondering if anyone has any other tools they would recommend using for finding out keyword traffic on websites. Currently (and I'm sure like most), my website is connected to Google Analytics and Google Search Console. My biggest frustration becomes the "(not set)" variable that appears when I go to review the keywords section. It's always such a large number and I have no way of finding out what people might be typing in and coming across my website. Of course, I understand the privacy factor as to why Google must do this but it's certainly difficult to analyze what's working and what's not. Any tips, tricks or suggestions are greatly appreciated! Thanks, Lindsay

    | MainstreamMktg
    0

  • Hello Moz Community, A while back I noticed in Google Search Console that the volume of impressions and clicks dropped off a cliff. This was for primary head keywords that generally had a decent volume of impression share. My natural initial reaction was 'oh, I must have lost those rankings', but upon checking I realized the rankings still exist. Admittedly most are on page 2 or 3, but still within the confines of being captured in GSC. Is there a logical reason why these keywords have just vanished from Search Console? An example keyword would be something like 'online football management game' and the website is https://www.worldelitesoccer.com There are zero queries in my Search Console data that include the word 'football'. Thanks, Ben

    | melaniedsg
    0

  • Hey Mozzers, We are improving our B2B site by adding product codes to headings, meta information, etc. to gain some traction ranking for our own products and those supplied by others when searched for by product code. Almost immediately we are hitting the top half of the first page for most of these and seeing some nice results. We would like to track our placement for these product codes in Google, but that feels like a waste of our Moz keyword limit, and we really don't need to check them once a week - just once a month or so. Has anybody got a method of tracking our rankings for a big list of keywords, say once a month, which isn't too labour-intensive? Many Thanks

    | ATP
    0

  • It is messing up my Google Analytics traffic reporting. I can't figure out how to get it to stop. Do I filter it out in GA?

    | pmull
    0

  • If you filter by device in Search Analytics in Search Console, you'll see that from July 29th all the data is tagged as desktop, and mobile and tablet have no data from that date. I see this for all the websites I have Search Console set up for - any input on that?

    | amirbt
    0

  • Hi all, I've noticed in Google Search Console under 'Crawl errors' - 1. Why does the status code '410' come up as an 'error' in the crawl report? 2. Why are some articles labelled as '404' error when they have been completely deleted and should be a '410' - there are roughly around 1000-2000 of these. Thanks!

    | lucwiesman
    0

  • Hi Mozzers, I'm delivering some Google Analytics (Fundamentals level) training, and trying to make it as fun and as interesting as possible... which is quite a challenge when it comes to GA. I was just wondering if you're aware of training tasks, or interactions, I could bring into this kind of training session? The group are particularly interested in user journeys and the effectiveness of content. Thanks!

    | A_Q
    0

  • Hi guys, We are currently working on increasing our online marketing presence, and with a new website on the cards I am turning to all of our B2B business directories to update them and begin tracking referrals etc. to see which ones may be worth premium listings. I have a list of 200 business directories to scope and check; some are relevant, some are not. Obviously I don't want to sign up to them all and risk a dodgy link profile, so I'm going to be selective, but I'm not sure how many to aim for. So I'm looking for some general advice and guidance at this early stage so I can properly plan my approach. What advice would you give, and are there any major dos and don'ts for sorting through these directories to find new ways to source customers? EDIT: We are UK based. Thanks

    | ATP
    0

  • Hi, I'm wondering what everyone's thoughts are on adding UTM tags to your business directory listings such as Google My Business, Yelp, Yellow Pages etc. Would it be beneficial in sorting the data in GA? That way I can easily see which directories are performing the best. Thanks!

    | Sally94
    0
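
If it helps, here is a minimal Python sketch of tagging each directory's listing URL so GA's Source dimension separates them. The domain and directory names below are placeholders, not real data:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def tag_listing_url(base_url, source, medium="referral", campaign="directory-listings"):
    """Append UTM parameters to a directory listing URL so GA groups it by source."""
    parts = urlsplit(base_url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,      # e.g. "yelp", "google-my-business"
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunsplit(parts._replace(query=urlencode(query)))

# One tagged URL per directory; paste each into the matching profile.
directories = ["yelp", "yellowpages", "google-my-business"]
tagged = {d: tag_listing_url("https://www.example.com/", d) for d in directories}
```

One caveat: a UTM-tagged link overrides the natural referrer in GA, so these visits report under the campaign source/medium you set rather than as ordinary referrals.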

  • So my client has been asking for definitive proof of why the search query data pulled into Google Analytics does not exactly match the data presented directly in the Search Console itself. The simple answer is that the Google Search Console interface is limited to 1,000 rows of data. However, our client is requesting a Google article or documentation explaining why the new Search Console beta tool has no row limit (hence much more data for a big website). I know that the Google Search Console API was available before Google announced the new Search Console beta tool in Google Analytics, and I also know this API could pull more data than the 1,000-row limit. Is there any article available (preferably from Google) confirming that Google Analytics pulls this Search Console data via the API? Thanks!

    | RosemaryB
    0

  • When looking at Google Trends and my Organic Traffic (using GA) as percentages of their total yearly values I have a correlation of .47. This correlation doesn't seem right when you consider that Google Trends (which is showing relative search traffic data) should match up pretty strongly to your Organic Traffic. Any thoughts on what might be going on? Why isn't Google Trends correlating with Organic Traffic? Shouldn't they be pulling from the same data set? Thanks, Jacob

    | jacob.young.cricut
    0
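
For anyone wanting to reproduce that correlation figure, a self-contained Pearson computation; the monthly series below are made-up placeholders, not real Trends or GA data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative monthly values (% of yearly total), not real data.
trends  = [6, 7, 8, 9, 10, 11, 10, 9, 8, 8, 7, 7]
organic = [5, 6, 9, 10, 11, 12, 10, 9, 8, 7, 7, 6]
r = pearson(trends, organic)
```

Note that Trends is normalized, sampled, per-keyword data, while GA organic covers every query that landed on the site, so a modest correlation is not by itself a red flag.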

  • Hi All, I was getting payment gateway self-referrals on my site for many months, but once I implemented enhanced ecommerce tracking the self-referral issue was resolved. I haven't read anywhere about this solution, so I just want to confirm it. Thanks

    | Alick300
    0

  • We've had a puzzling drop in indexed pages on our ecommerce website. My crawl returns just over 25k items. Until 19/6 we had about 23-24k indexed. Then we experienced a sudden drop from 19/6 to 26/6: from 23,400 to 18,999, losing 4.4k pages from one week to the next. At the same time, our organic traffic has not decreased - it has actually increased - however, it's only been a couple of weeks, so that may be coincidence. A few things have happened during the past few weeks:
    31/5: We implemented pagination on category pages to avoid issues with duplicate content. Could it be that this led to a decrease in indexed pages three weeks later? However, I can only find about 1.5k pages in my crawl that are page 2+.
    18-19/6: We had some website outages over the weekend. As a B2B business, we don't get much traffic over the weekend, so I can't see an impact on traffic. However, the following week, indexation dropped by another 250 (then stayed the same this past week), so I don't think this was a factor.
    21/6: We retired another website and migrated it to our main website. All pages were redirected to existing pages, so no new pages were created for the migration. This doesn't really explain a decrease in indexation, but it may account for some of the increase in organic traffic - though not all, as the retired website hardly got any organic traffic.
    So, should we be worried? As our website is quite large, it would probably be difficult to pinpoint exactly which pages dropped off the index, but a loss of 19% of pages is quite significant. Then again, it doesn't appear to have negatively impacted organic traffic... Have you got any suggestions for what I should be looking at to find out what happened? Should I be worried at this point? I will definitely continue to keep an eye on how our organic traffic (and indexation) develops, but I am not sure if there is anything I can do at this point.
    I'd appreciate your advice on this, to make sure I am not missing something blindingly obvious. Thanks!

    | ViviCa1
    0

  • Hi, We have noticed a trend of Days to Transaction skewing high (28+ days) for organic traffic during May-June. Does anyone have any helpful articles or ideas on why this might be happening for organic?

    | BeckyKey
    0

  • My Moz Domain Authority is 29, Ahrefs Domain Rating is 49, and Majestic Citation Flow is 44 while Trust Flow is 17. The plain question: given those scores, where is the problem more likely to be - backlinks or content? In more detail: my site has been dropping in search over the last few weeks. With these scores, is it likely there is a backlink-related problem? I agree that some of my content is poor and thin, and there is a plagiarism problem as well. But overall, should I keep focusing on content? I do not know how good the scores above are. My URL is www.marketing91.com Please let me know whether the backlink profile looks good or not, so that at least I am not worried that there is a backlink problem as well (I will work on toxic links soon) and can focus on content.

    | marketing91
    0

  • I have referrals from an email (2), Android search (1) and another website (2), but my sessions are 100 over a two-week period? I have NO returning visitors (yet still session data?). It seems unlikely that just these 5 users have generated 100 sessions!! Any ideas?

    | darrenbooy
    0

  • Hi, I'm trying to beef up my knowledge of Google Analytics. Can you please tell me where I can find some good Google Analytics tutorials?

    | corn2015
    0

  • Dear Friends, I need help with utm_source and utm_medium errors. There are 300 such errors on my site, which I think is affecting it. The string appended at the end of the URLs is utm_source=rss&utm_medium=rss&utm_campaign= How do I resolve this? Please help me with it. Thanks

    | marketing91
    0
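
The usual fixes are a rel=canonical on the affected pages, or listing utm_source, utm_medium and utm_campaign under "Exclude URL Query Parameters" in the GA view settings. Purely as an illustration of the normalization involved, here is a Python sketch that strips utm_* parameters from a URL (the example URL is hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_utm(url):
    """Return the URL with any utm_* query parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.lower().startswith("utm_")]
    return urlunsplit(parts._replace(query=urlencode(kept)))

clean = strip_utm("https://www.example.com/post/?utm_source=rss&utm_medium=rss&utm_campaign=")
# -> "https://www.example.com/post/"
```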

  • Hi Mozzers, we're running a secured https account section on our website, including a messaging center where lots of non-secured URLs of our own are shared among users. Is it possible that a user clicking on one of the shared URLs within the https section triggers another session that's counted as direct traffic? Thanks for your help! Greets
    Manson

    | LocalIM
    0

  • We get referral traffic from Spammers to our Wordpress sites. That traffic comes from different countries: Russia, Ukraine, India, Germany, Pakistan etc. What's the best way to get rid of it? Setting up filters in Google Analytics? Is there something else that I need to do? Is there a plug-in that could help? Does that traffic have a negative impact on my SEO? Does it affect the rankings?

    | Armen-SEO
    2
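
Much of this traffic is "ghost" referral spam that never touches the server, so GA view filters (a Campaign Source filter matching a spam-domain pattern, plus a valid-hostname filter) are the usual cure rather than a plugin, and it generally has no bearing on rankings. A sketch of the matching logic, with an illustrative and far-from-complete domain list:

```python
import re

# Hypothetical list of known spam referrer domains (not exhaustive).
SPAM_REFERRERS = ["semalt.com", "buttons-for-website.com", "best-seo-offer.com"]
SPAM_RE = re.compile(r"(^|\.)(" + "|".join(re.escape(d) for d in SPAM_REFERRERS) + r")$")

def is_spam_referrer(hostname):
    """True if the referring hostname is a known spam domain or one of its subdomains."""
    return bool(SPAM_RE.search(hostname.lower()))
```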

  • Hi everyone: we all know the term 'Bounce Rate'. I'd like to think I have a good idea of what BR is... but some things are not really clear to me. Time to call in the experts. Question #1: What EXACTLY will stop Google from considering a visit as a bounce? As discussed not too long ago in this topic https://moz.com/community/q/will-this-fix-my-bounce-rate
    Ruben wrote: "..what it basically means is that someone clicks on your SERP, and then clicks back to google? But, it doesn't matter if they spent 10 minutes on your page or 10 seconds" Jessica Conflitti wrote a reply in which she basically said that it might be a good idea to have visitors click through to a different page OR a PDF file. That's where my confusion has been for some time now: clicking on a PDF document, an image on the page that opens with Fancybox, a link to a different domain? Or can it only be a different URL on the same domain? The way I would expect it to work:
    pages contain the GA tracking code, so am I right in thinking that Google needs the same GA tracking code to be loaded twice? Only at that point does it have two data points, and only then can it tell that the visitor hasn't left. By clicking a PDF document - as described by Jessica - you wouldn't load the GA code twice, so I would expect that clicking a PDF does not make a difference for the BR. Don't get me wrong: I like the article, but it is this detail that throws me off. IF Google can read or capture these clicks, what other elements can be used to reduce bounce rate? Clicking on a YouTube video embedded in the page? I'm asking this because I want to get this right. Question #2: how much weight does BR have compared to Time on Page, Engagement, etc.? We know Google takes a lot of things into consideration when calculating the value of a URL or domain. So how much should we care about BR if we know the Time on Page is good and a large percentage of people return frequently? What are your experiences or knowledge on that? Really looking forward to your replies and help in clearing this topic up for me - and perhaps for some other readers as well! Bas

    | BasKierkels
    0

  • A client of mine has a large website with multiple sections (shop, forums, articles, etc.) that apparently had a significant reduction in rankings, traffic, and sales in the past. However, historic Google Analytics data is not available for the site, and I'm having trouble identifying anything concrete about the traffic drop, such as when it happened, which pages/sections it happened to, etc. The shop traffic drives most of the revenue, but it's a small number compared to the forums traffic, so it's hard to pick anything out of top-line trends like SEMrush offers. What tools or strategies might help in this situation?

    | AdamThompson
    0

  • In Google Analytics under Site Speed > Page Timings, you can see all pages and their loading time compared to the average. This is very handy for checking which pages may need some optimization. I would also like to check the size of these pages in a similar way. There are multiple tools out there, like GTmetrix and Pingdom, that give specific information and performance insights. The problem is that they are limited to checking one URL at a time. Does anyone know of a tool to check the page size of multiple URLs at once (and, if possible, easily export to Excel)? That way I can check which pages are big in size and research/optimize them. Thanks in advance

    | Mark.
    0
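
Absent a dedicated tool, a short Python sketch can fetch a URL list and export sizes to CSV. Note this measures only the HTML document itself, not the full page weight with images and scripts that GTmetrix-style tools report:

```python
import csv
import urllib.request

def page_size_bytes(url, timeout=10):
    """Fetch a page and return the size of its HTML response in bytes."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return len(resp.read())

def rank_by_size(sizes):
    """Sort a {url: bytes} mapping largest-first for review."""
    return sorted(sizes.items(), key=lambda kv: -kv[1])

def export_report(urls, out_path="page_sizes.csv"):
    """Fetch every URL and write url,bytes rows, biggest pages first."""
    rows = rank_by_size({u: page_size_bytes(u) for u in urls})
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["url", "bytes"])
        writer.writerows(rows)
    return rows
```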

  • I made some changes to our Google Analytics property settings (see notes in screenshot). As a result, there was an equal drop in Google traffic and increase in Direct traffic. Has anyone else seen something like this before? I'm wondering if I should revert.

    | vcj
    0

  • Hi everyone, Over the last six months I've been running a few competitions on my largest site, but noticed a very large decrease in organic sessions just after the third one ended. For reference, the site is ~10 years old and gets a couple of million sessions per month. Organic sessions throughout last year and before the holiday periods were around 800k/month, which then increased by 50% during the holiday period alongside a competition I ran. These competitions double the pages per session and add another 1.5 minutes to session duration. At the end of one of the competitions this year, daily organic sessions halved overnight and are now below last year's baseline - and not improving. Some possible causes include:
    Google update - unlikely, because the date of the drop doesn't coincide with any increase in SERP volatility that I can find.
    The extremely quick overnight drop in engagement (pages/session and session duration falling back to the pre-competition baseline) caused Google to believe our site is less popular and thus less deserving of rankings.
    Visitors who've been bombarded with month-long competitions are sick of seeing them and are not searching for my site so often.
    Email tagging - in the week before, UTM tracking parameters were added to all emails (of which there are a lot of subscribers). As the number of Email visitors in Analytics increased, Organic did slightly decrease at the same time. I think this is unlikely, but I wonder if somehow some of our email visitors were previously being classed as Organic, as well as some being Direct.
    Incorrectly tagged as Direct - at the same time as the organic drop, Direct traffic doubled. It has since decreased back to just above the Direct baseline; however, Organic has not improved.
    I'd just be interested to know if anyone has experience with something similar happening and, if so, what you think the cause was and how you rectified it? Thank you very much for your input in advance!

    | serges78
    0

  • We self-host our public website, but over time have also added subdomains to it that are not public and serve internal or even client portals. I am seeking advice as to whether those subdomains affect the analytics data (self-referrals, visits, bounces) of the public site that I am tasked with analyzing. I feel that they skew the data, but I need to build a solid case for moving the public website to a new domain, so as to leave the existing one intact with all of its subdomains.

    | MarketingGroup
    0

  • We recently noticed an update note in the Google Search Console dated April 27th. Does this denote an algorithm update? Any feedback or article would be really helpful. Thanks!

    | RosemaryB
    0

  • We are having a little trouble coming up with a goal that shows how many product pageviews we are getting, because we need to exclude search-results pageviews that (unfortunately) share the same URL structure. Because it's an outside CMS, we have no ability to change the URL architecture. Products are on these types of pages: https://porscheasheville.com/inventory/Porsche+Boxster+Asheville+North+Carolina+2016+Rhodium+Silver+Metallic+536911 https://porscheasheville.com/inventory/Audi+A4+2.0T+Premium+Plus+Asheville+North+Carolina+2015+Gray+638379 Search results pages have this URL structure: https://porscheasheville.com/inventory/new/ https://porscheasheville.com/inventory/?condition=new&make=Porsche&model=Boxster https://porscheasheville.com/inventory/used/ https://porscheasheville.com/inventory/?condition=used&model=A4+2.0T+Premium+Plus I am hoping to create a GA goal with a regular expression matching only the product pages, without letting the search-results pages show up. Here's what I have; it's not working - any regex experts out there who can help? /inventory/[new/][used/] Thanks as always, Moz friends!

    | ReunionMarketing
    0
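
The `[new/][used/]` attempt fails because square brackets define character classes (sets of single characters), not alternatives. And since GA's regex engine does not, as far as I'm aware, support negative lookahead, the safer route is to match the product pages positively: they are a single path segment ending in "+" followed by a numeric ID. A Python check of that pattern against the URLs above:

```python
import re

# Product detail pages: one path segment under /inventory/ ending in "+<numeric id>".
# This avoids lookaheads, which GA's regex engine does not support.
PRODUCT_RE = re.compile(r"^/inventory/[^/?]+\+\d+$")

products = [
    "/inventory/Porsche+Boxster+Asheville+North+Carolina+2016+Rhodium+Silver+Metallic+536911",
    "/inventory/Audi+A4+2.0T+Premium+Plus+Asheville+North+Carolina+2015+Gray+638379",
]
searches = [
    "/inventory/new/",
    "/inventory/used/",
    "/inventory/?condition=new&make=Porsche&model=Boxster",
    "/inventory/?condition=used&model=A4+2.0T+Premium+Plus",
]
```

In the GA goal, `^/inventory/[^/?]+\+\d+$` as a regular-expression match on the destination should admit the product pages and reject the listing and filtered-search pages.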

  • I have a website with Google Analytics which works fine. I tried adding the code to a subdomain using http://www.verticalrail.com/kb/filter-in-google-analytics-to-track-subdomains/ as a guide. There is no data showing up though. Any ideas?

    | EcommerceSite
    0

  • Hi, Is anyone aware of an online tool that will let me create a dynamic URL parameter string, but will also generate multiples of the URL and add a distinct key to each one? E.g. Campaign Source, Medium, Name, Keywords etc. are all the same in the string, but I then want to generate a unique ID code at the end, export them as a CSV, and integrate them into my database lists. Looking to run this into a few thousand as well. I was going to just do this in Excel by combining two columns, one with the string and the other with a running count, but if there is a tool that does it all, that would be interesting to know.

    | David-E-Carey
    0
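
In case Excel ever becomes unwieldy at a few thousand rows, the same thing is a few lines of Python. The base URL and campaign values below are placeholders:

```python
import csv
import uuid
from urllib.parse import urlencode

BASE = "https://www.example.com/landing/"  # assumed landing page
SHARED = {                                 # identical for every link
    "utm_source": "newsletter",
    "utm_medium": "email",
    "utm_campaign": "spring-launch",
}

def build_links(n):
    """Generate n campaign URLs, each with a distinct utm_content key."""
    rows = []
    for _ in range(n):
        key = uuid.uuid4().hex[:12]        # unique id per link
        params = dict(SHARED, utm_content=key)
        rows.append((key, BASE + "?" + urlencode(params)))
    return rows

def export_csv(rows, path="campaign_links.csv"):
    """Write key,url rows for import into a database list."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["key", "url"])
        w.writerows(rows)
```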

  • ...when I fired my original web designer, did they sabotage the coding? I had never checked my Alexa/Google Analytics, or any blog ranking, until last night. Subsequently, I have spent the last 24 hours googling away, and finally joined Moz because I'm desperate to find out WHY I'm not ranking. I've googled and found many answers to a problem directly opposite of mine (how to increase traffic with a high ranking), but I already have quite a bit of traffic (via WordPress stats) and yet can not be found on any ranking system. So I fiddled with some NoFollow/NoIndex boxes in the Genesis SEO settings, thinking maybe when my domain name changed it messed everything up? Most of the boxes HAD been checked, so I unchecked them all. Anyhow, I basically signed up for the monthly service so I could ask this question on the forum. My site is hellowhitney.com It's so weird: I have a LOT of organic direct hits coming straight to my blog (for instance, a celebrity re-posted a post which gained a lot of traffic from Twitter), but neither Google nor any other ranking tool is seeing it. IN FACT, all ranking data stops back in FEBRUARY 2016, when I changed my domain name from Myscriptedreality.com to HelloWhitney.com Ignorance is NOT bliss in this case - I would appreciate any help! #ForeverGrateful

    | hellowhitney
    0

  • Howdy Mozzers Does anyone know if the 'average time on site' in Google analytics is calculated with bounce rate included? For example if you have a 50% bounce rate and your average time on site is 2 minutes the actual time would be 4 minutes as the 50% bounce rate time is classed as 0. I hope that is clear! Cheers

    | CraigAddyman
    2
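
For reference: GA's Avg. Session Duration divides total duration by all sessions, and a bounced single-page session records 0 seconds, so the arithmetic in the question works out. As a quick sanity check:

```python
def engaged_avg_duration(avg_duration_s, bounce_rate):
    """Average duration of non-bounced sessions, assuming bounces record 0 seconds.

    GA's Avg. Session Duration = total duration / ALL sessions, and a
    single-page (bounced) session contributes 0 seconds, so the engaged
    average is the reported average divided by the non-bounce share.
    """
    return avg_duration_s / (1.0 - bounce_rate)

# Reported 2:00 average with a 50% bounce rate -> engaged sessions average 4:00.
engaged = engaged_avg_duration(120, 0.5)  # -> 240.0 seconds
```

(Avg. Time on Page is, as far as I know, computed differently - GA excludes exits and bounces from that denominator - so this adjustment applies to session duration, not time on page.)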

  • Hello, I don't know what happened to my Google Analytics home page. For around a week it has listed all my websites but shown no results on the global page. If I open any individual property I can see everything as normal - just not on the initial page. Screenshot: http://imgur.com/JKFVIh4 I'm sure I didn't change anything I shouldn't have. Perhaps you know what's wrong? Thank you in advance.

    | prozis
    0

  • We recently (4/1) went live with a new site for a client of ours. The client site was originally on Point2 before they made the switch to a template site with Real Estate Webmasters. Now when I look into Search Console I am getting the following crawl errors: 111 Server Errors (photos), 104 Soft 404s (blogs, archives, tags), 6,229 Not Found (listings). I have a few questions. I don't know a lot about the server errors, so I generally ignore them. My main concerns are the soft 404s and the not-found errors. The soft 404s are mostly tags and blog archives; I wonder if I should leave them alone or 301 each to /blog. The not-found errors are all the previous listings from the IDX. My assumption is these will naturally fall away after some time, as the new ones have already been indexed. But I wonder what I should be doing here and which of these is affecting me. When we launched the new site there was a large spike in clicks (a 250% increase), which has now tapered off to an average of ~85 clicks versus ~160 at the time of launch. Not sure if the crawl errors have any effect - I'm guessing not much right now. I'd appreciate your insights, Mozzers!

    | localwork
    0

  • We run two websites, and as part of our KPIs we treat visitors who view 3 or more pages of our website as a client served. As a digital team we are not convinced this is the best metric to use, as the improvements we are making to the sites mean that people are able to find information quicker. Additionally, other organisations, including forums etc., link to us, so those users will get the info they need in one click. What I would like to know is how Google calculates page depth in GA. Does it treat the landing page as ground zero, so that when a user clicks a link they go one page deep? Or is the landing page itself page depth 1? Is page depth a measure of how many clicks a user needs to find their information?

    | MATOnlineServices
    0
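
For what it's worth, GA's Page Depth report (Audience > Behavior > Engagement) buckets sessions by pageview count, and the landing page itself counts as depth 1, not zero. A small sketch of the "3+ pages = client served" KPI under that definition; the session data is invented:

```python
from collections import Counter

def page_depth_distribution(sessions):
    """Bucket sessions by pageview count; the landing page itself counts as 1."""
    return Counter(len(pages) for pages in sessions)

sessions = [
    ["/home"],                      # depth 1: bounced on the landing page
    ["/home", "/about"],            # depth 2
    ["/home", "/faq", "/contact"],  # depth 3: would count as "client served"
]
depths = page_depth_distribution(sessions)
served = sum(n for depth, n in depths.items() if depth >= 3)
```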

  • Hello! For the last month I have noticed a drastic change in Avg. Position for a keyword on my client's site http://livoart.lv Keyword: "kapu pieminekļi". According to Webmaster Tools it went from 13 to 4-2. The problem is that in organic search I keep seeing the keyword on the second page, and the CTR of the keyword did not change at all. Why would Webmaster Tools show false data?

    | AlexElks
    0

  • I know going from https to http (usually) strips referrers but I was wondering if the referrer is stripped when your website is a mix of both http and https? Say someone browses your site (on http), adds a product and then goes to your cart (https), then decides to go back to another page on your website which is http. Will this strip the referrer? Any help on this would be great, thanks!

    | Fitto
    0

  • Question for you Moz friends: We added two new views about a week ago, the same views that we have added for more than 50 accounts, and they are not showing any data. The views are for 'Filtered Traffic' (no IPs or spam) and 'Organic Only', including only the organic medium. Any ideas why this may not be working? It's the first time we've come across this, and we couldn't find much help researching it... Thanks as always!

    | ReunionMarketing
    0

  • Hi there, I'm fairly new at all of this and would appreciate any help with understanding why our website has taken a hit in traffic. We curate an online magazine, which was previously accessible through forensicoutreach.com. It was receiving about 2,000+ unique visitors per day up until a week ago, when we changed a few things. However, the magazine doesn't reflect what our business does, so we created a product-focused web presence on forensicoutreach.com and moved the magazine (which everyone loved) to library.forensicoutreach.com (DA 37, PA 1). We thought separating the properties was a good idea, but now I'm not so sure. Our traffic on library.forensicoutreach.com is 1,500 (so 500 less than usual!) and our main property has about 56 unique visits a day. It's a pretty substantial drop. A few questions: 1. If we move the library to forensicoutreach.com/library, will that make any difference? 2. Where did we go wrong here, and how can we fix it? Any help would be appreciated. Thanks.

    | shivaniseos
    1

  • Hello here, I have a question about the Queries report under "Search Engine Optimization" in Google Analytics: is the "Average Position" information a reliable one? I have a lot of queries that appear, from that report, to average first position, but when I verify that on Google by connecting anonymously, I can't even find my result on the first page! To me, that information is worthless and makes me think all the rest of that report is unreliable. If anyone can help me to understand it, I'd really appreciate it. Thank you in advance for any thoughts.

    | fablau
    0

  • We have an instance of a page where visitors can click a button to start an interactive quiz. The quiz pops up in a modal window that references another domain (the interactive content provider). Will the person completing the quiz in the modal pop-up still be counted as an active visitor on the original host page during the time they are completing the quiz?

    | MuhammadInc
    0

  • Hi, one of my clients is launching a new site on a subdomain, newsite.brand.com.au, but will use a shorter URL on the packaging of their new range of products to be sold in supermarkets, prompting shoppers to visit xxx.brand.com.au. This specific subdomain xxx.brand.com.au will 301 redirect to newsite.brand.com.au How can I identify and accurately track all the visits coming from xxx.brand.com.au to newsite.brand.com.au? The visits from subdomain xxx.brand.com.au will be attributed to the direct channel, I guess. Is it worth adding GA on this 301-redirected subdomain xxx.brand.com.au, or is there another way to identify those visits on newsite.brand.com.au?

    | mecglobal
    0

  • Hey, If a site has category or tag pages showing in search results for a particular keyword - sometimes higher than the page you would like to rank - could using canonical tags on those categories and tags be a solution? Thanks

    | wearehappymedia
    0

  • Hi guys, Unique issue with Google Analytics reporting for one of our sites. GA is reporting sessions for 404 pages (landing pages, organic traffic), e.g. for this page: http://www.milkandlove.com.au/breastfeeding-dresses/index.php The page is currently a 404, but GA (see screenshot) is reporting organic traffic to it as a landing page. Does anyone know why this might be happening? Cheers.

    | jayoliverwright
    2

  • Hi all, First question here, but I've been lingering in the shadows for a while. As part of my company's digital marketing plan for the next financial year we are looking at benchmarking against certain KPIs. At the moment I simply report our conversion rate as Google Analytics displays it. I was incorrectly under the impression that it was reported as unique visits / total orders, but I've now realised it's sessions / total orders.
    At my company we have quite a few repeat purchasers. So, is it best that we stick to the sessions / total orders conversion rate? My understanding is that multiple sessions from the same visitor all count towards this conversion rate, and because we have repeat purchasers these wouldn't be captured under the unique visits / total orders method? It's almost as if every session should be considered an opportunity to convert. The flip side is that on some of our higher-margin products customers may visit multiple times before making a purchase. I should probably add that I'll be benchmarking data based on averages from the 1st of April to the 31st of March, which is a financial year in the UK.
    The other KPI we will be benchmarking against is visitors. Should we change this to sessions if we will be benchmarking conversion rate using the sessions formula? This could help with continuity and could also help to reveal whether our planned content marketing efforts are engaging users. I hope this makes sense, and thanks for reading and offering advice in advance. Joe

    | joe-ainswoth
    1
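
Both definitions are easy to keep side by side: GA's Ecommerce Conversion Rate is transactions divided by sessions, while a per-user rate gives repeat purchasers their due. The numbers below are purely illustrative:

```python
def conversion_rates(sessions, users, orders):
    """Conversion rate both ways GA data could express it."""
    return {
        "per_session": orders / sessions,  # GA's Ecommerce Conversion Rate
        "per_user": orders / users,        # captures repeat purchasers better
    }

# Illustrative numbers: repeat visitors mean sessions > users.
rates = conversion_rates(sessions=12000, users=8000, orders=360)
# per_session = 0.03 (3.0%), per_user = 0.045 (4.5%)
```

Whichever denominator is chosen, using the same one for both the conversion-rate KPI and the traffic KPI keeps the benchmarks comparable year to year.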


