Why would our client's site be receiving significant levels of organic traffic for a keyword they do not rank for?
-
We have a client whose site has received 100+ organic visits for the keyword 'airport transfers', yet the site does not rank in the top 100 search results for that keyword. We have checked that it is not untagged PPC traffic. Truly baffling. Can anybody help?
-
Hi David,
Did you manage to see where these visits were coming from yet? I agree with many of the posters below that 100+ visits from Google would be surprising, given that Google now withholds the vast majority of organic keyword data behind '(not provided)'. Were the visits from other search engines?
-
Hi David,
-
You might be seeing them in Analytics or your site stats, in which case they are probably from Bing or another search engine.
-
Billy's point is also a possibility.
-
I have a similar question. I am just beginning to use Moz. I am trying to understand why a page that is graded A has dropped from 10 to 45 in Google for a specific keyword, yet other keywords that are graded F have ranked, and still rank, in the top 5 in Google. I am on the East Coast and had the rankings tested in the Midwest with the same results. This is true for what I believe should be our best keywords for good organic results, and they are in the top 10 in our Google source report. Our PPC seems to be working great in all selected areas (worldwide), and locally we are on page 1 for Google+ and paid ads. I really appreciate the help.
-
You are correct, Billy. I am not really referring to G+ Local results. It's more that the organic sites are delivered based on my IP and location. If I fire up HideMyAss and set it to Manchester, I will get sites local to that area.
-Andy
-
We don't mean Google Local specifically - at least I didn't, and I don't think Andy did either (but I cannot speak for him). We mean that if you, Andy and I all typed 'airport transfers' into Google, we would all get different results because we are in totally different parts of the world.
Now, we would probably all use whatever method we prefer to get unpersonalised, non-local results, but most searchers wouldn't do that. So although your client doesn't appear to rank for that term... maybe they do right in their own area...
I could, of course, be totally wrong... just thought Andy's idea made sense.
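If anyone wants to sanity-check the personalisation/local theory quickly, one option is adding Google's pws=0 and gl parameters to the search URL. This is only a minimal sketch: the country code is an example, and these parameters reflect Google's historical behaviour, which can change at any time.

```python
from urllib.parse import urlencode

# Build a search URL with personalisation switched off and an explicit
# country, so different people see roughly the same result set.
params = {
    "q": "airport transfers",
    "pws": "0",    # pws=0 asks Google to disable personalised results
    "gl": "uk",    # country to search "as"; adjust to the client's market
    "num": "100",  # request up to 100 results so deep rankings are visible
}
print("https://www.google.co.uk/search?" + urlencode(params))
```

Running the query through a proxy or VPN set to the client's town, as Andy describes, is still the closest you will get to what local searchers actually see.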
-
In Analytics, you can pretty easily tell which visits came through local results (Google+ Local, anyway): the keyword will be replaced by the click-through URL.
You can rule Google Local in or out by adding keyword as a secondary dimension.
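If you prefer to pull this from the API rather than click through the UI, a sketch along these lines (Core Reporting API v3 via the Python client; the key file, view ID and dates are placeholders) lists organic "keywords" next to their landing pages. Rows where the keyword looks like a URL are the Google+ Local click-throughs described above, while '(not provided)' rows are ordinary Google organic.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholders: service-account key file and the GA view (profile) ID.
creds = service_account.Credentials.from_service_account_file(
    "ga-key.json", scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analytics", "v3", credentials=creds)

report = analytics.data().ga().get(
    ids="ga:12345678",               # your view ID
    start_date="30daysAgo",
    end_date="today",
    metrics="ga:sessions",
    dimensions="ga:keyword,ga:landingPagePath",
    filters="ga:medium==organic",    # organic traffic only
    sort="-ga:sessions",
    max_results=50,
).execute()

for keyword, landing_page, sessions in report.get("rows", []):
    print(f"{sessions:>5}  {keyword}  ->  {landing_page}")
```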
-
I think that is the most likely cause.
-
It could be that there is an element of local search results creeping in.
If I search for 'Airport Transfers' from here, then I get companies that are based in Chester (UK).
-Andy
-
Are these 100+ visits from 100+ different users? It could be just one user who initially came to the site via that keyword and kept coming back directly, with GA attributing each of those return visits to the original keyword.
-
Where are you getting this data from? Perhaps "organic" means Bing, Yahoo or other search engines - where your client is ranking.
-
They may be getting traffic from search engines other than Google. Have you checked Google Analytics to determine which search engine the traffic is coming from? Are they getting traffic from Google Images or Video?
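To put numbers on both of the suggestions above - which engine these sessions are credited to, and whether they come from many users or one returning visitor - a Core Reporting API query like this sketch works; the key file, view ID and dates are placeholders.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "ga-key.json", scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analytics", "v3", credentials=creds)

report = analytics.data().ga().get(
    ids="ga:12345678",
    start_date="90daysAgo",
    end_date="today",
    metrics="ga:sessions,ga:users",
    dimensions="ga:source",
    # organic medium only, restricted to the keyword in question
    filters="ga:medium==organic;ga:keyword==airport transfers",
    sort="-ga:sessions",
).execute()

for source, sessions, users in report.get("rows", []):
    # A large gap between sessions and users points at one or two
    # returning visitors rather than 100+ distinct searchers.
    print(f"{source:<20} sessions={sessions} users={users}")
```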
Related Questions
-
Submitted URL marked 'noindex'
Search Console is flagging this issue for nearly 100 pages of my website. I have checked the Yoast plugin settings. We haven't used a meta robots tag on these pages, nor have these pages been disallowed in robots.txt. Previously this issue affected some 20+ pages. I tried to reindex them by submitting the URLs again. Now the count has risen to 100+. There is also a "Submitted URL blocked by robots.txt" issue for pages which are NOT disallowed in robots.txt. Can anyone please suggest a solution here?
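One check worth running on these pages: the noindex may be arriving as an X-Robots-Tag HTTP header (added by the server, a CDN, or another plugin) rather than as a meta tag, which is why the Yoast settings look clean. A small sketch with a placeholder URL that prints both signals for each flagged page:

```python
import re
import requests

# Placeholder: swap in the URLs that Search Console is flagging.
urls = ["https://www.example.com/flagged-page/"]

for url in urls:
    resp = requests.get(url, timeout=10)
    # noindex can be sent as an HTTP header as well as a meta tag
    header = resp.headers.get("X-Robots-Tag", "none")
    meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', resp.text, re.I)
    print(url)
    print("  status:", resp.status_code)
    print("  X-Robots-Tag header:", header)
    print("  meta robots tag:", meta.group(0) if meta else "none found")
```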
Reporting & Analytics | | Reema240 -
Google Analytics - Organic Search Traffic & Queries - What caused the huge difference?
Our website traffic dropped a little during the last month, but it's getting better now, almost the same as the previous period. But our conversion rate has dropped by 50% over the last three weeks. What could cause this huge drop in conversion rate? In Google Analytics, I compared the Organic Search Traffic with the previous period and the results are similar. But the Search Engine Optimization -> Queries report shows that the clicks for last month are almost zero. What could be the cause of this huge difference? (screenshots: e9sJNwD.png, k4M8Fa5.png)
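One common explanation for the Queries report dropping to near zero while organic sessions hold steady is that the Search Console (Webmaster Tools) link to the GA property has lapsed: that report is fed by Search Console data, not by GA's own tracking. One way to cross-check is to pull clicks straight from the Search Console API; a minimal sketch with placeholder credentials, site URL and dates:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "gsc-key.json", scopes=["https://www.googleapis.com/auth/webmasters.readonly"])
search_console = build("webmasters", "v3", credentials=creds)

response = search_console.searchanalytics().query(
    siteUrl="http://www.example.com/",   # must match the verified property
    body={
        "startDate": "2015-05-01",
        "endDate": "2015-05-31",
        "dimensions": ["query"],
        "rowLimit": 25,
    },
).execute()

# If clicks look healthy here but near zero in GA's Queries report,
# the GA <-> Search Console link is the thing to fix.
for row in response.get("rows", []):
    print(row["keys"][0], "clicks:", row["clicks"], "impressions:", row["impressions"])
```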
Reporting & Analytics | | joony0 -
Site relaunch and impact on SEO
I have some tough decisions to make about a web site I run. The site has been around for 20 years (September 1995, to be precise, is the date listed against the domain). Over the years, the effort I've expended on the site has come and gone, but I am about to throw a lot of time and effort back into it. The majority of the content on the site is pretty dated, isn't tremendously useful to the audience (since it's pretty old), and the site design and URL architecture aren't particularly SEO-friendly.
In addition, I have a database of thousands of vendors (for the specific industry this site serves). I don't know if it's a factor any more, but 100% of the links there have been populated by the vendors themselves specifically requesting inclusion (through a form we expose on the site). When the request is approved, the vendor link shows up on the appropriate pages for location (state) and segment of the industry. Though the links are all "opt-in" from vendors (we've never once added or imported any ourselves), I am sure this all looks like a terrible link farm to Google! And some vendors have asked us to remove their link for that reason 🙂
One final (very important) point. We have a relationship with a nationwide brand and have four very specific pages related to that brand on our site. Those pages are essential - they are by far the most visited pages and drive virtually all our revenue. The pages were put together with SEO in mind and the look and feel is very different to the rest of the site. The result is, effectively, a site-within-a-site. I need to carefully protect the performance of these pages. To put some rough numbers on this, the site had 475,000 page views over the last year, with about 320,000 of those being to these four pages (by the way, for the rest of the content "something happened" around May 20th of last year - traffic almost doubled overnight - even though there were no changes to our site).
We have a Facebook presence and have put a little effort into that recently (increasing fans from about 10,000 last August to nearly 24,000 today, with a net gain of about 2,500 per month currently). I don't have any sense of whether that is a meaningful resource in the big picture.
So, that's the background. I want to totally revamp the broader site - much improved design, intentional SEO decisions, far better, current and active content, an active social media presence, and so on. I am also moving from one CMS to another (the target CMS / blog platform being WordPress). Part of me wants to do the following:
- Come up with a better plan for SEO and basically just throw out the old stuff and start again, with the exception of the four vendor pages I mentioned
- Implement redirection of the old URLs to new content (301s)
- Just stop exposing the vendor pages (on the basis that many of the links are old/broken and I'm really not getting any benefit from them)
- Leave the four important pages exactly as they are (URL and content-wise)
I am happy to rebuild the content afresh because I have a new plan around that for which I have some confidence. But I have some important questions:
- If I go with the approach above, is there any value from the old content / URLs that is worth retaining?
- How sure can I be there is no indirect negative effect on the four important pages? I really need to protect those pages.
- Is throwing away the vendor links simply all good, or could there be some hidden negative I need to know about? (Given many of the links are broken and go to crappy/small web sites, I'm hoping this is just a simple decision to make.)
And one more uber-question. I want to take a performance baseline so that I can see where I started as I make changes and measure performance over time. Beyond the obvious metrics like number of visitors, time per page and page views per visit, what metrics would be important to collect from the outset?
I am just at the start of this project and it is very important to me. Given the longevity of the site, I don't know if there is much worth retaining for that reason, even if the content changes radically. At a high level I'm trying to decide what questions I need to answer before I set off on this path. Any suggestions would be very much appreciated. Thanks.
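One small, concrete suggestion on the 301 point: keep the old-to-new URL map in a file and verify it after the relaunch, so nothing silently returns a 404 or a 302. A rough sketch with placeholder URLs:

```python
import requests

# Placeholder mapping of legacy URLs to their new homes.
redirect_map = {
    "https://www.example.com/old-article.html": "https://www.example.com/new-article/",
    "https://www.example.com/vendors/ohio.html": "https://www.example.com/",
}

for old_url, expected in redirect_map.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    # A clean migration should return 301 pointing exactly at the new URL.
    ok = resp.status_code == 301 and location == expected
    print(f"{'OK ' if ok else 'FIX'} {resp.status_code} {old_url} -> {location or '(no redirect)'}")
```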
Reporting & Analytics | | MarkWill0 -
Self-Referring Traffic After Upgrading to Universal Analytics
Backstory: We have always had an issue with self-referring traffic, but while waiting for the UA upgrade we put fixing it on the back burner. We have now upgraded to UA, and I was under the impression that GA would automatically exclude the domain associated with a property as a referral source. However, this is not what I am seeing under my referral traffic sources. With 10 websites having this issue, I need some help. Should I use the Referral Exclusion List? Also, on a handful of our sites we have region-specific URLs, and I am seeing these come in as self-referring traffic too. I should also mention that about 85% of our sales are being attributed to the self-referring traffic. Here are two sites for example's sake: ZootSports.com and K2snowboarding.com
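Yes, the Referral Exclusion List (Admin > Property > Tracking Info > Referral Exclusion List) is the intended fix under Universal Analytics, including one entry per region-specific domain. Before and after adding the entries, a query like the sketch below shows which of your own hostnames are still being credited as referrers; the key file, view ID, and the "zootsports" filter value are all placeholders to adjust per site.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "ga-key.json", scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analytics", "v3", credentials=creds)

report = analytics.data().ga().get(
    ids="ga:12345678",
    start_date="30daysAgo",
    end_date="today",
    metrics="ga:sessions,ga:transactions",
    dimensions="ga:source,ga:fullReferrer",
    # referral traffic whose source contains your own domain name
    filters="ga:medium==referral;ga:source=@zootsports",
    sort="-ga:sessions",
).execute()

for source, referrer, sessions, transactions in report.get("rows", []):
    print(f"{sessions:>5} sessions  {transactions:>4} transactions  {source}  ({referrer})")
```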
Reporting & Analytics | | K2_Sports0 -
Google Analytics: exclude traffic to a subdomain
Hi, I have a website with a client access area on a subdomain. I want to exclude the traffic to that subdomain because it messes up my conversion goal for the main site. For example, 2 out of 10 visitors are existing clients who want to access my SaaS product. The other 8 are potential clients. I want to exclude the 2 clients from my stats so I can get an accurate conversion percentage for the free trial among the other 8 potential clients. Thanks in advance for your help!
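The usual approach is a view filter or segment that excludes the client-login hostname, so the main view only counts prospect traffic. Before setting that up, a quick breakdown by ga:hostname shows how much of your sessions and goal volume the subdomain currently accounts for; the key file, view ID and dates below are placeholders.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "ga-key.json", scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analytics", "v3", credentials=creds)

report = analytics.data().ga().get(
    ids="ga:12345678",
    start_date="30daysAgo",
    end_date="today",
    metrics="ga:sessions,ga:goalCompletionsAll",
    dimensions="ga:hostname",   # splits www traffic from the client subdomain
    sort="-ga:sessions",
).execute()

for hostname, sessions, goals in report.get("rows", []):
    print(f"{hostname:<30} sessions={sessions} goal completions={goals}")
```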
Reporting & Analytics | | slestage0 -
Magic UVs - PPC landing pages delivering organic traffic by magic...
I have checked and double-checked this. Over the last couple of weeks, GA is showing mysite.com/ppc/landingpage1 as a landing page for organic traffic, where it shouldn't. Main facts:
- The entire /ppc/ folder is blocked from Googlebot and doesn't appear on any internal site maps. As far as I can tell, these pages have never been cached for the main index.
- I cannot recreate any of the organic searches myself (i.e. typing in keywords that triggered the traffic, even the almost unique long-tail ones). We just don't appear in the organic listings with these pages.
- The Analytics and AdWords accounts are linked. We are not paying for this mystery traffic through our PPC - these keywords are not appearing in our AdWords account (though other keywords / traffic are).
- The traffic is real - we have received phone calls from these pages, tracked to the visits recorded as organic.
These pages should only receive PPC traffic. They are receiving organic traffic also, but I can't recreate it. Can anyone suggest what's going on? I'm concerned about duplicate content issues and also about skewing the analysis of the PPC campaign. Thanks
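One thing worth ruling out: a robots.txt Disallow stops Googlebot crawling the /ppc/ pages, but it does not stop them being indexed, so they can still appear in results (usually snippet-less) and collect organic clicks if anything links to them. A small sketch, using a placeholder domain, that confirms the block and checks whether the pages carry an explicit noindex:

```python
import re
import urllib.robotparser
import requests

site = "https://www.example.com"       # placeholder domain
page = site + "/ppc/landingpage1"      # one of the affected landing pages

# 1. Confirm the /ppc/ folder really is disallowed for Googlebot.
robots = urllib.robotparser.RobotFileParser(site + "/robots.txt")
robots.read()
print("Googlebot may crawl page:", robots.can_fetch("Googlebot", page))

# 2. Disallow != deindex: a blocked page can still be indexed from external
#    links, and a noindex directive only works if the page is crawlable.
resp = requests.get(page, timeout=10)
meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', resp.text, re.I)
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag", "none"))
print("meta robots tag:", meta.group(0) if meta else "none found")
```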
Reporting & Analytics | | RobPell0 -
Measuring events to external sites
I'm having problems measuring clicks on ads using events in GA or Jetpack. For example, when I checked yesterday this is what I saw:
1. GA events say 12 clicks
2. Jetpack says 9 clicks
But when I look at the referrals to the actual site directly, it says 18 clicks. Which one is the right one? I need this because I use it to invoice clients at the end of the month, and it can't be a "maybe" figure. Cheers, R
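For billing, it may be safer not to rely on any client-side counter at all: GA events and Jetpack both undercount (ad blockers, visitors leaving before the hit fires), while the destination site's referral report can also include visits that didn't come through your ads. One option is to route each ad click through a small redirect endpoint you control and invoice from that log. A rough sketch using Flask, with made-up ad IDs and destination URLs:

```python
import sqlite3
import time

from flask import Flask, abort, redirect

app = Flask(__name__)

# Placeholder mapping of ad slots to their destination URLs.
AD_DESTINATIONS = {
    "sidebar-1": "https://advertiser-one.example.com/",
    "banner-2": "https://advertiser-two.example.com/promo",
}

db = sqlite3.connect("ad_clicks.db", check_same_thread=False)
db.execute("CREATE TABLE IF NOT EXISTS clicks (ad_id TEXT, clicked_at REAL)")


@app.route("/out/<ad_id>")
def track_and_redirect(ad_id):
    destination = AD_DESTINATIONS.get(ad_id)
    if destination is None:
        abort(404)
    # Log the click server-side before handing the visitor on.
    db.execute("INSERT INTO clicks VALUES (?, ?)", (ad_id, time.time()))
    db.commit()
    return redirect(destination, code=302)


if __name__ == "__main__":
    app.run(port=8000)
```

The ads on the page then link to /out/sidebar-1 and so on, and the invoice comes from counting rows in the clicks table for the month.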
Reporting & Analytics | | rrrobertsson0 -
Tool to check previous rankings for a keyword?
I am looking for a tool/site to check the ranking history of a specific keyword. Is this possible? Thanks
Reporting & Analytics | | bfletc40