How does Google measure page position in Webmasters?
-
Does anyone know exactly how Google measures page position in Webmaster Tools?
For example: in Google Webmaster Tools, we had a product page that was at position 7 on 22/12/15, dropped to position 112 on 30/12/15, rose back up to position 7 on 6/01/16, and then fell to position 25 on 16/01/16.
What does this mean and why?
-
Hi Li! Did Peter answer your question? If he did, please mark one or both of his responses as a "Good Answer."
-
Page position is volatile, as you can see. Select the page, then switch to the Queries view; this returns all of the keywords for which that URL was shown and clicked. If you then select a single query, you can see how that keyword's position dances in the SERPs. I've attached two screenshots (with the keywords and URL obfuscated) where you can see this. If you wish, you can dig further by grouping traffic by device (mobile/tablet/desktop) and by country.
So dancing up and down is normal. Google makes changes every day and keeps releasing new algorithms. Your site is also gaining links, social signals, and so on, but meanwhile your competitors in the SERPs are doing the same, and for some keywords you are playing against hundreds or thousands of other pages.
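If you want to see the trend behind the daily dance, one option is to export the average position per day from Search Analytics and smooth it yourself. Here is a minimal sketch in plain Python; the daily numbers are made up for illustration:

```python
def moving_average(values, window=3):
    """Smooth a series of daily average positions with a simple moving average."""
    smoothed = []
    for i in range(len(values)):
        # Average over the current day and up to (window - 1) preceding days.
        chunk = values[max(0, i - window + 1) : i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

# Example: daily average positions exported from Search Analytics (invented data)
daily = [7, 9, 112, 40, 7, 8, 25]
print([round(p, 1) for p in moving_average(daily)])
# -> [7.0, 8.0, 42.7, 53.7, 53.0, 18.3, 13.3]
```

A single-day spike shrinks in the smoothed series, while a sustained drop stays clearly visible, which helps separate normal SERP noise from a real ranking change.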
If you look at the algorithm change history, you can see that a new algorithm started rolling out in January as a "core update." No additional information was given, and it shook up the SERPs considerably. Your up/down movements are probably caused by it. At this point, no one can give you advice beyond "write great content and they will follow" and "follow the guidelines." You can also follow SEO experts on Twitter, Facebook, and G+, since they often share interesting findings about updates like this. For example:
https://twitter.com/dr_pete
https://twitter.com/CyrusShepard
https://twitter.com/randfis
-
Hi,
Thanks for your answer.
The positions we are looking at are all shown in the graph for this particular product page, using the page position metric:
22/12 8.5
3/1 50
5/1 8
12/1 7
13/1 2
14/1 11
Can you or anyone explain why this rank varies so much day by day?
-
Search Analytics is only one way to check what positions your keywords and/or pages hold for specific queries.
There are two ways of aggregating the data: by site or by page. They are described here:
https://support.google.com/webmasters/answer/6155685
Aggregated by site: if, for one query, our site returns three pages at positions 1, 2, and 3 and the user clicks one of them, CTR = 100% (one click against one site impression), and the average position is 1, because it is calculated from the highest position.
Aggregated by page: CTR = 33% (three results, one click), and the average position is (1 + 2 + 3) / 3 = 2.
Now, back to your question. It's normal for your positions to go up and down. Google makes changes to its algorithms constantly, around 400-500 per year. Some of them are noticeable; others go unseen. There are a few ways to stay informed about changes:
http://mozcast.com/ <- something like a weather report for Google; there are sunny days and rainy ones.
https://moz.com/google-algorithm-change <- one of the lists of changes; some are confirmed, others are not.
Watching the SEO community on Twitter, Facebook, and G+.
Reading SEO news sites.
Back to your question: I believe your site went up and down because of the updates on 6 Jan and 16 Jan (they are confirmed as a core update), but I don't know of any updates on 22 Dec or 30 Dec. Since the core update is still being analyzed by many industry-leading SEOs (even at Moz), I can't give you advice about it yet.
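The difference between the two aggregation modes described above can be sketched in a few lines of Python. This is a toy model of that one-query example, not Google's actual implementation:

```python
def aggregate_by_site(positions, clicks):
    """By-site aggregation: one impression for the whole site per query;
    the reported position is the highest (best) ranking."""
    impressions = 1
    ctr = sum(clicks) / impressions
    avg_position = min(positions)
    return ctr, avg_position

def aggregate_by_page(positions, clicks):
    """By-page aggregation: one impression per ranking page;
    the position is averaged over all of them."""
    impressions = len(positions)
    ctr = sum(clicks) / impressions
    avg_position = sum(positions) / len(positions)
    return ctr, avg_position

# The example from above: our site holds positions 1, 2 and 3 for one
# query, and the user clicks the first result.
positions = [1, 2, 3]
clicks = [1, 0, 0]

print(aggregate_by_site(positions, clicks))  # -> (1.0, 1): CTR 100%, position 1
print(aggregate_by_page(positions, clicks))  # -> CTR ~33%, position 2.0
```

The same clicks and rankings produce very different CTR and position numbers depending on the mode, so always check which aggregation a report is using before comparing dates.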
Related Questions
-
Crawl errors for pages that no longer exist
Hey folks, I've been working on a site recently where I took a bunch of old, outdated pages down. In the Google Search Console "Crawl Errors" section, I've started seeing a bunch of "Not Found" errors for those pages. That makes perfect sense. The thing that I'm confused about is that the "Linked From" list only shows a sitemap that I ALSO took down. Alternatively, some of them list other old, removed pages in the "Linked From" list. Is there a reason that Google is trying to inform me that pages/sitemaps that don't exist are somehow still linking to other pages that don't exist? And is this ultimately something I should be concerned about? Thanks!
Reporting & Analytics | | BrianAlpert780 -
Is it possible to use Google Tag Manager to pass a user’s text input into a form field to Google analytics?
Hey Everyone, I finally figured out how to use auto event tracking with Google Tag Manager, but didn't get the data I wanted. I want to see what users are typing into the search field on my site (the URL structure of my site isn't set up properly to use GA's built-in site search tracking). So, I set up the form submit event tracking in Google Tag Manager and used the following as my event tracking parameters: Category: Search Action: Search Value When I test and look in Google Analytics I just see: "search" and "search value." I wanted to see the text that I searched on my site. Not just the Action and Category of the event.... Is what I'm trying to do even possible? Do I need to set up a different event tracking parameter? Thanks everyone!
Reporting & Analytics | | DaveGuyMan0 -
Google Webmaster indicates robots.txt access error
Seems that Google has not been crawling due to an access issue with our robots.txt
Reporting & Analytics | | jmueller0823
Late 2013 we migrated to a new host, WPEngine, so things might have changed, however this issue appears to be recent. A quick test shows I can access the file. This is the Google Webmaster Tool message: http://www.growth trac dot com/: Googlebot can't access your site January 17, 2014 Over the last 24 hours, Googlebot encountered 62 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 8.8% Note the above message says 'over the last 24 hours', however the date is Jan-17 This is the response from our host:
Thanks for contacting WP Engine support! I looked into the suggestions listed below and it doesn't appear that these scenarios are the cause of the errors. I looked into the server logs and I was only able to find 200 server responses on the /robots.txt. Secondly I made sure that the server wasn't over loaded. The last suggestion doesn't apply to your setup on WP Engine. We do not have any leads as to why the errors occurred. If you have any other questions or concerns, please feel free to reach out to us. Google is crawling the site-- should I be concerned? If so, is there a way to remedy this? By the way, our robots file is very lean, only a few lines, not a big deal. Thanks!0 -
Anyone else having webmaster tools delays?
My errors just stopped updating on April 25. Everything else is updating normally. Did something change?
Reporting & Analytics | | EcommerceSite0 -
Goal tracking in Google Analytics
Hi folks, I read from various sources that if you set up goals in Google Analytics, each of these goals can only be fulfilled once per visit. Some sources also suggest that only one goal from each goal group can be fulfilled per visit. On our site we have a goal for external links, since this provides value to partners. Some users open an external link in a new tab, then come back to the main site. Any further goal completions would then not get tracked. Since we apply a result-based payment model for our work, this means we are literally losing money. Does anyone have official info from Google on this? Can it be configured? How long is a visit? Thanks a million and have a great day. Fredrik
Reporting & Analytics | | Resultify0 -
I have two campaigns that are only crawling one page, why is this?
I have a total of three campaigns running right now, and two of them are only crawling one page. I set the campaigns up the same, what is the problem?
Reporting & Analytics | | SiteVamp0 -
How do I best segment tablets on Google Analytics
I would like to find a way to best segment out my tablet traffic to measure performance; however I'm finding that there are road blocks. It doesn't seem that device operating systems or screen resolutions have clear cut differences in the tablet/mobile versions. Has anyone here found a good way to create a "tablet" segment in Google Analytics? Right now I'm having to lean on solely the ipad traffic to get indicators of tablet performance. Thanks!
Reporting & Analytics | | lvstrickland0 -
Backlinks on Google Webmaster Tools
As I was reviewing my Google Webmaster Tools, I noticed a major drop in the number of backlinks. It used to show over 4,500 links to my site, yet the other day it dropped down to 41. None of the directories I've submitted to are showing, nor any blog comments I've posted. Since then, my SEO traffic has started to drop for some keywords. Does anyone know why?
Reporting & Analytics | | MikeAndres0