Thanks Joel,
In terms of the expected swing month-on-month, is there any kind of ballpark figure to compare against? 14% seemed pretty high.
Thanks
Darroch
I just ran into a problem that I hadn't expected. Testing Keyword Difficulty, I saw the results contained a page with Domain Authority = 1 and Page Authority = 1.
As a result, Keyword Difficulty was reduced compared to last month, a drop that may well be reversed once the site is crawled. Sadly, I didn't keep a report of the figures, as it was a small project.
Questions:
The problem
Using the Keyword Difficulty tool, I found swings of up to 14% in Keyword Difficulty between October and November. Dr Pete might suggest that this is down to changes in Google's index ( http://www.seomoz.org/blog/a-week-in-the-life-of-3-keywords ). However, it would be helpful to have a Keyword Difficulty figure that isn't affected by the gaps in the Mozscape data.
The (bad) solution
You can mirror something close to Keyword Difficulty using:
=(Sum of Page Authorities + Sum of Domain Authorities)/20
Right now, I have resorted to calculating keyword difficulty manually, using the SEOMoz Page Authority and Domain Authority figures and a quick splash of Excel SUMIF and COUNTIF.
I find the results don't look as 'easy' once I ignore results where the data is unknown (Page Authority = 1 and Domain Authority = 1).
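For anyone who'd rather script this than lean on Excel, here's a minimal sketch of the same idea in Python. The function name, the sample figures, and the assumption that the /20 comes from averaging two authority scores across the top 10 results are all mine, not SEOMoz's; it illustrates the workaround, not the official formula.

```python
# Rough proxy for Keyword Difficulty from exported PA/DA figures.
# Assumption (mine, not SEOMoz's): the tool looks at the top 10 results,
# so (sum of 10 PAs + sum of 10 DAs) / 20 is just the mean authority score.

def keyword_difficulty_proxy(results):
    """results: list of (page_authority, domain_authority) tuples for the
    top-ranking pages. Skips results where both PA and DA equal 1,
    i.e. pages Mozscape hasn't crawled yet."""
    known = [(pa, da) for pa, da in results if not (pa == 1 and da == 1)]
    if not known:
        return None  # no usable data at all
    total = sum(pa + da for pa, da in known)
    return total / (2 * len(known))  # mean over 2 scores per known result

# Example: one uncrawled result (PA=1, DA=1) among the top 10.
serp = [(45, 60), (38, 55), (1, 1), (50, 70), (30, 40),
        (42, 58), (25, 35), (33, 47), (60, 80), (28, 39)]
print(keyword_difficulty_proxy(serp))  # higher than the naive /20 figure
```

Skipping the PA=1/DA=1 rows stops a single uncrawled page from dragging the score down, which is exactly the month-on-month swing described above.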
Background Info
One result I still have a report for is the phrase [fixing your business puzzle], using US results on Google. For the specific result, I found the following additional information about the site:
WHOIS lookup shows the domain was registered in 2010
Archive.org shows no records
OSE shows no data for the site
Site uses https
Google shows no links
Robots.txt file seems fine
No Sitemap.xml
It does seem that the figures change depending on how many results you request, but as Russ mentioned, they are indications/estimates.
What I'd be more interested in is which pages Google Webmaster Tools knows about from your sitemap.xml (since these are the pages you consider important). With a larger site, you could also split your sitemaps into sections, making it easier to see which sections are poorly indexed.
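To make that split concrete, here's a minimal sketch in Python with made-up section names and filenames; the sitemap index format itself is the standard one from sitemaps.org.

```python
# Hypothetical sketch: write a sitemap index that points to one
# sitemap per site section, so Webmaster Tools can report indexed
# counts per section. Section names and filenames are made up.

sections = ["blog", "products", "support"]

index_entries = "\n".join(
    "  <sitemap><loc>http://www.example.com/sitemap-{0}.xml</loc></sitemap>".format(s)
    for s in sections
)

sitemap_index = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + index_entries + "\n"
    "</sitemapindex>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap_index)
```

Submitting each section's sitemap to Google Webmaster Tools separately then gives you an indexed-versus-submitted count per section.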
For more resources on URL parameters:
http://searchengineland.com/google-power-user-tips-query-operators-48126
http://searchengineland.com/google-power-user-tips-serp-url-parameters-49736
One suggestion for who to follow is to listen to the people SEOMoz trusted for the recent Search Engine Ranking Factors 2011. I've compiled the contributors into an easy-to-follow Twitter list: http://twitter.com/#!/darrochreid/seo-insights
Since you presumably don't care about their eating habits, I also created a filtered version with only the SEO/search-related updates. It refreshes each day, so it should be more practical: http://paper.li/darrochreid/1308375741
For Web Design I'd follow these 5 folk:
@smashingmag
@alistapart
@zeldman
@Malarkey
@paul_irish
Enjoy
Darroch