Searches & Clicks Research
-
Is there a way to check the percentage of clicks on specific websites based on the searches that people do? For example, say I searched "sneakers" - what percentage of searchers clicked on a particular site?
-
Thanks!
-
There is a company in the UK that offers a tool that does this. Not sure if this is the right link, but the tool is part of Experian.
http://www.experian.co.uk/integrated-marketing/web-analytics.html
They called me a month or so ago to demo it. It had amazing data but was extremely expensive (circa £10-50k per year, if I remember correctly).
-
I do not know of such a tool - maybe try SEMRush? They have a lot in the way of competitive analysis.
-
I mean for all sites, i.e. competitors.
-
You mean for your own site? You can see this in both Bing and Google WMT.
-
Thank you - this is general info. I was wondering if there's an actual tool to see the click-through rate for certain keywords.
-
You could use the percentages from any of the click-through rate reports out there for a rough guess:
Coconut Headphones (there's a 2nd part to this article too)
Bear in mind, everyone's reports are always a bit different. There are so many variables in estimating click-through rate that it's nearly impossible to come up with exact percentages across the board; they can vary by industry, the number of PPC ads, local search vs. general search, whether there are videos or images in the results, etc.
But I hope those links help!
-Dan
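As a rough illustration of how percentages from a CTR-by-position report could be turned into click estimates, here is a minimal sketch. The CTR values and the 2% fallback below are illustrative placeholders, not figures from any particular study - substitute numbers from whichever report you trust:

```python
# Rough click estimate from a published CTR-by-position curve.
# The percentages below are illustrative placeholders only.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimate_clicks(monthly_searches: int, position: int) -> int:
    """Estimate monthly organic clicks for a keyword at a given rank."""
    ctr = CTR_BY_POSITION.get(position, 0.02)  # assume ~2% below position 5
    return round(monthly_searches * ctr)

# e.g. a keyword with 10,000 monthly searches, ranking at position 2:
print(estimate_clicks(10_000, 2))  # -> 1500
```

As the answer above notes, any such estimate is only as good as the CTR curve you plug in, and real CTRs shift with SERP features, ads, and intent.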
Related Questions
-
How to do effective keyword research with categories and subcategories?
Hi all, I'm trying to break down some SEO 101 tips and start from scratch, starting with keywords! I would like to audit our site for main keywords, grouping them into categories and subcategories. My questions are: 1. Is it possible to see where we rank on Google AND the search trends of visits to our site? 2. What is a good method or structure for documenting this (Excel?) 3. What analysis can be made from the results for these keywords, and how can I make use of it? As a beginner, your help is much appreciated!
Algorithm Updates | Eric_S
-
Has anyone seen any research regarding URL structure correlating with or impacting RankBrain results?
We are currently writing some "RankBrain-friendly" content and were wondering if anyone had seen or conducted research on best URL structure practices. Any insights would be appreciated! Thanks, Zach
Algorithm Updates | Chris-241753
-
Fred Google Update & Ecommerce Sites
Hi, I've seen a couple of areas of our site drop in average rankings since the 'Fred' update. We don't have ads on our site, but I'm wondering if it's 'thin' content - http://www.key.co.uk/en/key/ We are an ecommerce site and we have some content on our category pages - a bit more generic, about the section/products within that section - but how can it not be, if it's a category page with products on it? I am working on adding topic-based content/user guides etc. to be more helpful for customers, but I'd love some advice on generating traffic to category pages. Is it better to rank these other topic/user guide pages instead of the category page & then hope the customer clicks through to products? Advice welcome 🙂
Algorithm Updates | BeckyKey
-
Cached status (date & time) not showing
I have a website, http://weddingsbylydia.com/. When I use the command site:weddingsbylydia.com in Google, the cached status of the page (date & time) is not showing. Please help me understand the reason behind this.
Algorithm Updates | 1akal
-
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, A client site has had a very poor link legacy, stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow submission to Google have been done; however, after months and months of waiting, nothing has happened. If anything, after the recent Penguin update, results have been further affected.
A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, with little success. I have read up on this and not many people appear to agree on whether this will work.
Therefore, my new decision is to start afresh using a new domain, switching from the .com to the .co.uk version, helping remove all legacy and all association with the spam-ridden .com. However, my main concern is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link related and NOT content related, as I imagine people may first query.
This could then cause duplicate content, knowing that this content pre-existed on another domain. I will implement a robots.txt file blocking all of the .com site, as well as a noindex, nofollow - and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Then, once it has been deindexed, the new .co.uk site will go live with the exact same content.
So my question is whether Google will then completely forget that this content has ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience in the removal of a 301 redirect, detaching legacy and its success, would also be very helpful!
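For reference, the blocking-and-noindexing setup this question describes is usually sketched roughly like this (the file contents below are a generic illustration, not the poster's actual configuration):

```text
# robots.txt on the old .com -- blocks all crawling
User-agent: *
Disallow: /
```

```html
<!-- On each old .com page -->
<meta name="robots" content="noindex, nofollow">
```

One caveat worth knowing: a page blocked by robots.txt can't be recrawled, so crawlers may never see the noindex tag on it. For that reason, the Webmaster Tools removal request plus noindex (without the robots.txt block) is generally considered the more reliable combination for getting pages deindexed.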
Thank you, Denver
Algorithm Updates | ProdoDigital
-
Dropped off a cliff for a particular keyword & can't find out why
At the beginning of December we ranked consistently in the top 3 for the keyword 'Suffolk' for the site www.suffolktouristguide.com (page rank 4, thousands of quality inbound links, site age 5+ years). Since then we've been falling off a cliff, and today we aren't even in the top 50 for this search term, but most of our other search terms are unaffected. Our SEOMoz grade remains A for 'Suffolk' and we haven't changed anything in that time that could have had such a material effect (knowingly, at least). A similar issue happened to my other site www.suffolkhotelsguide.com back in April, and it hasn't recovered despite grade A's on the homepage and key pages. We've checked internal broken links, page download times, external links (used the disavow tool and a reconsideration request, and got back 'We reviewed your site and found no manual actions by the webspam team that might affect your site's ranking in Google'), etc. Any thoughts on what I can try next? All suggestions appreciated, as I am completely stuck (& have spent a fortune on 'SEO experts' to no effect).
Algorithm Updates | SarahinSuffolk
-
Local SEO url format & structure: ".com/albany-tummy-tuck" vs ".com/tummy-tuck" vs ".com/procedures/tummy-tuck-albany-ny" etc."
We have a relatively new site (re: August '10) for a plastic surgeon who opened his own solo practice after 25+ years with a large group. Our current URL structure goes 3 folders deep to arrive at our tummy tuck procedure landing page. The site architecture is solid, and each plastic surgery procedure page (e.g. rhinoplasty, liposuction, facelift, etc.) is no more than a couple of clicks away. So far, so good - but given all that is known about local SEO (which is a very different beast than national SEO), quite a bit of on-page/architecture work can still be done to further improve our local rank. So here are a couple of big questions facing us at present:
First, regarding format, is it a given that using geo keywords within the URL indisputably and dramatically improves a site's local rank (e.g. the #2 result for "tummy tuck" and its SHENANIGANS-level use of "NYC", "Manhattan", "newyorkcity", etc.)? Assuming that it is, would we be better off updating our cosmetic procedure landing page URLs to "/albany-tummy-tuck" or "/albany-ny-tummy-tuck" or "/tummy-tuck-albany", etc.?
Second, regarding structure, would we be better off locating every procedure page within the root directory (re: "/rhinoplasty-albany-ny/") or within each procedure's proper parent category (re: "/facial-rejuvenation/rhinoplasty-albany-ny/")? From what I've read within the SEOmoz Q&A, adding that parent category (e.g. "/breast-enhancement/breast-lift") is better than having every link in the root (i.e. completely flat).
Third, how long before Google updates their algorithm so that geo-optimized URLs like http://www.kolkermd.com/newyorkplasticsurgeon/tummytucknewyorkcity.htm don't beat other sites who do not optimize so aggressively for local?
Fourth, assuming that each cosmetic procedure page will eventually have a strong link profile (via diligent, long-term link building efforts), is it possible that geo-targeted URLs will negatively impact our ability to rank for regional or less geo-specific searches? Thanks!
Algorithm Updates | WDeLuca