Searches & Clicks Research
-
Is there a way to check the percentage of clicks on specific websites based on searches that people do? For example, say I searched "sneakers" — what percentage of searchers clicked on a particular site?
-
Thanks!
-
There is a company in the UK that offers a tool that does this. Not sure if this is the right link, but the tool is part of Experian.
http://www.experian.co.uk/integrated-marketing/web-analytics.html
They called me a month or so ago to demo it. It had amazing data but was extremely expensive (circa £10-50k per year, if I remember correctly).
-
I do not know of such a tool - maybe try SEMrush? They have a lot in the way of competitive analysis.
-
I mean for all sites, i.e. competitors.
-
You mean for your own site? You can see this in both Bing and Google webmaster tools.
-
Thank you - this is general info. I was wondering if there's an actual tool to see the click-through rate for specific keywords.
-
You could use the percentages from any of the click-through rate reports out there for a rough guess:
Coconut Headphones (there's a 2nd part to this article too)
Bear in mind, everyone's reports are always a bit different. There are so many variables in estimating click-through rate that it's nearly impossible to come up with exact percentages across the board - they can vary by industry, number of PPC ads, local search vs. general search, whether there are videos or images in the result, etc.
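The rough-guess approach above can be sketched in a few lines: take a keyword's monthly search volume, multiply by an assumed CTR for your ranking position, and treat the result as a ballpark, not a measurement. The CTR figures below are illustrative placeholders, not data from any particular study — swap in the numbers from whichever CTR report you trust.

```python
# Ballpark click estimate: search volume x assumed CTR for the ranking position.
# The percentages here are made-up placeholders for illustration; replace them
# with figures from a CTR study of your choice.
ASSUMED_CTR_BY_POSITION = {
    1: 0.30,
    2: 0.15,
    3: 0.10,
    4: 0.07,
    5: 0.05,
}

def estimate_monthly_clicks(search_volume: int, position: int) -> int:
    """Return a rough monthly click estimate for a keyword at a given position."""
    # Positions outside the table get a small long-tail fallback CTR.
    ctr = ASSUMED_CTR_BY_POSITION.get(position, 0.02)
    return round(search_volume * ctr)

# e.g. a keyword with 10,000 monthly searches, ranking at position 3:
print(estimate_monthly_clicks(10_000, 3))  # 1000
```

Because of all the variables mentioned above (PPC ads, SERP features, industry), treat this as an order-of-magnitude sanity check rather than a prediction.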
But hope those links help!
-Dan