Best free tools for checking keyword cannibalization?
-
Hi guys, I have a keyword cannibalization issue. Please recommend the best free tools for checking keyword cannibalization.
-
When that happens here, we improve both (or all) of those pages, then go out for beers.
-
That is unusual - is that data from Search Console? Can you let me know what data you are relying on that makes you believe multiple pages from the same site are attracting the same customer query?
-
Thanks. I have multiple pages ranking for similar keywords.
-
The best tool is Search Console, but I'm not sure what precisely constitutes a keyword cannibalisation issue in your case. Is it that you have multiple pages ranking for one keyword, none of them in the top 3?
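To make that check concrete: one free approach is to export query/page data from Search Console's Performance report (or pull it via the Search Console API) and flag any query where more than one page is getting traffic. A minimal sketch in Python, assuming rows shaped like the export (the field names and URLs here are illustrative):

```python
from collections import defaultdict

def find_cannibalized(rows, min_pages=2):
    """Group Search Console rows by query and return the queries
    where two or more distinct pages are receiving clicks."""
    pages_by_query = defaultdict(set)
    for row in rows:
        pages_by_query[row["query"]].add(row["page"])
    return {q: sorted(p) for q, p in pages_by_query.items()
            if len(p) >= min_pages}

rows = [
    {"query": "blue widgets", "page": "/widgets/blue", "clicks": 40},
    {"query": "blue widgets", "page": "/blog/blue-widget-guide", "clicks": 12},
    {"query": "red widgets", "page": "/widgets/red", "clicks": 30},
]
print(find_cannibalized(rows))
# → {'blue widgets': ['/blog/blue-widget-guide', '/widgets/blue']}
```

Queries that come back from this are your candidates: decide which page should own the keyword, then consolidate or differentiate the others.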
Related Questions
-
What is the most effective way of selecting a top keyword per page on a site?
We are creating fresh content for outdated sites, and I need to identify the most significant keyword per page for the content developers. What is the best way to do this?
Reporting & Analytics | | Sable_Group0 -
Search Keywords, Meta Keywords and Meta Descriptions; Keeping Webmaster Tools Current
What are search keywords, meta keywords and meta descriptions? What exactly is the difference between them, and which one is more important? With regard to Webmaster Tools: if we delete a page or a product, it still shows up in Search Analytics. How can we update Webmaster Tools to keep it current with our website? Lastly, also in Search Analytics: at the moment we put relevant queries into the meta description of low-ranking pages in order to raise the position of those pages. Is this the right way to handle queries? Should we be putting the queries into the meta description or the meta keywords?
Reporting & Analytics | | CostumeD0 -
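For reference, both tags live in the page's `<head>`. The meta description is often shown as the snippet in search results (so it affects click-through, not ranking), while Google has ignored the meta keywords tag for web ranking since 2009. An illustrative fragment (the content values are made up):

```html
<head>
  <!-- Often shown as the search snippet; write it for users, it does not affect ranking -->
  <meta name="description" content="Shop 500+ Halloween costumes for kids and adults. Free shipping over $50.">
  <!-- Ignored by Google for ranking since 2009; generally safe to omit entirely -->
  <meta name="keywords" content="halloween costumes, kids costumes">
</head>
```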
Cannot split a domain into different properties in Search Console (Webmaster Tools)
Dear Moz Community, I hope you can give me a hand with the following questions. I'm in charge of SEO for an ecommerce site in LATAM. Its service is available in several countries, so each country has its own subdirectory, e.g. /ar /pe /co /bo /cl /br, etc. (in the future we will move to different ccTLDs). I have been recommended to split the site into different Search Console (Webmaster Tools) properties, one for each subdirectory, but when I create a new property for a subdirectory, let's say www.domain.com/ar, Webmaster Tools creates a property for www.domain.com/ar/ (NOTICE THE LAST SLASH) and returns an error, since that page doesn't exist. What do you recommend I do? Best wishes, Pablo Lòpez C
Reporting & Analytics | | pablo_carrara0 -
How can I remove parameters from the GSC URL blocking tool?
Hello Mozzers! My client's previous SEO company went ahead and blindly blocked a number of parameters using the GSC URL blocking tool. This has caused Google to stop crawling many pages on my client's website, and I am not sure how to remove these blocked parameters so the pages can be crawled and reindexed by Google. The crawl setting is set to "Let Googlebot decide", but there has still been a drop in the number of pages being crawled. Can someone please share their experience and help me delete these blocked parameters from GSC's URL blocking tool? Thank you, Mozzers!
Reporting & Analytics | | Vsood0 -
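As far as I know there is no public API for the URL Parameters tool, so the settings have to be reverted in the GSC interface itself. While waiting for a recrawl, one sanity check you can script is confirming that robots.txt isn't *also* blocking the parameterised URLs - a sketch with Python's standard library (the rules and URLs here are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Parse robots.txt rules directly; in practice you would point this at
# the live file with rp.set_url("https://example.com/robots.txt"); rp.read()
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /checkout",
])

# A parameterised URL we want Google to crawl again
print(rp.can_fetch("Googlebot", "https://example.com/shoes?color=red"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/checkout"))         # False
```

If robots.txt is clean, the remaining delay is just Google rescheduling the crawl after the parameter settings are reverted.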
Changing URL Parameters in Webmaster Tools
We have a bit of a conundrum. Webmaster Tools is telling us that it is crawling too many URLs: "Googlebot found an extremely high number of URLs on your site: http://www.uncommongoods.com/". In their list of URL examples, all of the URLs have tons of parameters. We would probably be OK telling Google not to index any of the URLs with parameters. We have a great URL structure: all of our category and product pages have clean links (no parameters). The parameters come only from sorts and filters, and we don't need Google to index all of those pages. However, Google Analytics shows that over the last year we received a substantial amount of search revenue from many of these URLs (800+ of them converted). So, Google is telling us they are unhappy. We want to make Google happy by ignoring all of the parameter URLs, but we're worried this will kill the revenue we're seeing. Two questions here: 1. What do we have to lose by keeping everything as-is? Google is giving us errors, but other than that, what are the negative repercussions? 2. If we were to de-index all of the parameter URLs via Webmaster Tools, how much of the revenue would likely be recovered by our non-parameter URLs? I've linked to a screenshot from Google Analytics: ArxMSMG.jpg
Reporting & Analytics | | znotes0 -
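One way to size the risk before de-indexing is to export landing-page revenue from Google Analytics and measure what share of it comes from URLs carrying a query string. A rough sketch, assuming (url, revenue) pairs from such an export (the URLs and figures are made up):

```python
from urllib.parse import urlparse

def parameter_revenue_share(rows):
    """Return the fraction of total revenue earned by landing pages
    whose URL carries a query string (sorts, filters, etc.)."""
    total = param = 0.0
    for url, revenue in rows:
        total += revenue
        if urlparse(url).query:
            param += revenue
    return param / total if total else 0.0

rows = [
    ("http://www.example.com/gifts", 900.0),
    ("http://www.example.com/gifts?sort=price", 100.0),
]
print(parameter_revenue_share(rows))  # 0.1
```

A small share suggests most of that revenue would likely be recaptured by the clean URLs; a large share argues for keeping the parameter pages indexed and handling duplication with canonical tags instead.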
Best way to handle ignored Rel=Canonical?
My Google Analytics is reporting organic traffic for URLs with a query string attached, even though there's a canonical tag that points to the preferred (non-query-stringed) version. Would the best way to handle this be the GWT URL Parameters tool? I'm fairly unfamiliar with the tool, but after some research it looks like this might be the best way to go. Does anyone have any good/bad advice for using the tool? Thanks!
Reporting & Analytics | | GalcoIndustrial1 -
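Worth noting: analytics reports the URL the visitor actually landed on, so query-stringed entries are expected even when the canonical tag is working. A quick way to see what the report would look like with canonicals respected is to strip query strings and re-aggregate the rows yourself - a sketch assuming (url, sessions) pairs (the URLs are illustrative):

```python
from collections import defaultdict
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Drop the query string and fragment, mirroring a canonical tag
    that points at the bare URL."""
    s = urlsplit(url)
    return urlunsplit((s.scheme, s.netloc, s.path, "", ""))

def consolidate(rows):
    """Sum sessions under each canonicalized URL."""
    totals = defaultdict(int)
    for url, sessions in rows:
        totals[canonicalize(url)] += sessions
    return dict(totals)

rows = [
    ("http://example.com/page", 50),
    ("http://example.com/page?utm_source=x", 20),
]
print(consolidate(rows))  # {'http://example.com/page': 70}
```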
Best top 5 actionable insights from Google Analytics
Buongiorno from 14 degrees C, cloudy Wetherby UK 🙂 With a whole gamut of books out there written around the topic of "actionable web analytics", where you have to wade through chapters on how to suck eggs and how to make analytics part of your business culture (whatever! most clients who look after their websites are entry-level marketing execs who pay the digital agencies to do the thinking for them), I'd like to cut to the chase and list a top 5 of actionable insights a client would actually give the time of day. So here's my top five; what's yours?
1. Landing page bounce - showing them the £5k landing page designed with no conversion thinking is turning off customers
2. Syncing up PPC campaigns to analytics and showing them how much money they are wasting
3. Goal funnels - showing them their 20-step shopping conversion funnel gets abandoned at step 3
4. Non-brand keyword traffic (although a stack of this gets blocked by "not provided". Big groan...)
5. Event tracking on their 10-page carousel banner, showing them no one clicks on banners 2-10
So, under the strict label of "actionable data only" from Google Analytics, what are the best insights you share with clients in the hope that they may actually give data reports the time of day? Grazie tanto,
David
Reporting & Analytics | | Nightwing1 -
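The goal-funnel insight (point 3 in the list above) is easy to quantify: given the number of visitors who reached each funnel step, compute the drop-off rate between consecutive steps to find exactly where people abandon. A sketch with made-up step counts:

```python
def funnel_dropoff(step_counts):
    """Return the fractional drop-off between each pair of consecutive steps."""
    return [
        (before - after) / before
        for before, after in zip(step_counts, step_counts[1:])
        if before  # skip steps nobody reached to avoid dividing by zero
    ]

# Visitors reaching each of four checkout steps (illustrative numbers)
steps = [1000, 800, 200, 180]
print(funnel_dropoff(steps))  # [0.2, 0.75, 0.1]
```

Here the 75% drop between steps 2 and 3 is the actionable finding: fix that step before touching anything else.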
Is Webmaster Tools useless as a broken link detector?
Buongiorno from "yes, we still have free parking" Wetherby UK! OK, when it comes to detecting broken links I'm getting really frustrated with Webmaster Tools. Now I'm probably going to end up with egg on my face with this one, but here is an example of Webmaster Tools reporting a broken link which I can't find: http://i216.photobucket.com/albums/cc53/zymurgy_bucket/phantom-broken-links_zpsb74e1246.jpg Having trawled through the code, I just can't see the knackered link. Is it a phantom report, or is something useful being detected here? Grazie tanto,
David
Reporting & Analytics | | Nightwing1
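When Webmaster Tools reports a broken link you can't spot by eye, it is often injected by JavaScript, buried in a template include, or present only on an old cached copy of the page. A scripted pass over the served HTML beats trawling the source by hand - a minimal sketch using Python's standard library (you would then request each extracted URL and look for 404s; the sample HTML is made up):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = '<p><a href="/good-page">ok</a> <a href="/tyop-page">oops</a></p>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # ['/good-page', '/tyop-page']
```

Comparing the extracted list against the URL Webmaster Tools complains about will tell you whether the link really exists in the served HTML or whether the report is stale.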