What is the best tool for checking keyword cannibalization?
-
Hi all, I have a keyword cannibalization issue. Please suggest the best free tools for checking keyword cannibalization.
-
When that happens here, we improve both (or all) of those pages, then go out for beers.
-
That is unusual. Is that data from Search Console? Can you let me know what data you are relying on that leads you to believe multiple pages from the same site are attracting the same customer query?
-
Thanks. I have multiple pages ranking for similar keywords.
-
The best tool is Search Console, but I'm not sure what precisely constitutes a keyword cannibalisation issue in your case. Is it that you have multiple pages ranking for one keyword, none of them in the top 3?
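To make the Search Console approach concrete, here is a minimal sketch of flagging cannibalized queries from a Performance-report export. The rows below are made-up sample data, not real export output; a real check would read the exported CSV instead.

```python
# Hedged sketch: given (query, page, clicks) rows exported from Search
# Console's Performance report, flag queries where more than one page ranks.
# The sample rows are invented for illustration.
from collections import defaultdict

rows = [
    ("blue widgets", "https://example.com/widgets/blue", 120),
    ("blue widgets", "https://example.com/blog/blue-widget-guide", 45),
    ("red widgets", "https://example.com/widgets/red", 80),
]

def find_cannibalized(rows):
    """Return {query: [(page, clicks), ...]} for queries with 2+ ranking pages,
    pages sorted by clicks descending."""
    by_query = defaultdict(list)
    for query, page, clicks in rows:
        by_query[query].append((page, clicks))
    return {q: sorted(pages, key=lambda p: -p[1])
            for q, pages in by_query.items() if len(pages) > 1}

for query, pages in find_cannibalized(rows).items():
    print(query)
    for page, clicks in pages:
        print(f"  {page} ({clicks} clicks)")
```

The per-query click split also shows which page Google already favors, which is usually the one to keep and strengthen.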
Related Questions
-
Can anyone help me check the SEO of my website?
Hi, I hope you are doing well. I want to know how my website looks. Is it attractive or boring for visitors? I really need your feedback to improve my website. My website is: https://sortscut.com/
Reporting & Analytics | | Pauline210 -
Cannot divide a domain into different properties in Search Console (Webmaster Tools)
Dear Moz Community, I hope you can give me a hand with the following questions. I'm in charge of SEO for an ecommerce site in LATAM. Its service is available in several countries, so each country has its own subdirectory, e.g. /ar /pe /co /bo /cl /br, etc. (in the future we will move to different ccTLDs). I have been recommended to split into different Search Console (Webmaster Tools) properties, one for each subdirectory, but when I create a new property for a subdirectory, let's say www.domain.com/ar, Webmaster Tools starts creating a property for www.domain.com/ar/ (NOTICE THE LAST SLASH) and it returns an error since that page doesn't exist. What do you recommend I do? Best wishes, Pablo Lòpez C
Reporting & Analytics | | pablo_carrara0 -
Google Webmaster Tools hiccup?
Our flagship website, up until March 16, was getting 1,600 impressions and 300 branded clicks per day as per GWT. After 3/16, branded search fell to 300 impressions and 25 clicks per day. Our rankings haven't changed, and neither has our traffic. We would definitely notice the decline in GA and Core Metrics, and both are running about the same as before. According to GWT, 75% fewer people started searching for our brand on 3/16, but all of our other metrics indicate otherwise. Has anyone seen this before? Is it a tracking issue on our side?
Reporting & Analytics | | AMHC0 -
Changing URL Parameters in Webmaster Tools
We have a bit of a conundrum. Webmaster Tools is telling us that it is crawling too many URLs: "Googlebot found an extremely high number of URLs on your site: http://www.uncommongoods.com/". In their list of example URLs, all of the URLs have tons of parameters. We would probably be OK telling Google not to index any of the URLs with parameters. We have a good URL structure: all of our category and product pages have clean links (no parameters), and the parameters come only from sorts and filters. We don't need Google to index all of these pages. However, Google Analytics is showing us that over the last year we received a substantial amount of search revenue from many of these URLs (800+ of them converted). So Google is telling us it is unhappy. We want to make Google happy by ignoring all of the parameter URLs, but we're worried this will kill the revenue we're seeing. Two questions here: 1. What do we have to lose by keeping everything as-is? Google is giving us errors, but other than that, what are the negative repercussions? 2. If we were to de-index all of the parameter URLs via Webmaster Tools, how much of the revenue would likely be recovered by our non-parameter URLs? I've linked to a screenshot from Google Analytics: ArxMSMG.jpg
Reporting & Analytics | | znotes0 -
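A small sketch of the clean-URL vs. parameter-variant distinction the question above describes: stripping the query string maps each sort/filter variant back to the page you would want indexed. The example URL and parameter names are illustrative, not taken from the actual site.

```python
# Hedged sketch: separating clean URLs from sort/filter parameter variants.
# The query parameters here (sort, color) are hypothetical examples.
from urllib.parse import urlparse, urlunparse

def canonical_form(url):
    """Strip query and fragment so a sort/filter variant maps to its clean URL."""
    parts = urlparse(url)
    return urlunparse(parts._replace(query="", fragment=""))

def is_parameter_variant(url):
    """True when the URL carries query parameters (e.g. sorts or filters)."""
    return bool(urlparse(url).query)

url = "http://www.uncommongoods.com/category/gifts?sort=price&color=red"
print(is_parameter_variant(url))  # a sort/filter variant
print(canonical_form(url))        # the clean page to keep indexed
```

This mapping is also what a `rel="canonical"` tag on each variant would express to Google, which preserves the variant pages (and any revenue they drive) while consolidating indexing onto the clean URL.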
Is Google turning off Webmaster Tools search data?
My Webmaster Tools account has stopped showing data past 9/23, which is a full week old. Typically it is just a few days behind schedule. Is Google cutting this off?
Reporting & Analytics | | gametv0 -
Selecting a keyword strategy
Hi, when planning a keyword strategy I look at the following parameters: 1. Using Google Analytics, I check which queries cause my website to appear in search results. 2. I try to understand what message the website (the company) wants to deliver. 3. I try to find the best keywords with lower competition, to minimize costs. My questions: 1. Are there any other parameters I missed? 2. What is the best way to check the competition and cost for each keyword before I select it? Thank you very much, SEOwiseUs
Reporting & Analytics | | iivgi0 -
What is best practice for tracking RSS feed subscribers?
What is the most accurate/achievable way of tracking data about subscribers to your RSS feed through Google Analytics? With standard WordPress sites, we point the RSS link at FeedBurner so we can track statistics. However, it wouldn't track the way that I use it. I use Pulse on an Android tablet to read my feeds offline on the bus each morning. At home, Pulse automatically downloads the latest feeds wirelessly overnight, so I can read them without a connection. The obvious downside for my reading experience is that I only get what is contained in the feeds. If a company only includes an excerpt, it's too annoying to read the teaser and be unable to connect and follow a link, so I only subscribe to feeds that contain the full post. Yay to SEOmoz, aimClear, SEL, and the AdWords blog. I don't subscribe to Bruce Clay's blog, much as I'd like to, because it doesn't contain the full feed. That's probably deliberate on their part, because I have to consciously visit their blog on my desktop at work to see the whole post. The other problem with, say, Pulse is how it locates the feed. I typed in the URL, and Pulse subscribed me. I assume that Pulse simply looked for the domain.com/feed URL and added that, rather than look for feeds2.feedburner.com/domain. I looked at FeedBurner stats and they didn't go up for 2 days, so basically it didn't track me. Would it be as simple as using the Google URL builder to add parameters to each post in the RSS feed? E.g. utm_source=feedreader, utm_medium=rss, utm_campaign=tracking. But that still wouldn't track offline users. I assume that most people are also not going to paste the FeedBurner URL into their feed reader, but would let the platform auto-detect the feed. Any suggestions?
Reporting & Analytics | | ozgeekmum1 -
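The URL-builder idea in the question above can be sketched as a small helper that appends UTM parameters to each post link before it goes into the feed. The parameter values are the ones the post suggests; the post URL is a made-up example, and (as the poster notes) this only records visits when the reader eventually clicks through online.

```python
# Hedged sketch: tagging RSS post links with the UTM parameters suggested
# above (utm_source=feedreader, utm_medium=rss, utm_campaign=tracking).
# The post URL is a made-up example.
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def add_utm(url, source="feedreader", medium="rss", campaign="tracking"):
    """Append UTM parameters, preserving any query string already on the URL."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({"utm_source": source, "utm_medium": medium,
                  "utm_campaign": campaign})
    return urlunparse(parts._replace(query=urlencode(query)))

tagged = add_utm("https://example.com/blog/post-1")
print(tagged)
```

In a WordPress context this kind of rewrite would typically hook into feed generation (e.g. a filter on post permalinks when the feed is rendered), so every subscriber gets tagged links regardless of which reader auto-detected the feed.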
Is there a tool to automatically gather website SEO data?
I am looking for a tool that will crawl a website and create a spreadsheet listing all key data such as title, meta description, etc. Does anyone know of an available tool that does that?
Reporting & Analytics | | jfeitlinger0
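The extraction half of what the question above asks for can be sketched with the standard library alone: pull the title and meta description out of a page's HTML. A real crawler would also fetch pages over HTTP and follow links; this parses a sample string for illustration.

```python
# Hedged sketch: extracting the on-page SEO fields the question mentions
# (title, meta description) using only the standard library.
from html.parser import HTMLParser

class SEODataParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

sample_html = """<html><head><title>Sample Page</title>
<meta name="description" content="An example meta description."></head>
<body><h1>Hello</h1></body></html>"""

parser = SEODataParser()
parser.feed(sample_html)
print(parser.title, "|", parser.meta_description)
```

Feeding one such parser per crawled page and writing the results out with the `csv` module would produce the spreadsheet the question describes; dedicated crawlers add politeness delays, robots.txt handling, and duplicate detection on top of this.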