Keyword Suggestions Tool & Different Subdomains
-
Hey all,
I was reading Dan Shure's brilliant post on the Keyword Planner and decided to plug a few of my own pages into the URL-suggestion tool as well. What I got back was nothing short of strange.
After plugging in our Features page (which describes our social media contesting platform) and getting back a bunch of suggestions related to Dr. Seuss and interior design scholarships, I realized that the Keyword Suggestions tool was being thrown off by our subdomains.
I looked for precedent on this particular issue, but I may not be searching with the right terms. Could anyone offer insight into: whether this might affect how spiders see the content on Strutta.com; whether it affects only the Keyword Suggestions Tool or actual SERP rankings as well; and, if this is already covered elsewhere on Moz, a link to that content?
Much obliged
-
Thanks for getting back, Dan! Doing these checks definitely satisfies me that it's a non-issue. Content Keywords gives me "facebook", "strutta", "contest", etc.
It's an interesting bug with AdWords, though - Google should consider doing subdomain-specific "research", since in the case of SaaS businesses especially, that would make a lot more sense.
Cheers,
-Danny
-
Hi Danny - sorry for the delay here, I was on vacation. But let me see if I can help.
This is very intriguing. I think the best way to confirm or deny whether this is an actual issue is to go to Webmaster Tools -> Google Index -> Content Keywords. Webmaster Tools will of course be registered for just www, and in theory should not take any of the subdomains into account. So if a subdomain were adversely affecting the main domain, those erroneous keywords would probably show up in WMT.
Another way to check is the related: search operator - when I run it on Strutta's root domain, I get sites that make sense: other social media, Facebook, and contest sites. A straight branded search for "strutta" looks pretty strong too; everything is relevant.
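For anyone reading along, those two checks look like this typed straight into Google (using Strutta's domain, as above):

```
related:strutta.com
"strutta"
```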
Lastly, you could look at "impressions" under Webmaster Tools for the www site - if www pages are "ranking" for those extraneous keywords with random pages, that could be a signal of something weird.
But I do think this is probably more a question of how the AdWords tool works - which, to be honest, I'm not exactly sure about, but it's probably a big combination of domain content, page content, anchor text from links, etc.
Hope that helps!
Related Questions
-
Issue with GA tracking and Native AMP
Hi everyone,

We recently pushed a new version of our site (winefolly.com), which is completely AMP-native on WordPress (using the official AMP for WordPress plugin). As part of the update, we also switched over to HTTPS. In hindsight, we probably should have pushed the AMP and HTTPS changes in separate updates.

As a result of the update, traffic in GA has dropped significantly despite the tracking code being added properly. I'm also having a hard time getting the previous views in GA working properly. The three views are: Sitewide (shop.winefolly.com and winefolly.com), Content only (winefolly.com), and Shop only (shop.winefolly.com).

The sitewide view seems to be working, though it's hard to know for sure, as the traffic seems pretty low (around 10 users at any given time) and I think it's mostly just picking up the shop traffic. The content-only view shows maybe one or two users, and often none at all. I tried a bunch of different filters to track only the main site's content views, but in one instance the filter would work, then half an hour later it would revert to showing no traffic. The filter is set to custom > exclude > request URI with the following regex pattern:

^shop.winefolly.com$|^checkout.shopify.com$|/products/.|/account/.|/checkout/.|/collections/.|./orders/.|/cart|/account|/pages/.|/poll/.|/?mc_cid=.|/profile?.|/?u=.|/webstore/.

Testing the filter strips out anything not related to the main site's content, but when I save the filter and view the updated results, the changes aren't reflected. I did read that there is a delay in filters being applied and that only a subset of the available data is used, but I just want to be sure I'm adding the filters correctly. I also tried setting the filter to predefined > exclude > host equal to shop.winefolly.com, but that didn't work either. The shop view seems to be working, but its tracking code is added via Shopify, so it makes sense that it would continue working as before.

The first thing I noticed when I checked the views is that they were still set to HTTP, so I updated the URLs to HTTPS. I then checked the GA tracking code, which is added as a JSON object in the Analytics setting of the WordPress plugin. Unfortunately, while GA seems to be recording traffic, none of the GA validators seem to pick up the AMP tracking code (added using the amp-analytics tag), despite the JSON being confirmed as valid by the plugin.

This morning I decided to try a different approach and add the tracking code via Google Tag Manager, as well as adding the new HTTPS domain to Google Search Console, but alas, no change. I spent the whole day yesterday reading every post I could on the topic but was not able to find a solution, so I'm really hoping someone on Moz will be able to shed some light on what I'm doing wrong. Any suggestions or input would be very much appreciated.

Cheers,
Chris (on behalf of WineFolly.com)
Technical SEO | winefolly
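(For reference: on AMP pages, Google Analytics runs through the amp-analytics component rather than the usual analytics.js/gtag snippet, which is likely why standard GA validators fail to detect it. A minimal sketch of that setup is below - the UA-XXXXX-Y property ID is a placeholder, and the exact JSON will depend on the plugin's settings.)

```html
<!-- In <head>: load the amp-analytics component -->
<script async custom-element="amp-analytics"
        src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>

<!-- In <body>: minimal Google Analytics pageview tracking -->
<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": {
      "account": "UA-XXXXX-Y"
    },
    "triggers": {
      "trackPageview": {
        "on": "visible",
        "request": "pageview"
      }
    }
  }
  </script>
</amp-analytics>
```

-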
Subdomain Severe Duplicate Content Issue
Hi,
A subdomain for our admin site has been indexed, and it has caused over 2,000 instances of duplicate content. To fix this issue, is a 301 redirect or a canonical tag the best option? For example:
http://www.example.com/services
http://admin.example.com/services
Really appreciate your advice. J
Technical SEO | Metricly-Marketing
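(For reference, the two options from the question, sketched against the example URLs above. Note that an admin subdomain is usually better kept out of the index entirely - e.g. via noindex or robots rules - but if the pages must stay reachable, a canonical tag consolidates the signals, while a 301 removes the duplicates outright.)

```html
<!-- Option 1: on each http://admin.example.com/ page, point the canonical at the www version -->
<link rel="canonical" href="http://www.example.com/services" />
```

```apache
# Option 2: 301 redirect the whole subdomain (a sketch; assumes Apache with mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^admin\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

-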
Dynamically serving different HTML on the same URL
Dear Mozers,
We are creating a mobile version of a real estate website, and we are planning to dynamically serve different HTML on the same URL. I'm a little confused about the on-page optimization for the mobile version. The desktop version's pages have a lot of text content, and I strongly believe that is what made us rank for various keywords. If I'm creating this mobile version, do I need to serve all of the same text content on the mobile version too? I found zillow.com using the same method: their desktop version has a lot of text content, while their mobile version is clean without any text. Does this affect the site's SEO in any way? Please help and share your thoughts. Riyas
Technical SEO | riyas_
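(For reference: with dynamic serving - different HTML on the same URL - Google's guidance is to send a Vary: User-Agent response header so crawlers know the markup differs by device. A rough sketch of such a response:)

```
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Vary: User-Agent
```

-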
How to target long tail keywords
Apologies if this has already been answered; I'm a newbie. I'm looking to determine how to target long-tail keywords. I'm supporting a new site with very small budgets in the very competitive life insurance market. If we devise a list of long-tail keywords, should we then add those to the page titles, meta descriptions, etc., rather than the short-tail ones that the site will never rank for anyway?
Technical SEO | aoifep
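(For reference, a long-tail phrase is typically worked into the title tag and meta description of the page that targets it. A hypothetical sketch - the phrase and "Example Insurer" below are made up; substitute terms from your own list:)

```html
<title>Over-50s Life Insurance with No Medical Exam | Example Insurer</title>
<meta name="description"
      content="Compare over-50s life insurance policies with no medical exam required. Get a quote online in minutes.">
```

-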
How many keywords should I target?
Hi there, I'm looking for advice from the community on how many keywords to target. What are the pros and cons of: (1) focusing on the 40 keywords that we already rank for, with specific attention paid to those where we are on pages 2-5; or (2) spreading our link building / onsite optimisation work a little further and continuing to target all 280 keywords on our list as and when they are appropriate to target. I'd love to hear what strategies people recommend. Thanks
Technical SEO | HeatherBakerTopLine
-
Keyword Cannibalization?
I am not quite sure I totally understand the concept of keyword cannibalization. I have seen the SEOmoz snowboard example and tried to apply the concept, but the on-page grader still sees a category page of mine as having keyword cannibalization. By the way, I still get an A for the targeted keyword. I have an e-commerce site; one category page targets 'wool sweaters', and a product page, for example, is 'chunky-knit wool turtleneck sweater' (there are 8 products total in this category, and all are flagged as cannibalizers). I didn't think keyword cannibalization would be an issue, but ranking seems to be affected, judging by rankings for other category pages without cannibalization issues. So, my question is: is keyword cannibalization really a big deal? What is taken into account when judging it - title tags? URLs? Thanks in advance
Technical SEO | IOSC
-
Multiple pages - Similar keywords
I'm working on a site with a parent page and two minor pages, all dealing with the primary/root keyword "log siding". How do I optimize all three pages without bastardizing the primary keyword? The parent page targets the keywords half-log siding and log siding; the child pages (linked from the parent) target cedar log siding and pine log siding. They all feature "log siding" and grade well for that keyword (as well as their own long-tail keywords), yet based on my rank tracking I think Google is unhappy with multiple pages all (seemingly) focused on log siding. Any ideas how I can effectively target all the long-tail keywords within their respective landing pages and not draw a penalty from Google toward my parent page and the root keyword? Thanks, Bill
Technical SEO | Marvo
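(For reference, one common approach is to give each page a title tag built around its own variant, keeping the root phrase but leading with the modifier. A sketch using the keywords from the question - "Example Co" is a placeholder:)

```html
<title>Half-Log Siding and Log Siding Options | Example Co</title>  <!-- parent page -->
<title>Cedar Log Siding | Example Co</title>                        <!-- child page -->
<title>Pine Log Siding | Example Co</title>                         <!-- child page -->
```

-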
Ecommerce SEO with 130 keywords
Hello everyone,
My name is Davys, and I'm what you call a newbie, so the question may sound stupid, but here we go. In this campaign I will be targeting around 130 to 150 keywords for my store. So here is the technical question: what is the right way of targeting 150 keywords? Should I attempt to have all 150 pointing to www.mysite.com, or should I break it down into smaller pages, if I can put it that way - like www.mysite.com/pages/bikiniwax, for example? Even if I break it down, what is the right number of keywords I should optimize per page? Or should I do 150 pages? Please help! 🙂 Thanks a lot for your help.
Technical SEO | Davys
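(For reference, the usual pattern is one page per tight keyword group rather than 150 keywords on one URL or one page per keyword. A hypothetical sketch of that mapping, extending the example URL from the question:)

```
www.mysite.com/                       -> brand terms + a few broad head terms
www.mysite.com/pages/bikini-wax       -> "bikini wax" plus close variants
www.mysite.com/pages/<next-category>  -> its own group of 3-5 related phrases
```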