Who's suggesting anything?
I was merely asking for some information about blog networks, and how (if at all) they may be utilized for SEO...
Thanks. So the idea is that you apply to a network and if they let you in, what happens next? The article, though useful, didn't explain too much about how one can take advantage of blog networks in one's niche...
I read on a forum recently that someone was using blog networks as his primary tool for SEO. I'm not sure what he meant beyond this, however...
Can anyone provide tips on how I might use one, and which are some of the best?
Thank you very much in advance.
Ok, thanks. I get the overall point you're making, and will indeed focus more on link quality... I guess my gripe is that if the # of linking root domains (for example) is to be counted at ALL - even as a corollary ranking factor - then it would be really good to be sure of the accuracy of the data, which is hard to be when Majestic, for example, presents such different numbers in each case.
Hi Cyrus,
Thanks for this. I understand that counting links accurately is hard work, but if I take this correctly, the above metrics are defined identically for SEOMoz and Majestic, which is a real head-scratcher, since it would suggest an enormous discrepancy in the data being presented, for almost every keyword link I have tested this on. To take just one other (basically random) example: "how to remove rust stains"
8th down in the first SERP is:
http://www.apartmenttherapy.com/la/how-to-remove-rust-stains-from-stainless-steel-home-hacks-108429
Which returns either THIS data (Linkscape):
Root Linking Root Domains (total # of unique domains with backlinks pointing to this specific domain?): 15,471
Page Linking Root Domains (total # of unique domains with backlinks pointing to this page?): 91
Or this data (Majestic):
RDD (same definition as RLRD, above): 71,152
RDP (same definition as for PLRD, above): 6
I encourage you to register for Majestic's services and try this out for yourself if this is news to you. In the absence of further knowledge of the specific methodologies used by these two link-measuring tools, if one were otherwise to consider them both equal, I don't see how any use could be made of this data, since it is so vastly different in each case. I still can't help feeling that I'm missing something...
Thank you again for your input on this - it's much appreciated.
Best,
Zak
I also use Market Samurai, and I've noticed what seem to be big discrepancies between the keyword data it presents (sourced from Majestic SEO) and the Keyword Difficulty Tool.
To take just one example, I analyze the term "how to remove tea stains". In the Keyword Difficulty Tool, this returns the following:
Root Domain Linking Root Domains: 2,233
Page Linking Root Domains: 4
When I use Market Samurai, however, the data returned is:
RDD (Domains linking to this domain): 19,911
RDP (Domains linking to this page): 19
I thought that these two metrics were the same for both tools, but I've written them out in case someone sees a difference. As I say, Market Samurai data is sourced from Majestic SEO - a reputable SEO company - but I have no idea where the Keyword Difficulty Tool data is from, nor why the differences are so pronounced. Are they indeed the same metrics in both cases, or am I missing something?
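To put rough numbers on the gap, here is a quick sketch using the figures quoted above (the dictionary labels are mine, just for illustration - the ratio calculation itself is plain arithmetic):

```python
# Compare the two tools' counts for "how to remove tea stains"
# (figures as reported above; the metric labels are my own shorthand).
kdt = {"domain_linking_root_domains": 2233, "page_linking_root_domains": 4}
ms = {"domain_linking_root_domains": 19911, "page_linking_root_domains": 19}

for metric in kdt:
    ratio = ms[metric] / kdt[metric]
    print(f"{metric}: KDT={kdt[metric]}, Market Samurai={ms[metric]}, "
          f"ratio ~{ratio:.1f}x")
```

So Market Samurai reports roughly 9x more linking domains at the domain level and nearly 5x more at the page level, which is what makes me doubt the two tools are measuring the same thing.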
Any insight would be much appreciated.
Thanks for this, but I was referring specifically to the search query aspect. What does it mean, search-wise, if "removing blueberry stains" receives 10,000 hits a month? Does it mean that there have been that many search queries (and ONLY that many, as a ballpark) which contain all of the words "removing," "blueberry" and "stains" in any particular order, and among any number of other words?
Exact match is perfectly easy for me to get my head around - broad match, not so much! Take the phrase "removing blueberry stains." Is the broad match data for this that I'm seeing in the Google keyword tool for searches that involve all of these particular words, in any phrase, in any order - just so long as they're all there?
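To pin down what I mean, here is a minimal sketch of broad match as I understand it - every word of the phrase present somewhere in the query, in any order, among any other words. This is just my reading, not Google's actual matching logic (which I believe also pulls in synonyms and related terms):

```python
def broad_match(phrase: str, query: str) -> bool:
    """My working definition: a query broad-matches a phrase if
    every word of the phrase appears in it, in any order."""
    phrase_words = set(phrase.lower().split())
    query_words = set(query.lower().split())
    return phrase_words <= query_words  # all phrase words present

queries = [
    "removing blueberry stains",                       # exact order
    "stains blueberry removing",                       # reordered
    "tips for removing tough blueberry juice stains",  # extra words
    "removing blueberry jam",                          # "stains" missing
]
for q in queries:
    print(q, "->", broad_match("removing blueberry stains", q))
```

Under that definition the first three queries would count toward the broad-match volume and the last would not - is that roughly right?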
Any help with this concept would be much appreciated.
Thanks Cyrus, you've been incredibly helpful.
Thank you, I'll be sure to check out your video. Very final question then: If we're saying "askmen" is a keyword rich domain, could the same be said for "manism," for a site dealing with "men's issues"? i.e. Google recognizes the "man" in there?
Thanks, Cyrus. Let me just ensure I understand correctly:
1. "The Domains are exact match for their brands." Isn't that nearly always the case? Or are you referring specifically to a highly relevant keyword being present in the brand name (in which case, we're saying Google can detect the word "men" within "askmen" and give "credit" for it?)
2. Understood.
Beyond both of these, I'm still curious why 90%+ of inbound links should be brand name or close variants. You wrote:
"In the end, you want anchors closely associated with the keyword topics you are trying to rank for."
...Yet again, it appears that a 90%+ proportion of SEOMoz's inbound links are ALSO brand name, or close variants. The takeaway I'm getting is that this ~90/10 split is what I should be focused on in order to achieve ranking success. So I guess my final question is, is it wrong to think that?
Thanks for the great answer. I am still a bit perplexed, however, and here are those examples to clarify:
Just taking the first example - If you plug that in, you'll notice that under the "Anchor Text" tab, almost all of the keywords with an even vaguely significant number of links to them are brand-name or a close variation of this. There is the occasional "cars on askmen," or "fashion on askmen," but even these hardly seem to be very carefully selected.
Askmen is an enormous site with a PR of 7. Can this really not be taken to imply that brand-name links are significantly more (or at least, no less) effective than targeted keyword links, and one's campaigns should thus be heavily weighted in their favor?
Thanks again for your help, and thanks Cyrus.
I reported this as a bug in OSE, because often I explore these links and find that the pages include both a brand-name link AND a regular keyword link, but for some reason OSE was only reporting the brand-name link...
This led me to wonder how many links this occurred for, and therefore whether or not to trust the fact that the majority of the sites I ran OSE on returned at least 90% (in most cases, more) brand-name links.
I understand that brand-name links are amongst the most important to obtain, but that it's also important to get anchor text for keywords to build a varied profile. Given this apparent flaw in OSE, is it wrong - in the case of very successful sites - to take this ~90% as being anywhere near the correct percentage of brand-name links that I should be aiming for as a proportion of the total profile?
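Just to make concrete how I'm tallying that ~90% figure, here is the kind of count I'm doing. The anchor list below is a made-up toy sample (an OSE export would be the real source), and the simple substring test for "brand" is my own assumption:

```python
from collections import Counter

# Toy sample of anchor texts (hypothetical, for illustration only).
anchors = [
    "askmen", "askmen.com", "AskMen", "askmen", "www.askmen.com",
    "askmen", "askmen.com", "askmen", "askmen",
    "cars on askmen",     # brand + keyword
    "fashion on askmen",  # brand + keyword
    "dating tips",        # pure keyword anchor
]

def is_brand(anchor: str, brand: str = "askmen") -> bool:
    # Crude assumption: any anchor containing the brand name counts as brand.
    return brand in anchor.lower()

counts = Counter("brand" if is_brand(a) else "keyword" for a in anchors)
brand_pct = 100 * counts["brand"] / len(anchors)
print(counts, f"-> {brand_pct:.0f}% brand anchors")
```

On a sample like that, the split comes out at roughly 90/10 in favor of brand anchors, which matches what I'm seeing on the real sites.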
Extra Credit :)... And this may potentially help resolve the issue: does the "Inbound Links" tab in OSE just report links to the Root Domain, or to that and every other page on the site?
I know that Google prefers a varied backlink profile, and so it's ideal to get both - but I wanted to know, are followed backlinks from blog comments, forum posts etc. (i.e. the low-hanging fruit) weighted significantly lower by Google than links appearing within the body of a page, for example? If so, is it possible to quantify by how much?
OK. So my takeaway from this is that it's probably best (since it's so much easier) to go mostly for brand-related anchor text, other than perhaps in cases where I'm really trying to make a push on a particular page? If that's correct, then my last question is: are brand anchor text links to a particular page thought to be weighted considerably less by Google than page-title anchor text for that particular page?
I'm launching a new stain removal website, and wanted to know what would be considered the best way to organize the content? Since most articles will roughly involve "removing X from Y" or "how to remove Z," I can see two ways:
1. Organize articles by Stained Items, Stain Agents and perhaps Cleaning Detergents.
2. Spread the categories out more, and try to group stained items according to broader categories... E.g. Hard surfaces, delicates, fabrics, ceramics etc.
Any thoughts on which of these two might be the best way to organize the site, or are there any better suggestions? Not sure what the main considerations are here... Either of these two seems equally user-friendly.
This seems to make a lot of sense, but then I look at the link profile with OSE of a site like Askmen - in the top 500 most-trafficked sites in the U.S. - and over 95% of their 10,000 or so backlinks are brand name... There are very few that appear to be targeting specific keywords at all. I know one counterexample isn't much, but this was literally the first site of that size that I looked at. If Google had changed its criteria to those you suggest, could one not expect a site like this to be massively penalized?
I'm looking at a lot of competitor's sites, and they only seem to have gone after root domain anchor text in their link-building campaigns. Since I am just launching and am essentially a one-man band (with some hired help), is it worth my while to attempt to optimize individual keywords or pages at this point, or should I just do as they have done, and try to get domain-name links wherever I can?
For that matter, should I spend more time going for the low-hanging fruit of followed blog comments, forum posts etc, or emailing influencers to try and get editorial links?
Sorry if that last one is a bit broad, and thanks very much in advance for any and all help.
By keyword optimization I just meant looking at the traffic based on a smaller number of articles and making better future selections based on which keywords were performing the best, etc.
This was a nice answer, thanks. I was curious how you see RSS syndication helping to ensure articles are indexed as quickly as possible?
I'm building a content site (the model is AdSense revenue) around a certain niche, and I'm currently paying for about 6 articles to be contributed per week. I have the capacity to be paying for a lot more articles, however, so I'm wondering what, if any, factors exist to recommend building the site up slowly as opposed to throwing on e.g. 100 articles over the next week? Those I can think of are:
1. Going slowly leaves room for better keyword optimization etc.
2. Google seems to favor aged domains/content, so 100 good articles now certainly isn't as advantageous as 100 articles 2 years from now.
All that being said, I still feel like the benefit in terms of traffic of adding more content now - since I can - might outweigh these considerations. Does anyone have any thoughts?
Bah, I haven't read the white paper yet, but now I'm even more confused! Both previous answerers veered towards "use the keyword difficulty tool, go for keywords you think you have a good chance of hitting the first SERP for," whereas this study would suggest doing otherwise might not be a bad strategy. Hmm.
I operate a stain removal website and was wondering how consistent it was worth being from title tag to title tag. To give you an example, here is a group of keyword phrases that I might wish to target:
"getting out pet stains with vinegar"
"how do I remove water stains from wood"
"removing chocolate stains"
Does the benefit to be gained (whatever that might be) from making these consistently of the form "how to remove X from Y" or "how to remove X" outweigh simply giving articles titles based on the exact phrases above?
I heard from someone that Google is getting more proficient at spotting "clumsy" title tags, although I'm not sure if any of the above examples would fall into that category, and was thinking that I should then probably proceed on the basis of directly titling articles based on the exact keywords I am uncovering...
Any advice much appreciated.
I understand that the scores it generates are essentially based on the difficulty of appearing on the first SERP for the keyword in question. That said, I am having a lot of difficulty finding keywords in my niche which return a score that would make this easily achievable for a site of my size....
The reason I'm pointing this out is because theoretically, a keyword could have a HIGHLY competitive first SERP, with a significant drop-off on the second SERP, which would make achieving a top ranking on that page substantially easier. So my question really is, is the importance of appearing on the first SERP so unequivocally important that it is a pointless activity to attempt deliberately to rank for keywords on the second SERP, which is ignored by the keyword difficulty tool?
I know the breakdown of clicks goes something like 40% for top spot, 12% for second and downwards from there, but if a certain query has over a million searches per month, for example, it would still be possible to get considerable amounts of traffic by trying to rank highly on the second SERP, which the keyword difficulty tool cannot help with. So is this really a useless activity?
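To show why I think page two might still be worthwhile, here's the back-of-envelope arithmetic. The 40% and 12% figures are the ones I quoted above; the combined page-two CTR (I've assumed 1% across positions 11-20) is purely a guess for illustration:

```python
# Back-of-envelope traffic estimate for a high-volume query.
# 40%/12% are the rough CTRs mentioned above; the page-two figure
# is an assumed value, not a measured one.
monthly_searches = 1_000_000

ctr_position_1 = 0.40
ctr_position_2 = 0.12
ctr_page_two = 0.01  # assumed combined CTR for the whole second SERP

print("Rank #1:", int(monthly_searches * ctr_position_1), "visits/month")
print("Rank #2:", int(monthly_searches * ctr_position_2), "visits/month")
print("Page 2 :", int(monthly_searches * ctr_page_two), "visits/month")
```

Even at that assumed 1%, a million-search query would still throw off ~10,000 visits a month from page two - which is why ranking there doesn't feel obviously pointless to me.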
It helps with the theory, but some more specificity would help. Are you saying 40-50 would be considered high demand/difficulty phrases? The problem is that I'm not finding much in the niches I'm targeting at around keyword difficulty 30...
I know that keywords with scores in the 20s or 30s would be ideal, but it's proving hard for me to find relevant keywords with such scores (just a couple with scores in the 30s). Is going for words between 40-50 a waste of time?
Thanks.