Wowsers... nice response.
Thanks for spending the time on writing all that out
I'll re-work the site structure, taking all comments on board!
Is it a problem if Google crawls this thread?
Thanks for your insight into this!
Interesting.
Why did you search for: "Find event ideas, wedding entertainment or party ideas" ?
When I do it, I get the same result, and also the Google message about some results being omitted.
Maybe all those pages are considered "keyword stuffed" by Google?
You said you noticed one of them had a page. How do you mean?
There is a landing page for weddings:
http://www.superted.com/wedding-entertainment.php
But no external links to it yet.
I have a site in the UK, with these metrics:
The above metrics are better than those of 2 competitors who have only just launched, have fewer links, lower DA, etc., yet those competitors are ranking for keywords like "wedding entertainment" and "corporate entertainers".
Our site is www.superted.com
Why doesn't OSE show 33,000 links when I select "links to this root domain"? Why are those links only shown when I do a domain comparison in OSE?
I'm thinking our anchor text distribution is the problem. We have very few links containing the keywords we want. Maybe our sitemap is also an issue, as it only lists 500 of our pages? But given our domain age, DA, etc... surely anchor text / sitemap isn't the only issue here?
Any hints would be awesome.
I've never come across any stats, but logic (well, my logic at least) and everything I hear from others suggest that, if anything, it's increasing.
People are getting better and better at searching for specific things online.
But... it's ultimately up to Google. Google may decide that your search query, "where to get a Hawaiian pizza 24 hours a day in New York", is very specific, but focus the search results on shorter phrases like "24 hour pizza".
In Webmaster Tools you can "Fetch as Googlebot", meaning you can enter one of those 77 URLs and see what Googlebot sees when it requests that URL.
You can also use:
http://www.dnsqueries.com/en/googlebot_simulator.php
For the URL: http://www.in2town.co.uk/Entertainment-Magazine
the Google Bot Simulator says:
HTTP CODE = HTTP/1.1 301 Moved Permanently
Location = http://www.in2town.co.uk/Showbiz-Gossip
and for: http://www.in2town.co.uk/Weight-Loss-Hypnotherapy-helped-woman-lose-3-stone
HTTP CODE = HTTP/1.1 301 Moved Permanently
Location = http://www.in2town.co.uk/Weight-Loss-Hypnotherapy
Interestingly, both of the NEW URLs work fine, although http://www.in2town.co.uk/Weight-Loss-Hypnotherapy doesn't look too good (at least in my web browser), but that's another issue.
You have a fairly complex .htaccess file (hint: I found your OLD .htaccess file - you should delete old .htaccess files, or block access to them, so people can't read them via a web browser), so I'm guessing the problem lies within your .htaccess rules.
If possible, swap in a plain and simple .htaccess file, test with Google Webmaster Tools, and see if the error persists.
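If you want to test those redirects yourself while you experiment with the .htaccess file, here's a minimal Python sketch that follows a redirect chain hop by hop (the Googlebot-style user-agent string and the hop limit are just illustrative choices on my part):

import requests

def redirect_chain(url, max_hops=5):
    # Follow redirects manually so each hop's status code and Location
    # header are visible, using a Googlebot-style user agent.
    headers = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}
    hops = []
    for _ in range(max_hops):
        resp = requests.get(url, headers=headers, allow_redirects=False)
        hops.append((url, resp.status_code))
        if resp.status_code in (301, 302) and "Location" in resp.headers:
            url = resp.headers["Location"]
        else:
            break
    return hops

for hop in redirect_chain("http://www.in2town.co.uk/Entertainment-Magazine"):
    print(hop)  # e.g. the 301 hop, then the final URL with its 200

Run it before and after you swap the .htaccess file and compare the chains.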
Adam
I know this question has been asked many times in this forum but I still can't work it out.
Why does this link:
which shows all external links to pages "on this sub domain", reports 1,935 external links, but this link:
which is exactly the same except it shows only followed + 301 links, says "showing 1 - 50 external links" but won't show the total link count (I know the mouse-over on the question mark says it won't show the total, but I don't understand why it can't, when it could show the total when I requested "all links" instead of just "followed + 301" links),
yet it actually lists 700 links (14 pages, 50 results per page). I know the link list is limited to 25 links per domain, but that means you can NEVER know the total link count unless you download the full report.
This makes using OSE to determine link counts (internal, external, or otherwise) impossible.
And for anyone who uses the API: why does the API (external + followed) return 1,451 links?
I'm sure this is an ongoing issue, with people trying to get their heads around all of this; I've never really been able to.
Any insight would be much appreciated!
I did some correlation testing of the "SEMRush Rank" metric and, across 12,000 keywords, I couldn't find any correlation between their metric and SERP positions. I've quizzed them on what the metric actually is and got very little information. They say it's based on Adwords competition, which would mean it's not related to organic positions, but I couldn't find any correlation between the metric and paid ad positions either. So in my opinion SEMRush can't help you there, although their suite of tools is awesome.
The SEOmoz metric, "keyword difficulty", is the best I've found. It sucks that you can only enter one keyword at a time (there's no API access for that metric either) and it's fairly slow to return results. So if you want some kind of "difficulty" or "competition" metric across hundreds of keywords, then as far as I know there are very few options.
LongTail Pro have added a "difficulty" metric to their downloadable software, though I'm not a fan of the price / feature ratio and I haven't tested how well it performs, or whether it's an absolute metric or a relative (to your list of keywords) metric.
Sean mentioned Raven Tools. They have plenty of great tools, pulling data mostly from SEOmoz and Majestic SEO. From memory they don't have a "difficulty" or "competition" metric, though I could be wrong.
I ended up writing my own software to get "difficulty" metrics across an entire set of keywords. I don't sell it or advertise it, but I am looking for people to help me improve it through testing / feedback. The benefit, of course, is that you get to use it. Shoot me a message if you're interested.
Adam
For me, when I hire someone to do "backlinking", it's about links from external websites pointing to the client's site.
I prefer the term "link building" as I think it describes more accurately what I want done.
The link profile on the actual client's site (internal links) is something I usually handle myself. It's more "on site" / "on page" optimisation (SEO) as far as I'm concerned.
So I think "backlinking" is external link building and then there's internal link building or on-page SEO.
Not sure why you're asking but if it's because of some confusion between you and a contractor or you and a client, there's no dictionary definition and so clarification is always required when using these terms.
I'm actually not entirely sure how much access Google has to this kind of information. Over the last few years there's been ongoing debate about exactly what Google can see in terms of Facebook data, but it's clear that Facebook (social) signals do seem to correlate with rankings.
Just to clarify: people can't "like" a website (well, they can like the home page, but that's just the home PAGE, not the web SITE), so you can have a "like" button on every page of your website.
You can also encourage people to like your facebook page.
And as an aside, "like" and "recommend" are the same thing. "Share" takes a little more effort from the user and I don't think many people use "share" any more, though I'm not too sure.
But why not have both? Put a "like" button on all key pages of your website and also encourage people to like your Facebook page.
Someone liking your web page means it'll appear in their news feed, their friends will see it, and then that's it. Liking a Facebook page means that person sees all activity on your Facebook page.
I know you're asking about SEO / Google / Social Signals but I thought it was worth clarifying the above as I'm sure while you want better rankings, you also want more traffic to both your site and your facebook page.
You can associate your website with your Facebook page and, using Open Graph, post updates to that page; they'll appear in the feeds of anyone who has liked your Facebook page.
As for which is better? I'd like to see if someone actually has an answer to that. I haven't been able to find specific information on exactly what Google sees, what it indexes, and what social signals it takes into account.
Adam
Well, Spyfu is awesome, no doubt about that... except...
I'm in Australia. Spyfu only covers the US and UK. There are quite a few tools that simply don't cater for Australia.
My software caters for nearly 30 countries, and most major cities in those countries.
Sure, SERP results will change on a city-by-city basis, but that's a problem any tool is going to have. If the tool uses the Google API, then the SERP results from the API can differ (sometimes vastly) from what an actual browser searching on Google returns.
I'm not sure how to overcome the geolocation issue entirely. I don't think it's possible.
But otherwise, what I find frustrating with all these tools is that they give a bunch of data but don't give any answers. Spyfu shows me historical CPC, which is fantastic, but it means I'm left to use that data to answer my question, which is really simple: "which keywords are the most profitable based on ALL data available?"
The Kombat function in Spyfu does what the SEMRush "comparison" tool does - it shows which competitors share keywords. For a seasoned SEM specialist this is great, because he / she can interpret all that data and make customized suggestions to their clients. But again, for me... I ran a business. I just needed an answer.
It's about having something that summarizes all the data and presents options to me.
Maybe I'm over-simplifying it, but what I wanted was to give my system my website and 5 keywords; it would return 1,000 keywords that might be suitable, automatically collect every key metric for every competitor across all 1,000 keywords, and then rank those keywords from best to worst based on all the back-end mathematics.
I created an infographic to help explain the process a bit.
Hi Sean,
It's software I've written that collects all the metrics... domain age, page rank, home page rank, backlinks, CPC, etc, etc.
I would actually collect domain and page authority as well, and for a while even the "keyword difficulty" metric from SEOmoz, as I trust their metrics a lot and it's good to see whether my results are similar to theirs.
As mentioned to David above, it could all be automated but at the moment there's a bunch of manual tinkering I do after my software collects all the data.
At the same time, full automation wouldn't be possible, as the client needs to be involved in filtering out the irrelevant keywords, picking which keywords are "education" or "purchase" keywords, ignoring keywords with fewer than x searches per month, etc.
However I have developed nice little online modules for these steps that help do this quickly. So you can group the keywords, and filter by various methods, and make mass-changes (like deleting groups of keywords) with one click.
I think if those steps are made really simple and quick then people will be happy to be part of the process. The result will be a more relevant list of search terms in the final ranked list.
If you already have a website for which you're confident you've got an awesome keyword list, I'd love to know if my system yields similar results.
And if it doesn't yield similar results, I hope people would be interested in breaking down why; I'd make modifications, and so on. It would be fun (but then again, I have few friends!)
Adam
Hi David,
It's software I've written. But it's an application that runs on PC / Mac.
The client needs to be involved though. I need to be given the basics, then my software compiles all the keywords and gets monthly search volumes; the client then needs to remove the irrelevant ones, after which my software does all the research, gathering absolutely everything.
It's a massive amount of data normalization and calculation, with the result being the most profitable keywords ranked from 100% down to 0%.
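As a rough illustration of the kind of calculation I mean (this is NOT my actual formula - the metric names and weights below are made up for the example - but the idea is min-max scaling each metric and combining them in a weighted sum):

def rank_keywords(keywords):
    # Hypothetical metric names and weights, purely for illustration.
    weights = {"volume": 0.5, "cpc": 0.3, "competitor_da": -0.2}
    lo = {m: min(k[m] for k in keywords) for m in weights}
    hi = {m: max(k[m] for k in keywords) for m in weights}
    for k in keywords:
        # Min-max normalize each metric to 0..1, then weight and sum.
        k["score"] = sum(
            w * (k[m] - lo[m]) / ((hi[m] - lo[m]) or 1)
            for m, w in weights.items()
        )
    best = max(k["score"] for k in keywords)
    worst = min(k["score"] for k in keywords)
    for k in keywords:
        # Rescale so the final list runs from 100% (best) down to 0% (worst).
        k["percent"] = 100 * (k["score"] - worst) / ((best - worst) or 1)
    return sorted(keywords, key=lambda k: k["percent"], reverse=True)

ranked = rank_keywords([
    {"keyword": "wedding entertainment", "volume": 8000, "cpc": 1.20, "competitor_da": 55},
    {"keyword": "corporate entertainers", "volume": 2500, "cpc": 2.00, "competitor_da": 40},
    {"keyword": "party ideas", "volume": 20000, "cpc": 0.40, "competitor_da": 70},
])

Negative weights penalise metrics where "more" is worse (like competitor DA).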
The client can get involved again and tinker with some settings to get the list of ranked keywords tailored to their own preferences.
For the parts where the client would need to be involved (initial info, clean up the keyword list, tinker with the results), I've programmed those steps in an online interface so I can have people test this process.
I could have a sign-up form for beta testers, but really, anyone who's an SEM professional and already has a list of keywords for a particular website is someone I'm interested in talking to. I want them to compare what my system / process outputs to what they already have.
So it doesn't really require an online form... just an expression of interest with a 1 minute bio and I'll pick a few people. At the moment I have to trigger each stage of my software to do its thing, clean up some of the results, etc. - it's not fully automated, as I never had any intention of making it publicly available. So I want to make sure I'm spending my time doing all this for someone who is REALLY interested and can provide some quality feedback.
If you're keen, let me know
Adam
If it ever happens again, rush to your client and demand a bonus... before Google fixes it!
I love SEMRush for this. Give it a keyword and it'll show you the top organic and paid listings. Give it a URL and it'll give you organic and paid keywords for that URL.
SEMRush (and all tools) can only give an estimate of the amount of traffic. These tools base their calculations on the competitor's position in the SERP, the monthly search volume of that keyword, and the likelihood of someone clicking on a result in that position.
If you know that x% of people click on the top organic result, and you know how many people search for that keyword per month, then you can get a pretty good idea of what the traffic is going to be like for that URL.
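A minimal sketch of that arithmetic (the click-through rates below are assumptions I've made up for the example, not published figures):

# Assumed click-through rates by organic position (illustrative only).
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_monthly_traffic(position, monthly_searches):
    # traffic ~= search volume x likelihood of a click at that position
    return monthly_searches * CTR_BY_POSITION.get(position, 0.02)

print(estimated_monthly_traffic(1, 10000))  # ~3000 visits/month at position 1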
As for which keywords are converting... you'll need Google Adwords / Analytics for that. You could ask your competitors for access, which would be funny - and obviously refused!
But keep in mind, if you're trying to find the "top" keywords, tools like SEMRush only report the keywords that are in use / working for a given competitor. It doesn't mean there aren't other keywords you should look into - keywords that your competitors aren't ranking for at all. In fact, that's the goal... find the keywords your competitors don't know about (and that tools like SEMRush therefore don't know about).
Adam
Hi all,
I'm relatively new here but not new to the world of SEO / SEM. Over the years I've loved using SEOmoz and other tools but of course have found certain limitations with respect to how I like to work. That's the case with any tool / service.
So over the years I've put together a keyword research / competitor analysis process that has worked well for me and I'm wondering if it might also work for others.
I've spent the last 15 years of my life as a director of a range of companies, mainly in printing but also in systems development, marketing, etc. I spent a large percentage of my time developing systems and tools to help me with my search engine marketing. I've now sold all my companies and I'm semi-retired, somewhat bored, and would love it if I can assist others with the process I've used over the years.
I'm curious to know whether SEM professionals agree with the way my system ranks search terms from "best" to "worst". If you're interested in testing this process and telling me if you think the resulting list of search terms that I come up with for your website is "spot on", "not bad" or "horrible!", then please read on.
My key motivation here is to educate myself as well as others. I'm not charging for any of this...
If you give me your website URL, your top 5 competitors and your top 5 search terms, I will return to you:
you can then...
You can tell me which options you prefer:
I will then:
And you can adjust things to change how the keywords are ranked:
As a result of the keyword analysis, it'll also show you who your organic and Adwords competitors are, based on all keywords or just your top-ranked ("best") keywords.
In that competitor data you can see:
All of this is handled in a simple web interface that I threw together recently. It's really simple, merely asking for your site and preferences, with an interface to view / sort the results.
Interested?
I'd like to hear from any SEM professionals who want to test this process.
Once I have your basic details, I can get a keyword list together simply (using my internal process / software) and then you need to do some basic sorting, particularly if your search terms are in an industry that I know nothing about. Your input will be required.
From there, give me 24-48 hours and I'll return 2 lists of search terms: "organic", and "Adwords". I'd love to hear your opinion about the relevance of the search term lists. I hope it will also spark some interesting discussion and hopefully help people learn a bit more about keyword / competitor research.
If you're interested, please shoot me a private message letting me know why you'd be a good candidate to test this system. I really do want people who are well versed in search engine marketing. So please include a basic "resume" about who you are. If you have an SEM company and that's your main career focus then I definitely want to hear from you.
Adam
Hi Mark,
So you're saying that in the API, "juice passing links" is the equivalent of what OSE calls "followed + 301" links?
I'm a little confused between all the link types.
Internal / External - easy
Linking Root Domain - easy
Followed / Nofollowed - easy
But then there's talk of "juice passing links" and I can't quite get how this is defined, and why it's something you can get from the API but not from Open Site Explorer... or can you get that info from OSE?
This website:
http://www.opensiteexplorer.org/domains?site=www.webuycarsyourway.com.au
is ranking on page one for about 75% of a list of 300 car related keywords.
They don't appear to have done any link building or social media; they appeared out of nowhere in April 2012, and SEMRush reports between 5,000 and 10,000 visitors per month.
The only thing I can see is that they have a landing page for every keyword, with the keyword in the URL. But surely that can't be the only reason they're ranking well?
I received a response in the developers' forum which basically said the input data is scaled logarithmically, rather than the output being linear and then scaled logarithmically.
This means it's impossible to answer the question.
However, I have asked (and repeat the question here for the sake of anyone who's interested)...
What is the distribution of DA values across all sites? It would be nice to know that "the median DA across all sites in our database is x." That would at least put the numbers in some perspective - and perspective is what I'm trying to get.
Can you also confirm whether "keyword difficulty" is calculated with logarithmic inputs? And what's the median keyword difficulty?
No, I'm not seeking something deeper at all.
I don't care how they work out DA / PA.
The result is between 0 and 100, but they say it's logarithmic.
So is it log(x), or log(x+3), or log2(x)...?
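To show why it matters (both formulas below are purely hypothetical, NOT SEOmoz's actual calculation):

import math

def da_v1(signal):
    return min(100.0, 10 * math.log10(signal + 1))  # +10 DA per 10x signal

def da_v2(signal):
    return min(100.0, 20 * math.log10(signal + 1))  # +20 DA per 10x signal

# Under v1, going from DA 30 to DA 40 implies 10x the underlying signal;
# under v2 it implies only ~3.16x. Without knowing the curve, a DA of 30
# can't be interpreted in absolute terms - which is exactly my question.
print(da_v1(1000), da_v2(1000))  # ~30.0 vs ~60.0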
Yes, I think my email was playing up, as I missed a few emails in recent days.
Why is this question marked as answered?
How did that happen?
I absolutely agree that re-inventing the wheel is inefficient when people like SEOmoz have thousands of man-hours invested in developing great metrics... but I use the SEOmoz metrics alongside other metrics for a few reasons:
- To confirm the validity of metrics between providers and my own research
- To customise the kinds of reports I give clients. For example, sometimes a link profile report is more relevant for a customer than a domain authority report. Sometimes both are relevant, etc.
It just sucks when you have to put caveats on data, such as saying that the SEOmoz authority metric is logarithmic but to an unknown logarithmic curve.
I think SEOmoz should publish the logarithmic calculation. I'm not asking for their intellectual property on how they calculate authority or keyword difficulty, etc... I just want to know the logarithmic scaling. Otherwise I'm left asking, "what does 30 actually mean?" In addition, is keyword difficulty logarithmic? SEOmoz doesn't say.
Adam
Hi Ryan,
Thanks, but I'm actually posting here because it's been several days with no response to my ticket.
So I'm not sure what to do.
Many sites send the Last-Modified HTTP header (and honour If-Modified-Since requests). If so, then when looking at a page you want to check, type this into your address bar:
javascript:alert(document.lastModified)
That won't work on all sites, as some may disallow JavaScript execution in that manner. Also, if it's a dynamic site and the owner hasn't configured the Last-Modified header properly, you'll get incorrect dates anyway.
You can check whether a website sends the Last-Modified / If-Modified-Since headers here:
http://www.hscripts.com/tools/if-modified-since/index.php
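Or, if you're comfortable with a little scripting, here's a minimal Python sketch of the same check (the URL is a placeholder - substitute the page you want to inspect):

import requests

url = "http://www.example.com/"  # placeholder URL

# A well-configured site returns a Last-Modified header with the page.
resp = requests.get(url)
last_modified = resp.headers.get("Last-Modified")
print("Last-Modified:", last_modified)

# Re-request with If-Modified-Since; a 304 means "unchanged since that date".
if last_modified:
    resp2 = requests.get(url, headers={"If-Modified-Since": last_modified})
    print("Status:", resp2.status_code)  # 304 = unchanged, 200 = changed or header ignored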
You can use the Internet Archive to look at previous versions of a website (unless they've disallowed this via their robots.txt):
A cool tool I use to alert me when a competitor's (or any website's) content is updated is:
http://www.changedetection.com
Enjoy
Just remember that if you're using the SEOmoz domain authority / page authority, they often update their algorithm, meaning these metrics can jump up or down over time. It may be best to also track (or track instead) other metrics such as linking root domains, social analytics, etc.
What do you mean by "shut down"?
If you can navigate the site, then so can Google. Changing your home page to "we're no longer here" won't help either, as Google can still access the pages "behind" your home page; it already knows they exist, so it'll go directly to them when crawling.
You either need to delete the pages entirely, delete the entire website, or edit your DNS information so your domain name no longer resolves - something that permanently "shuts down" and removes the site.
SEOmoz says their Domain / Page Authority is logarithmic, meaning that lower scores are easier to achieve and higher scores harder.
Makes sense.
But does anyone know what logarithmic equation they use? I'm using domain and page authority as one metric amongst others in my keyword analysis. I can't have some metrics linear, others exponential, and the SEOmoz one logarithmic to an unknown curve.
I've googled and read and I'm still confused on this.
Regarding the link metric bit flags:
http://apiwiki.seomoz.org/link-metrics
I understand everything about how the API works and have it all working, except for this bit flag thing... and there's no example on this page of how to use the bit flags in an API request, so it's hard to work out.
My current API call is:
"http://lsapi.seomoz.com/linkscape/" & $db & "/" & $objectURL & "?
SourceCols=" & $SourceCols &
"&TargetCols=" & $TargetCols &
"&Scope=" & $Scope &
"&Sort=page_authority" &
"&Filter=" & $Filter &
"&AccessID=" & $accessID &
"&Expires=" & $expires &
"&Signature=" & $urlSafeSignature
There are 4 Link Metric Bit Flags: 2, 4, 8, and 16. How do I use those in the above URL?
Do I use those in SourceCols? And there are about 15 Link Flag Definitions (ranging from 1 to 65536). Do I use those in TargetCols?
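For what it's worth, my working assumption (not confirmed by the docs - someone please correct me) is that bit flags are combined with a bitwise OR, which for distinct flags is the same as summing them, and the single resulting integer is the parameter value:

# Assumed technique: OR (i.e. sum) the flag values you want into one integer.
# The flag names are hypothetical stand-ins for the documented values.
FLAG_A = 4
FLAG_B = 8
FLAG_C = 16

source_cols = FLAG_A | FLAG_B | FLAG_C  # == 28
print("&SourceCols=" + str(source_cols))  # one combined value, not three parameters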
I've tested different variations and I get different responses, but the responses aren't documented either.
For example, if I get "luutrp":5.432777943351583,"luutrr":1.415861867916131e-14 in a response... what ARE luutrp and luutrr?
These field names are not explained anywhere that I can find.
This link:
http://www.seomoz.org/ugc/the-busy-developers-guide-to-seomoz-bit-flags
outlines an example, but it uses Cols (not SourceCols or TargetCols), so either that page is outdated or I'm missing something. Is there also a "Cols" parameter?
Basically, I just want: given a single URL, show me all links (page to page) pointing to that URL, and give me the same information that's displayed in Open Site Explorer (page authority, anchor text, title, etc.), plus some additional information such as "is it on the same C block?"
What's an example URL for the above request?
Really hoping someone can shed some light on this under-documented API.
Ah, so people get points?! I'll have to research what they do, but don't worry Dana, the points you shall keep!
But yes, a good feature would be the ability to set a question back to unanswered.
Being new here, I'm pressing all sorts of buttons and breaking things...
I posted a question, liked someone's answer, and so clicked "good answer" thinking it was like a "thumbs up" - but it actually set my question to "answered".
How does one set their own question back to unanswered?
Hi Darcy,
I'm going through a very thorough analysis at the moment of every tool I can get my hands on - more from a technical perspective, because I was getting sick of the varying results from different tools / utilities.
I wrote my own scraper utility to get live results from Google as well. I tested 800 keywords using broad, phrase, exact, etc., and the results from Google differ from those of every tool out there.
I'm currently writing a document on all of this. Maybe I should post it somewhere, I don't know, but my strong feelings at this point are:
- Check with each provider (of the online tool or downloadable software) whether they use broad match, phrase match, or exact match. SEMRush displays results for "phrase match" but shows [exact match] search volume for that phrase. You almost need a database to remember what criteria each tool uses to return results.
- Do they allow you to filter results by domain (google.com, google.com.au, etc.)?
- In addition to the above, do they allow you to filter results by country? (e.g., if you can select "Australia", do they search on google.com.au AND give you the choice to see results for only .com.au domains, or all results that google.com.au would offer up?)
- How recent is their data? Everything costs money - whether it's API units from Google, API units from a competing tool, or the time and money spent writing their own scraping software. Free tools aren't going to update their databases every day. SEOmoz (who I've found to be quite accurate) updates their Keyword Analysis Tool database monthly. A lot can happen in 4 weeks. Getting this information can be tough: you need to email each vendor, get their answer - and then trust their answer.
- Tough one, but: how savvy is the developer of the tool? I've written my own tools and it's damn hard to get accurate data even from Google directly! There are so many variables. Poorly written software returns poor results. In writing my own stuff, different proxies would return different Google results; even different user agents (web browsers) would return different results. You need a thorough understanding of the technicalities behind obtaining data from or about Google.
Those are a few considerations off the top of my head. As I say, I'm still putting together my findings. I'm using this reply as an opportunity to hopefully spark a little discussion, as I think many people have the same problem.
The immediate solution? Write your own software, or use a few tools and average the results. I'm quite the analyst, so varying results do my head in. At the same time, is your goal to get 100% accurate results, or to get a good "feel" for the competitive landscape and move on? Sometimes I like to get things perfect; other times my client doesn't have the time to wait, and I'm not paid enough to be that anal!
If you're not comfortable with .htaccess, you can use your hosting interface (cPanel, Plesk, or similar) to add them one at a time.
Hi Dana,
Very strange - I was just reading your blog post and here you are answering my question. Thanks for the quick reply.
I agree the definition of "competitor" will vary. My question was asked with "competition" defined as anyone taking up a position on the first page, because if all 10 positions for shure microphones are taken up by major suppliers / manufacturers, then my recommendation to the client would be "don't bother".
What I'm finding with various tools (SEOmoz, SEMRush, etc.) is they all give search volumes in terms of phrase or exact matches. But when someone searches for wedding present in Google (as a broad match), the broad-match results are the results we care about.
I'm simply not confident in my own opinion, because I would have thought all these tools would have put in enough research to decide that phrase or exact results are better. Using a phrase or exact match to provide "search volume" figures makes sense to me, but if you want:
- the top 10 organic / paid results for a keyphrase
- the number of organic / paid results for a keyphrase
Then broad match would be the obvious choice.
I'm not sure if anyone from SEOmoz wants to explain why the Keyword Analysis Tool shows results using exact match, but I'm sure there's a logical reason for it - I just can't see it!
Hi all,
This is a question about whether your competition is the top 10 ORGANIC Google results for a broad match, a phrase match, or an exact match.
I'm a newbie here but not a newbie to the world of SEO. I hope to be able to answer just as many questions as I ask.
QUESTION: If a customer comes to me and says, "hey, who's my (organic) competition for wedding present?" and I want to use Google to get the top 10 organic results, do I use a broad, phrase, or exact match?
It seems many people think an exact match is the way to go, but I think they were referring more to Adwords results / competition. I'm not trying to determine search volume for Adwords, or even the search volume for organic results... I'm only interested in the top 10 competitors in the organic results.
No one types in "wedding present" (with inverted commas) when doing a search in Google, so surely to see who ranks organically for wedding present I'd simply type wedding present (no inverted commas, aka broad match).
I understand all the concepts about how Google results vary depending on whether you're logged in, etc., so I don't want to get distracted by that. And I know there's a bunch of tools we can use, like the SEOmoz Keyword Analysis Tool. But I just want to know specifically which match type people would use (broad, phrase, or exact) to look at the top 10 organic competitors when doing a manual search in Google.