Posts made by James77
-
RE: Suggestion - Should OSE include "citation links" within its index?
Thanks Miranda, that would be great.
I really hope it's added - this could be a very positive differentiator between Moz and the other link research tools out there. Combining this citation link finder with the 'Just Discovered' links would, in my view, make a killer feature well above the competition.
James
-
RE: Moz's official stance on Subdomain vs Subfolder - does it need updating?
Thanks Rand - that's great information.
When you talk about rankings rising, did you see them rise for the keywords associated with http://moz.com/beginners-guide-to-seo, or are we talking about rankings for other Moz pages? I.e., did adding http://moz.com/beginners-guide-to-seo contribute to a rise in rankings across the whole domain, or just that subfolder?
I hope you consider making this into one of your WBFs/posts, as I think it would be fascinating to see the "what you did" and "what were the results", and also to get feedback on what others have experienced.
Many thanks
-
Moz's official stance on Subdomain vs Subfolder - does it need updating?
Hi,
I am drawing your attention to Moz's Domain basics here: http://moz.com/learn/seo/domain
It reads:
"Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains. (i.e. www.example.com/blog/ rather than blog.example.com) The notable exceptions to this are language-specific websites. (i.e., en.example.com for the English version of the website)." I am wondering if this is still Moz's current recommendation on the subfolders vs subdomains debate, given that the above (sort of) implies that SE's may not combine ranking factors to the domain as a whole if subdomains are used - which (sort of) contradicts Matt Cutts last video on the matter ( http://www.youtube.com/watch?v=_MswMYk05tk ) which implies that this is not the case and there is so little difference that their recommendation is to use whatever is easiest. It would also seem to me that if you were looking through the eyes of Google, it would be silly to treat them differently if there were no difference at all other than subdomain vs subfolder as one of the main reasons a user would use a sud-domain is a technical on for which it would not make sense for Google to treat differently in terms of its algorithm.
I notice that, while most of the Moz site uses subfolders, you do have http://devblog.moz.com/ - and I was wondering if this is due to a technical reason or a conscious decision, as it would seem to me that the content within this section is indeed link-worthy (it has links pointing to it from external sources), and therefore it does not seem to follow the advice posted in Moz's domain basics. I am assuming it is due to a technical reason - or that Moz's advice is out of date with current Moz thinking, and is indeed in line with Matt Cutts in that it doesn't matter.
Cheers
-
RE: Suggestion - How to improve OSE metrics for DA & PA
Cheers Pete.
I totally understand the data dependency. One thing you could do, which would not require a (long-term) data dependency, and would also help with the spam detection you're building, is to take a single snapshot of "ranking" and use it as a data set for pattern-matching spam sites. E.g. if you managed to pull, say, hundreds of thousands of ranking scores (such as traffic scores from SEMrush), you could match those against Moz's current scoring for each domain, bucket the sites into groups with higher or lower ranking scores than their DA would predict, and then try to reverse engineer the link patterns (or other patterns Moz uses) that are common to those buckets.
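As a rough sketch of what I mean (the file format, column names, and the DA-to-traffic model are all made up for illustration):

    import csv

    # Bucket domains by how far their observed traffic score (e.g. from
    # SEMrush) deviates from what their Moz DA would predict.
    def bucket_domains(path, tolerance=0.5):
        over, under, in_line = [], [], []
        with open(path) as f:
            for row in csv.DictReader(f):          # columns: domain, da, traffic
                da = float(row["da"])
                traffic = float(row["traffic"])
                predicted = expected_traffic(da)
                if traffic < predicted * (1 - tolerance):
                    under.append(row["domain"])    # DA looks inflated - spam candidates
                elif traffic > predicted * (1 + tolerance):
                    over.append(row["domain"])     # DA looks understated
                else:
                    in_line.append(row["domain"])
        return over, under, in_line

    def expected_traffic(da):
        # Placeholder model - in practice you would fit this curve from
        # the snapshot itself rather than hard-coding it.
        return 10 ** (da / 20)

The 'under' bucket then becomes the training set for reverse engineering the link patterns common to suspected spam domains.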
-
Suggestion - How to improve OSE metrics for DA & PA
I am sure everyone at Moz is aware that, although the Moz link metrics (primarily I am talking about DA and PA) are good, there is a lot of room for improvement, and there are a lot of areas where the metric values given to some types of site are well out of whack with what their "real" values should be.
Some examples:
www.somuch.com (Link Directory) - DA 72
www.articlesbase.com (Article Directory) - DA 89
www.ezinearticles.com (Article Directory) - DA 91
I'm sure everyone would agree that links from these domains are not as powerful (if of any value at all) as their DA would suggest, and therefore, by the definition of how Moz metrics work, the values of the sites that have links from such domains are also inflated - thus they throw the whole link graph out of whack.
I have 2 suggestions which could be used singly or in conjunction (and obviously alongside the other factors Moz uses to calculate DA and PA) to help move these values towards what they should more realistically be.
1/. Incorporate rank values.
This effectively means using rank values to reverse engineer the "value" Google (or other engines) place on a website. This could be achieved (if Moz were not to build the data-gathering system itself) by integrating with a company that already provides this data - e.g. Searchmetrics, SEMrush, etc. As an example, you would take a domain and pull in some rank values, e.g. http://www.semrush.com/info/somuch.com?db=us - where you could use traffic, traffic price, and traffic history as metrics within the overall Moz scoring algorithm. As you can see from my example, according to SEMrush the traffic and traffic price are extremely low for what you would expect of a website with a DA of 72. You will find the same for the other two sites, and similarly for pretty much any other site you test. This is essentially because you're tapping into Google's own ranking factors, and thereby getting more in line with what the real values (according to Google) are with respect to the quality of a website. Therefore, if you were to incorporate these values, I believe you could improve the Moz metrics.
2/. Social Sharing Value
Another strong indicator of quality is the amount of social sharing of a document, or of a website as a whole, and again you will find, as with my examples, that pages on these sites have low social metrics in comparison to what you would normally associate with sites of these DA values. Obviously to do this you would need to pull social metrics for all the pages in your link DB. Or, if this were too tech-intense to achieve, again work with a partner such as Searchmetrics, which provides "Total Social Interactions" on a domain-level basis. Divide this value by the number of Moz-crawled pages and you would have a crude value for the overall average social shareability of a page on a given site.
Obviously both of the above have their flaws if you looked at them in complete isolation; however, in combination they could provide a robust metric to use in any algorithm, and combined with the current Moz values used in the algorithm, I believe you could make big strides in improving the overall Moz metrics.
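As a very rough sketch of the kind of blend I mean (the weights, scales, and example numbers are entirely made up - in reality they would need fitting against real ranking data):

    # Blend raw DA with a rank-based signal and a social signal.
    def adjusted_da(raw_da, traffic_score, social_interactions, crawled_pages):
        # Crudely normalise each signal to a 0-100 scale.
        rank_signal = min(100.0, traffic_score / 1000.0)
        social_per_page = social_interactions / max(crawled_pages, 1)
        social_signal = min(100.0, social_per_page * 10.0)
        # Weighted blend of the three components.
        return 0.6 * raw_da + 0.25 * rank_signal + 0.15 * social_signal

    # A "DA 72" directory with very low traffic and shares comes out
    # well below 72:
    print(adjusted_da(72, traffic_score=2000,
                      social_interactions=500, crawled_pages=50000))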
-
Suggestion - Should OSE include "citation links" within its index?
This is really a suggestion (and a debate to see if people agree with me) with regard to including "citation links" within Moz tools, by default, as just another type of link.
NOTE: when I am talking about "citation links", I am talking about a link that is not wrapped in a link tag and is therefore non-clickable, e.g. moz.com
Obviously Moz has released the Mentions tool, which is great, and also FWE, which is also great. However, it would seem to me that they are missing a trick in that "citation links" don't feature in the main link index at all. We know that, as a minimum, Google uses them as an indicator to crawl a page ( http://ignitevisibility.com/google-confirms-url-citations-can-help-pages-get-indexed/ ), and also that they don't pass PageRank - HOWEVER, you would assume that Google does use them in some manner as part of its algorithm, as it does nofollow links.
It would seem to me that a "citation link" could (possibly) be deemed more important than a nofollow link in Google's algorithm, as a "no follow" link is a clear indication by the site owner that they don't fully trust the link, whereas a citation link indicates neither trust nor distrust.
So - my request is to get "citation links" into the main link index (and the Just Discovered index for that matter).
Would others agree??
-
RE: Best SEO practice - Umbrella brand with several domains
I would, if possible, try to concentrate on a single brand/domain as opposed to multiple domains - on the assumption that an identical piece of content will rank higher on a single strong domain than it would on any of your less strong segmented domains.
With a single domain you also don't have the issue of worrying about how to cross-market across domains (although, as you point out, many sites do so - Moz/OSE being one example). It also removes the worry about whether Google may see cross-linking between multiple domains as a 'link network' - unlikely if it is just a few domains, but if you are talking about tens of different domains then this could well raise a flag in Google's algorithm.
It is still unclear how Google treats multiple domains owned by the same company and on the same theme. Many websites have subdomains for their blogs and other sub-sections, and the standard thinking a few years ago was that you were better off moving those into folders. However, I think things have now changed, such that if there are enough connecting signals it doesn't matter whether it's a subdomain or a subfolder. Considering that a subdomain is technically a separate site, you would think that if similar connecting signals were to appear between 2 separate domains, Google could (and probably should) treat them as it would subdomains. However, whether there is some, all, or no truth in this will likely remain speculation until Google confirms something along these lines.
-
RE: How long before your rankings improved after Penguin?
I think it's very unlikely it takes 6 months for a disavow file to be processed - I have heard ranges from a couple of weeks to a couple of months, but no more.
If you're talking about an algorithmic penalty (like Penguin) then most likely you should see more gradual changes than if it were a manual penalty (which you would need to get removed via a re-inclusion request to see improvement).
When you mention 'go overboard', what counts as a good or bad link is for you to decide, but in my opinion, if you are under any penalty, you need to be pretty thorough in removing all links that you think could be causing you issues.
-
RE: How to NOT appear in Google results in other countries?
Do they cause you any issue other than affecting your bounce rate?
If not, then I would not try to stop them visiting your website, but just segment the bounce rates as you have done above.
The reasons I say this are threefold:
1/. Trying to prevent SEs from displaying you to certain countries could be fraught with problems, and you could well end up damaging the SERPs in your key countries.
2/. 'Country by IP' is not an exact science - you may exclude some people who are located in your target countries.
3/. If you ever expand into the countries you currently want to block, would it not be nice to already have traffic from that market? And what about customers from your target countries who are searching while on holiday abroad?
Hope that helps
-
RE: Robots.txt and Magento
I assume this is a robots.txt that has been automatically created by Magento - or has it been created by a developer?
I ran it through a tool and it showed 1 error and 10 warnings - so I would say you definitely need to do something about it.
The reason for all those disallows is to try to stop search engines indexing those paths (whether the engines would even find and index them without the disallows is debatable).
What you could do is set up robots.txt as you have suggested, and then stop the SEs indexing the directories or pages you don't want indexed via the appropriate webmaster tools.
I don't like displaying a lot of 'don't index' paths in robots.txt, as it is pretty much telling any hacker or nasty spider where your weak points may be.
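For example, rather than cataloguing every private path, something like this keeps robots.txt slim and moves the 'don't index' instruction onto the pages themselves (paths made up for illustration):

    User-agent: *
    Disallow: /checkout/
    # robots.txt kept minimal - no list of sensitive paths for anyone to read

And on the individual pages you don't want indexed:

    <meta name="robots" content="noindex, nofollow">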
-
RE: Dramatic decline in rankings
Firstly - check in Google Webmaster Tools to see if you have got a manual penalty.
Secondly, make sure you haven't done anything silly like blocking search engines in robots.txt or response headers.
Thirdly, I would do a full link audit of your site - use OSE, Ahrefs, Majestic, and WMT. Go through the links and ask yourself whether they were honestly and naturally placed and whether Google could have an issue with them - and if so, try to get them removed.
-
RE: To all the PPC expert :
With regard to the above, I think the tool you are looking for is http://www.google.co.uk/intl/en/adwordseditor/ - this is basically a desktop app that allows you to manage all your campaigns. For some reason Google doesn't promote it very much, so a lot of people don't know it exists. It will save you a lot of time when creating and managing large campaigns.
With regards to managing your budgets, it really depends on what you're trying to do, and AdWords is so flexible (and pretty confusing unless you're an expert!) in this respect that it is hard to give a single answer. You can manage budgets on many different levels, from the overall campaign down to individual keywords.
The thing with AdWords is that it's an organic learning process and needs constant monitoring and adjustment. My advice would be to make sure you have daily budget caps in place, and then try putting max bids on your ad groups. Then, as time goes by, monitor your ROI on these campaigns and adjust accordingly.
-
RE: Do 'Just Discovered' Links get added to the main link index?
Hi Sam,
That's great to know. IMO the ability to get these 'Just Discovered' links into the main index is a significant advantage for Moz over other tools. Given a 'just discovered' link must first be tweeted, these links are (generally) more important than non-tweeted links, and therefore of more importance to the overall link profile.
Cheers
-
RE: Removing the clutter of site-wide links
ahrefs.com provides the separation of sitewide and non-sitewide links as a standard headline metric, and it is a useful one.
My question wasn't about the effect of sitewide links (we all know that manipulated sitewides are a red rag to a bull - but they can equally be totally natural and clean), but about the fact that they create an awful lot of noise in some of the reports, and it would be nice to be able to filter that noise out.
-
Removing the clutter of site-wide links
I have a multi-part question with regard to the Moz link index and some presentation suggestions.
Firstly, I would be interested to know how the link index treats sitewide links with regard to metrics such as DA and PA. We all know that SEs are unlikely to pass full link value across sitewide links, and therefore it would make sense for the Moz values to account for this as well - if they do not already.
One annoying thing that also relates to sitewides is that they tend to clutter much of the information presented in a few of the tools (you can't see the wood for the trees, as it were). This is most prominent on the "Just Discovered" page - if you have a sitewide link on a large site, you can often find that this screen is just totally filled with these links as they are found. It would be very useful to be able to filter these out, as they are of little interest - currently I can't see a way of doing so.
A further value where they create too much noise is the 'Total Links' value. Where sitewides are included, this value actually becomes pretty meaningless, as you can find that the majority of it is sitewides. It would therefore be useful if there were another value for 'Total Links - Excluding Sitewides', where maybe a value of 1 was just added to the count for each sitewide.
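To illustrate the adjusted count (domain names made up, and with a simple threshold standing in for however Moz would actually detect sitewides):

    from collections import Counter

    # 'Total Links - Excluding Sitewides': any linking root domain with more
    # than `threshold` links is treated as a single sitewide and counted once.
    def adjusted_total(linking_root_domains, threshold=10):
        counts = Counter(linking_root_domains)
        return sum(1 if n > threshold else n for n in counts.values())

    links = ["siteA.com"] * 5000 + ["siteB.com"] * 2 + ["siteC.com"]
    print(len(links))             # 5003 - raw Total Links, dominated by one sitewide
    print(adjusted_total(links))  # 4 - the sitewide collapses to a single count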
-
RE: Is it OK to Delete a Page and Move Content to Another Page without a 301 re-direct
Yes, I would certainly recommend 301'ing the link - or better, could you not simply overwrite the old page (A) with the new page you are intending to create (B) - unless of course B already exists?
If you're struggling with managing the 301, then I would first check whether there are any external backlinks going to page A. If there are, then I would certainly try to find a way to 301. If not, then it will not affect things too much, as there is no external link equity going to the page to lose.
A further option, if you are unable to 301 and are copying the content from page A to page B, is to rel=canonical A to B, which should pass the link equity across in a similar way to a 301.
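For example, assuming an Apache server (paths made up), the 301 is a one-liner in .htaccess:

    Redirect 301 /old-page-a/ http://www.example.com/new-page-b/

And the canonical alternative is a single tag in the <head> of page A:

    <link rel="canonical" href="http://www.example.com/new-page-b/" />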
-
Do 'Just Discovered' Links get added to the main link index?
Hi,
I was wondering if the 'Just Discovered' links get added to the main link crawl index?
It would seem to make sense for them to do so, as this would enable the link index to be more up to date than it would otherwise be. Observing the link index, it would seem that at the moment this does not happen and they are totally separate indexes (based on personal observation).
Thanks
-
RE: Doubt with no follow links: disavow or no action?
It really depends on what those 'no follow' links are. Generally they cannot hurt you, but if they are done in a manipulative way, e.g. mass blog comments, and furthermore if they are on bad sites, then they can hurt you - see Matt Cutts' video for more info on this: http://www.youtube.com/watch?v=QSEqypgIJME
-
RE: Any way to track rank of a URL for keyword BEFORE setting up in Rank Tracker?
Have a look at SEMRUSH.com - they provide historical data.
-
RE: Linking without losing link equity.
It would be for just some links, not all.
For example, in my main navigation I have a 'shop' which is run as an affiliate on a subdomain of the affiliate's website. As it's in the primary nav, it's on every page of the website, so in theory it will be passing an awful lot of link equity.
Currently I have the link set to 'no follow', but from what I have read, although no link equity is passed to the target site, my website is still losing that equity as if it were a normal link.
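I.e. the link currently looks something like this (URL made up):

    <a href="http://shop.affiliate-example.com/" rel="nofollow">Shop</a>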
Thanks
-
Linking without losing link equity.
Hi,
I was wondering if anyone had a solution for linking without losing link equity?
From what I have read, using 'no follow' on both internal and external links DOES NOT pass any equity across the link to the link target, but the latest thinking also goes that it DOES lose link equity (as if it were a 'follow' link).
So is there another method of linking that retains link equity?
Thanks
-
RE: Google shutting down Rank Tracking Software? - Raven and Ahrefs close down ranking results.
Thanks Rand - that's good to hear.
-
Google shutting down Rank Tracking Software? - Raven and Ahrefs close down ranking results.
Hi,
I was wondering if other people had noticed the (very sudden) closing down of the ranking tools for both Raven and Ahrefs - both of which were major parts of their software. Given the sudden and synchronised timing, my guess is that Google is throwing its weight around, banging on the doors of companies offering ranking software with the threat of a fat lawsuit!
What is everyone else using for their rank checking?
Will this impact SEOMOZ ranking results?
-
Making AJAX called content indexable
Hi,
I've read up a bit on making AJAX-called content indexable, and there seem to be a number of options available - the recommended methods seem to change over time.
My situation is this:
On product pages I have a list of reviews, of which I show the latest 10.
The rest of the reviews are paginated: if the user clicks a "next" button, the next set loads into the same page via AJAX.
Now, ideally I would like all this content indexable, as we have hundreds of reviews per product - but at the moment only the latest 10 reviews are indexed.
So what is the best / simplest way of getting Google to index all these reviews and associate them with this product page?
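One option I've seen suggested is progressive enhancement - making the "next" button a real link to a crawlable URL and having JavaScript hijack the click - something like this (URLs made up):

    <!-- "Next" is a real link search engines can follow to an indexable page
         of reviews; JavaScript intercepts the click and loads the same set
         into the current page via AJAX instead. -->
    <a href="/product-x/reviews?page=2" id="next-reviews">Next reviews</a>

But I don't know whether that alone is enough for Google to associate the paginated reviews with the product page itself.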
Many thanks
-
Do links in script tags pass value?
Hi, I was wondering if there was any consensus over whether links in script tags pass any value - such as the link code below?
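For illustration, something along these lines (a made-up example), where the link only exists inside a script:

    <script>
      // the link is written out by JavaScript rather than sitting in the HTML
      document.write('<a href="http://www.example.com/">Example Site</a>');
    </script>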
Thanks
-
RE: Query for checking if a link to domain A already exists on domain B
Thanks Sebastian,
It's a bit of a conundrum I am having problems solving - essentially it needs to be very quick and not resource-intensive. In this case, it looks like the search engines don't want to provide the data.
Thanks
-
RE: Query for checking if a link to domain A already exists on domain B
Hi Sebastian,
Not quite - I'm actually looking for something simpler than that.
Suppose, for example, you took your own website (whatever that is), and you want to find out if there is a link anywhere on SEOMOZ.org to your website.
Is there a simple query you could punch into a search engine to provide a list of results showing all pages on SEOMOZ that have links to your website?
-
RE: Query for checking if a link to domain A already exists on domain B
Close, but I don't think that's correct - unless I have hundreds of links from the BBC.
-
RE: Query for checking if a link to domain A already exists on domain B
That's not really what I am after, to be honest. For link building, I just want a very quick, simple query I can punch into a search engine to check if I already have a link on a given website.
-
RE: Query for checking if a link to domain A already exists on domain B
The problem with Open Site Explorer is that the index only gets updated once a month, so it wouldn't show any links that have been acquired in the current month.
-
RE: Query for checking if a link to domain A already exists on domain B
Hi - thanks for the answers, but I think you misunderstood my question.
I am looking for a query I would use to check if Domain A LINKS to Domain B.
E.g. what query would I use, and on what search engine, to check if there was any link on SEOMOZ pointing to my website?
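The nearest thing I've come up with so far is an exact-match search in Google along these lines (my domain being hypothetical here):

    site:seomoz.org "mywebsite.com"

But that matches mentions of the URL text rather than actual anchor tags, so it's only an approximation.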
-
Query for checking if a link to domain A already exists on domain B
Hi,
I was wondering if anyone can help. I need a simple check where I have 2 domains, and I can check if there are any links from domain A to domain B.
Does anyone know what would be the best query for this, and whether you would use Google, Bing, Yahoo, or another SE?
Many thanks
-
RE: Any Tool Suggestions for Sharing Bookmarks in a small group?
Cheers Mate - Will take a look.
Thanks
-
Any Tool Suggestions for Sharing Bookmarks in a small group?
Hi,
We have a small group of co-workers, and we are constantly firing round emails of great articles we have found, new websites to read, etc.
Emails are just very cumbersome and get lost, so I am wondering if people can suggest any tools which would be good for creating a "work board", where we can quickly mark and comment on articles/websites each of us finds and have them posted across the other people's boards.
It needs to be fast, user-friendly, allow grouping by topics, and be visually appealing (and ideally let you mark off when you have read something).
Does anyone have any suggestions for a tool along these lines?
Thanks
-
RE: Domain and subdomain comparison - a deeper look into the subdomain metrics
I wouldn't worry about it too much, to be honest - you're better off spending your time building great content and getting people to link to it rather than watching what your competition's stats say. Focus on getting your own stuff right, and you'll soon be leaving the competition wondering what happened.
-
RE: Domain and subdomain comparison - a deeper look into the subdomain metrics
To be honest, a DA in the 30s is pretty low. If you concentrate on getting 30-50 good quality links, you should easily get into the 40s.
-
RE: Google driving me Nuts - How do you combine 2 accounts?
Ummm - it's absolutely infuriating. I now have to constantly use 2 separate browsers so I can be logged into one account for my email (work email) and another for Google's other tools.
The fact that I now have 2 G+ accounts (effectively automatically created) and can do nothing about it is a huge turn-off. If I weren't effectively forced to use G+ because of my job, there is no way I would use something which created conflicting accounts for me that I couldn't do anything about.
-
RE: Domain and subdomain comparison - a deeper look into the subdomain metrics
It all depends on where the links are pointing - remember, www. is also a subdomain.
-
Google driving me Nuts - How do you combine 2 accounts?
I know this must be driving a lot of other people mad, as I see loads of people who now have 2 registered accounts at Google Plus due to Google's seemingly terrible ability to merge or connect accounts.
We have a work email address set up through Google, and I have a personal Gmail address. In Google Plus I now have 2 profiles - even though I have not signed up to Google Plus with my work email, I cannot add this email to the Google+ account set up on my personal email, as it just tells me to log into that account, taking me to a page to set up a profile for it.
Has anyone managed to solve this problem? It is happening to everyone in the company, driving us all nuts, and our IT guys have no idea how to solve it.
We're trying to use G+ for the purposes of SEO and marketing, but if Google makes it this cumbersome for people to use, it is going to die a quick death. After a few weeks of use - and noticing the huge number of dead accounts, the only live accounts being SEO/internet marketing related, and the huge number of duplicate accounts - I think their user figures are hugely suspect!
Rant over - anyone know how to merge accounts?
-
RE: Some thoughts on MozTrust based on OSE Findings X ref'd with SERPS
Thanks Jackie - I'll do that now.
-
RE: Why do some sites have a higher Page Authority than Domain Authority in OSE??
Hi Rob & Rand,
Many thanks for helping explain the concept. What I am trying to understand is whether DA is a calculation based on the PA of all pages on that domain (or whether it is possible to work out DA from the PA of all pages on that domain), or a totally separate calculation which incorporates other metrics. E.g.:
A domain has 2 pages, A (PA 50) and B (PA 40). Is it possible to work out what DA is for this domain from these 2 values alone?
Many thanks
-
RE: Is the DoFollow vs NoFollow ratio important?
Ryan is correct. You should not worry about it - if you worry about it, you will naturally try to alter the ratio to something that is NOT natural.
My guess is that there is a flag in Google's algorithm for when it sees a ratio of dofollow to nofollow links that is well above the norm. SEOs concentrate on dofollow links, so a very high ratio (one that would not occur naturally) is an indication of possible manipulative tactics.
-
Why do some sites have a higher Page Authority than Domain Authority in OSE??
Hi,
I have noticed when using OSE that, when you enter a domain, you very often see a higher Page Authority than Domain Authority.
If someone could explain why this happens I would be very grateful - it's my current understanding that Page Authority would ALWAYS be LESS THAN Domain Authority, but that is not always the case (I have seen cases where PA is more than 10 higher than DA).
Here's an example where PA > DA
http://www.opensiteexplorer.org/links.html?site=www.primelocation.com
Thanks
-
RE: Duplicate Content http://www.website.com and http://website.com
You can set a preference for whether you want your domain to be www or not in Google Webmaster Tools.
I would personally not worry too much about your site showing the same content on www and the root without a full redirect in place - SEs are smart enough to know this is a common problem (especially amongst genuine mom & pop sites which are not that internet-savvy but are perfectly legitimate), and they (IMO) are very unlikely to apply any penalty.
-
APIs for User Profiles other than Alexa
Hi,
We are building an app that requires some audience profiling of websites as part of its architecture.
Alexa provides some of this info, along with visitors by country, which is useful. However, I am quite skeptical of the information it provides, as it is well known that it can be way off (particularly on smaller sites) - so are there any other APIs that provide better data along the same lines as Alexa?
NOTE - the app will be profiling websites without having their authorised logins etc., so it needs to be an API that doesn't require authorisation from the website to get the data.
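As a sketch of the sort of integration I mean (the endpoint, parameters, and response fields are entirely hypothetical - not a real service):

    import requests

    # Hypothetical audience-profiling API: pass a domain, get back audience
    # data without needing the site owner's authorisation.
    def audience_profile(domain, api_key):
        resp = requests.get(
            "https://api.example-provider.com/v1/profile",
            params={"domain": domain, "key": api_key},
            timeout=10,
        )
        resp.raise_for_status()
        # e.g. {"visitors_by_country": {...}, "demographics": {...}}
        return resp.json()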
Many Thanks
-
RE: Quick Survey - How much would you pay for a blog post?
OK - let me put this another way. You're an SEO company, right? I assume blog posting in your clients' niches is part of what you offer within your SEO services?
If you're in business, I assume you charge for this service - therefore it has a value (in $ terms) to both you and your client. What I am asking is how you would put a value on this service, as you would obviously need to put a value on it to invoice your client.
-
RE: Quick Survey - How much would you pay for a blog post?
I would hope for all three, as they should not occur in isolation.
Point C should only be a by-product of both A and B.
I would disagree with your point that DA doesn't mean "anything" in relation to A and B. If you were to attempt to get an article onto a quality, high-authority site, such a site is not going to publish an article that is not relevant or interesting to its audience - so that covers B. And on average, a higher-authority website will have a larger audience than a lower-authority site - so that covers A.
This is really digressing from the initial point of the post, though, which was to get a gauge of what people would approximately be willing to pay for a blog post - given equal niche relevance, but different DAs. Think of it like this: if you had the opportunity to get 100 posts on 100 DA 90 sites or on 100 DA 10 sites, with equal niche targeting, which would you take? Obviously the DA 90 sites - but what sort of $ differential would you put on the difference?
-
RE: Quick Survey - How much would you pay for a blog post?
Thanks Philipp,
I do agree with all you say, but I am trying to get a subjective feel based on an "average". I am not trying to say that you should only consider DA.
Your point that $15 - $50 is reasonable for an "average" blog is helpful, as I would put an "average" blog in the region of, say, DA 25 - DA 40 - which matches up with what I would pay. However, I also assume you would be willing to pay a lot more than $50 for a post on, say, Mashable, CNN, the BBC, etc.