James77
@James77
Job Title: CEO
Company: Chillisauce
Website Description
The World's Greatest Hen and Stag Weekends
Set up https://chillisauce.com from a bedroom a little over 10 years ago, and have grown it to a seven-figure turnover employing around 100 people. Chillisauce is a specialist events company providing custom-made stag dos, hen parties, corporate events and activity weekends.
Favorite Thing about SEO
Getting creative and doing things a bit differently
Latest posts made by James77
-
RE: Suggestion - Should OSE include "citation links" within its index?
Thanks Miranda, that would be great.
I really hope it's added - this could be a very positive differentiator between Moz and the other link research tools out there. Combining this citation link finder with the 'Just Discovered' links would, in my view, make a killer feature well above the competition.
James
-
RE: Moz's official stance on Subdomain vs Subfolder - does it need updating?
Thanks Rand - that's great information.
When you talk about rankings rising, did you see them rise for the KWs associated with http://moz.com/beginners-guide-to-seo, or are we talking about rankings for other Moz pages? I.e., did adding http://moz.com/beginners-guide-to-seo contribute to a rise in rankings across the whole domain, or just that subfolder?
I hope you consider making this into one of your WBFs / posts, as I think it would be fascinating to see the "What you did", "What were the results" etc., and also to get feedback on what others have experienced.
Many thanks
-
Moz's official stance on Subdomain vs Subfolder - does it need updating?
Hi,
I am drawing your attention to Moz's Domain basics here: http://moz.com/learn/seo/domain
It reads:
"Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains. (i.e. www.example.com/blog/ rather than blog.example.com) The notable exceptions to this are language-specific websites. (i.e., en.example.com for the English version of the website)." I am wondering if this is still Moz's current recommendation on the subfolders vs subdomains debate, given that the above (sort of) implies that SE's may not combine ranking factors to the domain as a whole if subdomains are used - which (sort of) contradicts Matt Cutts last video on the matter ( http://www.youtube.com/watch?v=_MswMYk05tk ) which implies that this is not the case and there is so little difference that their recommendation is to use whatever is easiest. It would also seem to me that if you were looking through the eyes of Google, it would be silly to treat them differently if there were no difference at all other than subdomain vs subfolder as one of the main reasons a user would use a sud-domain is a technical on for which it would not make sense for Google to treat differently in terms of its algorithm.
I notice that, while most of the Moz site uses subfolders, you do have http://devblog.moz.com/ - and I was wondering if this is due to a technical reason or a conscious decision. It would seem to me that the content within this section is indeed link-worthy (it has links pointing to it from external sources), so it does not appear to follow the advice posted in Moz's basics on domains. I am therefore assuming it is due to a technical reason - or that Moz's advice is out of date with current Moz thinking, and is in fact in line with Matt C in that it doesn't matter.
Cheers
-
RE: Suggestion - How to improve OSE metrics for DA & PA
Cheers Pete.
I totally understand the data dependency. One thing you could do, which would not require a long-term data dependency and would also help with the spam detection you're building, is to take a single snapshot of "ranking" and use it as a data set to pattern-match spam sites. E.g., if you managed to pull, say, hundreds of thousands of ranking scores (say, traffic scores from SEMrush), you could match them against Moz's current scoring for each domain, bucket the sites into groups with higher or lower ranking scores than DA would predict, and then try to reverse engineer the link patterns (or other patterns Moz uses) that are common to those buckets.
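A minimal sketch of what that bucketing might look like, in Python. The helper names (fetch_moz_da, fetch_semrush_traffic) and the DA-to-traffic mapping are purely illustrative assumptions, not real Moz or SEMrush APIs:

```python
def bucket_by_residual(domains, fetch_moz_da, fetch_semrush_traffic):
    """Group domains by how far their observed traffic deviates from
    what their DA would predict, as a seed set for spam pattern mining."""
    buckets = {"much_lower": [], "as_expected": [], "much_higher": []}
    for domain in domains:
        da = fetch_moz_da(domain)                # 0-100 Domain Authority
        traffic = fetch_semrush_traffic(domain)  # monthly organic visits
        # Crude expected traffic from DA (an assumed log-scale mapping,
        # not a real Moz formula): DA 50 ~ 10^5 visits, DA 70 ~ 10^7.
        expected = 10 ** (da / 10.0)
        ratio = (traffic + 1) / expected
        if ratio < 0.1:
            buckets["much_lower"].append(domain)   # DA looks inflated - spam candidate
        elif ratio > 10:
            buckets["much_higher"].append(domain)  # DA looks understated
        else:
            buckets["as_expected"].append(domain)
    return buckets

# Example with stub fetchers: a DA 72 site with ~500 visits lands in "much_lower".
print(bucket_by_residual(
    ["somuch.com"],
    fetch_moz_da=lambda d: 72,
    fetch_semrush_traffic=lambda d: 500,
))
```

The "much_lower" bucket is the interesting one: reverse engineering the link patterns common to those domains is where the spam signal would come from.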
-
Suggestion - How to improve OSE metrics for DA & PA
I am sure everyone at Moz is aware that, although the Moz link metrics (primarily I am talking about DA and PA) are good, there is a lot of room for improvement, and there are a lot of areas where the metric values given to some types of site are well out of line with what their "real" values should be.
Some examples:
www.somuch.com (Link Directory) - DA 72
www.articlesbase.com (Article Directory) - DA 89
www.ezinearticles.com (Article Directory) - DA 91
I'm sure everyone would agree that links from these domains are not as powerful (if of any value at all) as their DA would suggest, and therefore, by the definition of how Moz metrics work, the scores of the sites that have links from these domains are also inflated - thus they throw the whole link graph out of whack.
I have two suggestions which could be used singly or in conjunction (and obviously alongside the other factors that Moz uses to calculate DA and PA) to help move these values closer to what they should realistically be.
1/. Incorporate rank values.
This is effectively using rank values to reverse engineer the "value" that Google (or other engines) place on a website. This could be achieved (if Moz were not to build the data-gathering system itself) by integrating with a company that already provides this data - e.g. Searchmetrics, SEMrush etc. As an example, you would take a domain and pull in some rank values, e.g. http://www.semrush.com/info/somuch.com?db=us - where you could use traffic, traffic price and traffic history as a metric within the overall Moz scoring algorithm. As you can see from my example, according to SEMrush the amount of traffic and the traffic price are extremely low for what you would expect of a website with a DA of 72. You will find the same for the other two sites, and similarly for pretty much any other site you test. This is essentially because you're tapping into Google's own ranking factors, and thereby moving more in line with what the real values (according to Google) are with respect to the quality of a website. Therefore, if you were to incorporate these values, I believe you could improve the Moz metrics.
2/. Social Sharing Value
Another strong indicator of quality is the amount of social sharing of a document, or of a website as a whole, and again you will find, as with my examples, that pages on these sites have low social metrics compared with what you would normally associate with sites of these DA values. Obviously, to do this you would need to pull social metrics for all the pages in your link DB. Or, if that were too tech-intense to achieve, again work with a partner such as Searchmetrics, which provides "Total Social Interactions" on a domain-level basis. Divide this value by the number of Moz-crawled pages and you would have a crude value for the overall average social share value of a page on a given site.
Obviously both of the above have their flaws if you looked at them in complete isolation; however, in combination they could provide a robust metric to use in any algorithm, and in combination with the current values Moz uses in its algorithm, I believe you could make big strides in improving the overall Moz metrics.
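To make the combination concrete, here is a hedged sketch of how the two signals might be blended into an adjusted score. The weights, the log normalisation and the input names are illustrative assumptions only - this is not Moz's actual algorithm:

```python
import math

def adjusted_authority(da, semrush_traffic, total_social_interactions, crawled_pages):
    """Blend raw DA with a rank-value signal and a per-page social signal.
    All constants below are made-up illustrations, not real Moz weights."""
    # Signal 1: rank value - log of observed organic traffic, capped to 0-1.
    rank_signal = min(math.log10(semrush_traffic + 1) / 7.0, 1.0)
    # Signal 2: average social interactions per crawled page, capped to 0-1.
    per_page_social = total_social_interactions / max(crawled_pages, 1)
    social_signal = min(math.log10(per_page_social + 1) / 3.0, 1.0)
    # Blend: keep DA dominant, let the external signals pull it up or down.
    blended = 0.6 * (da / 100.0) + 0.25 * rank_signal + 0.15 * social_signal
    return round(blended * 100, 1)

# Example: a DA 72 directory with almost no traffic or shares drops sharply.
print(adjusted_authority(da=72, semrush_traffic=500,
                         total_social_interactions=2000,
                         crawled_pages=100000))  # ~52.9, down from a raw DA of 72
```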
-
Suggestion - Should OSE include "citation links" within its index?
This is really a suggestion (and a debate, to see if people agree with me) with regard to including "citation links" within Moz tools, by default, as just another type of link.
NOTE: when I talk about "citation links" I mean a link that is not wrapped in a link tag and is therefore non-clickable, e.g. moz.com
Obviously Moz has released the Mentions tool, which is great, and also FWE, which is also great. However, it seems to me that they are missing a trick in that "citation links" don't feature in the main link index at all. We know that Google, as a minimum, uses them as an indicator to crawl a page ( http://ignitevisibility.com/google-confirms-url-citations-can-help-pages-get-indexed/ ), and also that they don't pass PageRank - HOWEVER, you would assume that Google does use them as part of its algorithm in some manner, as it does nofollow links (a rough sketch of how detecting such links might work is at the end of this post).
It would seem to me that a "citation link" could (possibly) be deemed more important than a nofollow link in Google's algorithm, as a "nofollow" link is a clear indication by the site owner that they don't fully trust the link, whereas a citation link indicates neither trust nor distrust.
So - my request is to get "citation links" into the main link index (and the Just Discovered index for that matter).
Would others agree??
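As promised above, here is a rough sketch of how a crawler might detect citation links: find URL-shaped strings in a page's text that are not wrapped in an a tag. The BeautifulSoup calls are real; the URL regex is a deliberately simplistic assumption:

```python
import re
from bs4 import BeautifulSoup

# Crude URL-shaped pattern: scheme and www optional, TLD required.
URL_PATTERN = re.compile(
    r"\b(?:https?://)?(?:www\.)?[a-z0-9-]+\.[a-z]{2,}(?:/\S*)?", re.I)

def find_citation_links(html):
    """Return URL-like strings that appear in visible text but not in <a> tags."""
    soup = BeautifulSoup(html, "html.parser")
    # Anything inside an <a> tag is an ordinary link, not a citation.
    for a in soup.find_all("a"):
        a.decompose()
    # URL-shaped strings left in the remaining text are citation candidates.
    return URL_PATTERN.findall(soup.get_text(" "))

html = '<p>See <a href="https://example.com">this</a> and also moz.com for more.</p>'
print(find_citation_links(html))  # ['moz.com']
```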
-
RE: Best SEO practice - Umbrella brand with several domains
I would, if possible, try to concentrate on a single brand/domain as opposed to multiple domains - on the assumption that an identical piece of content will rank higher on a single strong domain than it would on any of your less strong segmented domains.
With a single domain you also don't have the issue of worrying about how to cross-market across domains (although, as you point out, many sites do so - Moz / OSE being one example). It also removes the worry about whether Google may see cross-linking between multiple domains as a 'link network' - unlikely if it is just a few domains, but if you are talking about tens of different domains then this could well raise a flag in Google's algorithm.
It is still unclear how Google treats multiple domains owned by the same company and on the same theme. Many websites have subdomains for their blogs and other sub-sections, and the standard thought a few years ago was that you were better off moving those into folders. However, I think things have now changed, such that if there are enough connecting signals it doesn't matter whether it's a subdomain or a subfolder. Considering that a subdomain is technically a separate site, you would think that if similar connecting signals were to appear between two separate domains, Google could (and probably should) treat them as it would subdomains. However, whether there is some, all or no truth in this is likely to remain speculation until Google confirms something along these lines.
-
RE: How long before your rankings improved after Penguin?
I think it's very unlikely it takes six months for a disavow file to be processed - I have heard ranges from a couple of weeks to a couple of months, but no more.
If you're talking about an algorithmic penalty (like Penguin), then you should most likely see more gradual changes than with a manual penalty (which you would need to get removed via a re-inclusion request to see improvement).
When you mention 'going overboard', what counts as a good or bad link is a matter for you to decide, but in my opinion, if you are under any penalty you need to be pretty thorough in removing all the links that you think could be causing you issues.
-
RE: How to NOT appear in Google results in other countries?
Do they cause you any issue other than affecting your bounce rate?
If not, then I would not try to stop them visiting your website; just segment the bounce rates as you have done above (see the sketch after the list below).
The reason I say this is threefold:
1/. Trying to prevent SEs from displaying you to users in certain countries could be fraught with problems, and you could well end up damaging the SERPs in your key countries.
2/. 'Country by IP' is not an exact science - you may exclude some people who are located in your target countries.
3/. If you ever expand into the countries you currently want to block, would it not be nice to already have traffic from that market? And what about customers from your target countries who are searching while on holiday abroad?
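As a sketch of that segmentation, assuming you can export sessions with a country field and a bounce flag (the column names here are made up, not a real analytics export schema):

```python
import pandas as pd

# Toy export: one row per session, with a 0/1 bounce flag.
sessions = pd.DataFrame({
    "country": ["UK", "UK", "UK", "IN", "IN", "BR"],
    "bounced": [0,    1,    0,    1,    1,    1],
})

# Bounce rate per country: report separately rather than blocking traffic.
bounce_by_country = sessions.groupby("country")["bounced"].mean().mul(100).round(1)
print(bounce_by_country)
# BR 100.0, IN 100.0, UK 33.3 - the blended site-wide rate hides this split.
```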
Hope that helps
Best posts made by James77
-
RE: Any way to track rank of a URL for keyword BEFORE setting up in Rank Tracker?
Have a look at SEMrush.com - they provide historical data.
-
RE: Do 'Just Discovered' Links get added to the main link index?
Hi Sam,
That's great to know. IMO the ability to get these 'just discovered' links into the main index is a significant advantage for Moz over other tools. Given that a 'just discovered' link must first be tweeted, these links are (generally) more important than non-tweeted links, and therefore of more importance to the overall link profile.
Cheers
-
RE: Dramatic decline in rankings
Firstly, check in Google Webmaster Tools to see whether you have a manual penalty.
Secondly, make sure you haven't done anything silly like blocking search engines in robots.txt or via response headers (a quick sketch of automating this check follows below).
Thirdly, I would do a full link audit of your site - use OSE, Ahrefs, Majestic and WMT. Go through the links and ask yourself whether they were honestly and naturally placed and whether Google could have an issue with them - and if so, try to get them removed.
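On the second point, here is a quick sketch of those checks using the third-party requests library. The URL is an example, and the robots.txt test is deliberately crude - it just flags anything worth a manual look:

```python
import requests

def quick_block_check(url):
    """Flag obvious crawl/index blocks: robots.txt, X-Robots-Tag, meta noindex."""
    parts = url.split("/", 3)  # ['https:', '', 'example.com', '...']
    robots = requests.get(f"{parts[0]}//{parts[2]}/robots.txt", timeout=10)
    if "Disallow: /" in robots.text:
        print("robots.txt contains a Disallow rule - inspect it manually")
    page = requests.get(url, timeout=10)
    # Both the X-Robots-Tag header and a meta robots tag can block indexing.
    if "noindex" in page.headers.get("X-Robots-Tag", "").lower():
        print("X-Robots-Tag response header is blocking indexing")
    if "noindex" in page.text.lower():
        print("page body mentions noindex - check its meta robots tag")

quick_block_check("https://example.com/")
```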
-
RE: Doubt with no follow links: disavow or no action?
It really depends on what those 'nofollow' links are. Generally they cannot hurt you, but if they were created in a manipulative way, e.g. mass blog comments, and furthermore if they are on bad sites, then they can hurt you - see this Matt Cutts video for more info: http://www.youtube.com/watch?v=QSEqypgIJME
-
RE: Ultimate Ranking Tool integrating Analytics / Adwords / Google WM Tools
You can colour-code all your competitors (and your keywords) in AWR, so this may help. Otherwise you can set up multiple campaigns, each with a different set of competitors.
-
RE: Is it OK to Delete a Page and Move Content to Another Page without 301 re-direct
Yes, I would certainly recommend 301'ing the link - or, better, could you simply overwrite the old page (A) with the new page you are intending to create (B) - unless, of course, B already exists?
If you're struggling with managing the 301, I would first check whether there are any external backlinks pointing to page A. If there are, then I would certainly try to find a way to 301. If not, then it will not affect things too much, as there is no external link equity going to the page to lose.
A further option, if you are unable to 301 and are copying the content from page A to page B, is to rel=canonical A to B, which should pass the link equity across in a similar way to a 301.
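To illustrate the two fallbacks, here is a minimal sketch using Flask purely as an example framework (the routes and URLs are made up):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Option 1: permanent redirect from the deleted page A to page B.
@app.route("/old-page-a")
def old_page_a():
    return redirect("/new-page-b", code=301)

# Option 2: if a redirect isn't possible, serve A's content with a
# canonical tag pointing at B, which passes equity in a similar way.
@app.route("/page-a-kept")
def page_a_kept():
    return ('<html><head>'
            '<link rel="canonical" href="https://example.com/new-page-b">'
            '</head><body>Copied content...</body></html>')
```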