Is SEOMoz only good for "ideas"?
-
Perhaps I've learned too much about the technical aspects of SEO, but nowhere have I found scientific studies backing up any of the claims made here, or a useful answer to a discussion I recently started.
Maybe it doesn't exist. I do enjoy Whiteboard Fridays; they're fantastic for new ideas, and this site is great. But I take it no proper studies examining SEO have been conducted, just the usual spin of "belief from authority".
No?
-
Exactly. And so, would it not be greatly beneficial knowledge to all of us to know if and when a limit is reached where this strategy is no longer effective?
For example, there are many PR8 sites with literally hundreds of PR6 pages that allow dofollow commenting. We can vary the anchor text and the deep link to gain links from these PR6 pages. The question is: when does this strategy become ineffective? Say our site has 100k pages. Should we spend our time getting a link from every available PR6 page on the same domain, or is there diminishing value?
A tested, reproducible study showing whether a persistent benefit exists, and when it wears off, would be invaluable to practical SEO, and the results of such a study are highly unlikely to change within a year.
Surely you'd like to see something like this too?
I do understand the need to keep SEO in line with Matt Cutts's objectives; however, the reality is that Matt Cutts's objectives and what works are two different things. There would be no such thing as off-site SEO at all if Google worked the way it's meant to. The thing is, it doesn't, and that is why off-site SEO exists.
Instead of people giving hogwash answers, we should be demanding these sorts of useful studies. That is just my opinion anyway.
-
OK... for your question... maybe a little bit more if all the links go to the same URL. However, if the links go to different URLs, you might get a lot more.
-
And when Schema.org is fully implemented, and in turn integrated into ranking factors, it's going to go through the roof as far as factors go.
-
I wasn't asking anyone to pinpoint anything to decimal places. I was only after people's views on whether it's "a little bit more" or "probably nothing", etc. That way I can see who here actually knows anything about SEO.
-
"Oh - and let's not forget that unless we have the exact same data set as any search engine, we move even further away from the mark of accepted scientific best practices methodology."
This is really important. In the past, SEOs could count all of the on-page factors, count their links, and count their anchor text. Now, as Google starts using things like social data, analytics data from Google.com, and other information that SEOs cannot count or even see, the ability to reverse engineer begins to disappear, and Google becomes less likely to be manipulated.
-
I like how EGOL summed it up: search engines won't reveal their methods.
They claim hundreds of factors, yet when those are cross-related, that leads to exponential sub-factors (and thus why Google and Bing like to tout 10,000 factors behind the hundreds).
We live in a correlation industry. Any true scientific analysis attempting to reverse engineer the actual factors is, by nature, very likely to miss something, and that something could significantly skew the results toward false signals.
Where it gets more complex is that no two situations (in any truly competitive landscape) are exactly alike, and thus the need to replicate for verification is an even more elusive task.
Then add in that hundreds of changes occur to search algorithms throughout the year (some small, some big), and now we're talking about a barrier to true scientific evaluation.
Oh - and let's not forget that unless we have the exact same data set as any search engine, we move even further away from the mark of accepted scientific best practices methodology.
On a final note, the amount of time, computing power, and analysis required in most situations is likely cost-prohibitive, since the results of such an effort cannot be recouped. That leaves it to an entity with the financial, technical, and academic willingness to take on such a task without expectation of compensation.
Rand gets hammered all the time for describing Moz's process with the tagline "correlation is not causation", even though we work in a correlation industry.
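To make the "correlation is not causation" point concrete, here is a small toy simulation (hypothetical data, not drawn from any real ranking study) in which a hidden site-quality factor drives both link counts and rankings, so the two correlate strongly even though neither causes the other:

```python
import random

def pearson(x, y):
    """Sample Pearson correlation coefficient."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

random.seed(42)
# Hidden confounder: overall site quality.
quality = [random.gauss(0, 1) for _ in range(1000)]
# Both observed variables are driven by quality plus independent noise.
links = [q + random.gauss(0, 0.5) for q in quality]
rankings = [q + random.gauss(0, 0.5) for q in quality]

r = pearson(links, rankings)
print(round(r, 2))  # strongly positive, yet links do not cause rankings here
```

A correlation study over data like this would happily report that links predict rankings; only a controlled intervention (adding links and watching what happens) could separate cause from confounder.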
-
Hi Steven,
While I don't feel it's necessary to defend SEOmoz, or SEO for that matter, it is impossible to know exactly what to do to implement perfect SEO techniques and rank in the top spot in every industry.
However, SEOmoz is one of the few places where I have watched webinars that collect vertical data and process the statistics to make heads or tails of it. I do remember reading a Google Best Practices Guide from 2010 that did something similar, but not as well as SEOmoz.
The more I read and learn from this site, the more I realize that SEO is not about tricking Google and other search engines. It is about relying on best practices to create high-quality, relevant websites that Google will appreciate. I continue to believe that if SEOs stay consistent in their practices, Google will end up changing its algorithms to suit SEOmoz's standards. Search engines will be chasing our sites, and our clients' sites, based on the excellent product we are helping produce.
SEO can be extremely frustrating if you are looking for black-and-white answers. I find the best-practices approach to be much less frustrating, and easier to manage with regard to expectations and results.
Edit
I also wanted to mention that I wish people would offer more ideas to consider regarding best practices and techniques. This is how SEO will evolve, and rarely do people get a chance to be in on the ground floor of cool stuff like this.
-
Can you tell me the diminishing value of a sitewide link vs a single homepage link for example? And how is it you know the answer to this?
Nobody is going to tell you that a sitewide link is worth 2.76788756 times the value of a homepage link.
The answer isn't that simple. Sites come in different sizes, they have different linkages, they have different navigation structures and they have different numbers of links hitting homepages and internal pages. Maybe it even matters if these links are in the footer, the top navigation or some other location.
There is a term used when people try to reverse engineer something that is really complex: fuzzy logic. I think this term has huge application in SEO.
But my personal opinion is that when we deal with questions of fuzzy logic, we make the most effective use of our time when we accept answers such as "a little bit more", "probably nothing", and "maybe a little less", then simply apply them and move on, rather than trying to prove theorems and calculate out to five decimal places.
-
Hi Ryan,
You're quite right. The site does offer some useful tools and interviews. In fact for that reason alone I will be retaining my membership. I wasn't really after a secret handshake, but more sharing of analysis data.
-
Steven, it sounds like you joined to learn the secret handshake. If that is the case then, from your point of view, you will be disappointed.
EGOL and Marcus both shared excellent perspectives on the SEOmoz site's offerings. You have also looked around the site. Based on your replies I think you have examined the site and have not missed any major components.
What SEOmoz does offer members is:
-
Tools: Site Explorer, MOZbar, and other tools to examine websites. Yes, there are similar tools out there, and some of these tools can be improved. I would like to see all of SEOmoz's tools improved to be the best in the industry; they aren't there yet.
-
Original interviews: you mentioned Whiteboard Fridays. The recent interview with Duane Forrester from Bing offered fantastic insight into upcoming changes at Bing. Being part of a network which talks to industry leaders and asks the right questions is very rewarding.
-
Active community: if you do get stuck or otherwise have a question, SEOmoz offers a place you can go to for help. From what I have seen, other communities are not very active nor do they offer the quality of feedback these forums provide.
The idea seems to be centered on providing those interested in SEO with the tools, information, and discussion space to do their jobs better. If that is not what you are looking for, then I would suggest you simply enjoy your free month and then try something else.
-
Marcus,
I appreciate the good-natured will of your post, and I thank you for it. However, as I stated to EGOL, I am by no means new to SEO. Please don't let my small post count here deceive you.
The links you provided regarding "asking the experts" are what is known in science as an "appeal to authority", which has no relevance whatsoever to evidence. Just because some prominent scientists, or SEOs, give their opinion about something, even if they are in the vast majority, does not constitute evidence. Those of us, myself included, who know more than many of the names on those lists care not for opinions, but for replicable tests. This is how any knowledge is truly understood.
I do very much respect the ideas presented here, and the community, and I have in fact learned some non-technical, SEO-related things. I especially applaud the Whiteboard Friday section. However, I'm not really after trends or patterns, but rather discussion of specific questions, such as the one raised in a previous discussion I created.
Can you tell me the diminishing value of a sitewide link vs a single homepage link for example? And how is it you know the answer to this? That is what I want studies demonstrating, so we can analyze this in detail and work out finer points of the algorithm which, despite common theory, doesn't change as much as everyone believes.
-
EGOL, I agree with you wholeheartedly, though I'm not as much of a newbie as you seem to imply.
The information is indeed kept secret to maintain a competitive advantage, though I thought this site might have revealed some of its own studies. My primary reason for joining here wasn't to learn so much as to compare analyses of various SEO algorithms, and I cannot seem to do that here.
One thing I'm sure you're aware of, given the dynamic, changing nature of SEO, is that an understanding of even some temporary aspect, be it for only a few months until it changes, can bring in huge profits.
Not all fields of SEO are like this, however. Some aspects of SEO that haven't changed for years are not even discussed, such as the topic raised in a previous discussion I created. We can study these aspects with nearly the same level of detail as the electron configuration of atoms; it simply requires observation and deduction. This is what I was hoping for.
-
Hey Steven
You're kidding, right? The "usual spin"?
It's hard to know exactly what you are looking for here, but if anyone is doing studies about SEO, it's SEOmoz and their partners within the industry.
Here are a few that just jump to mind:
1. http://www.seomoz.org/article/search-ranking-factors
The ranking factors article takes a great deal into consideration: a panel of 130 or so experts, plus correlation-based analysis of the vast amount of data gathered through the Linkscape index and other means.
2. http://www.davidmihm.com/local-search-ranking-factors.shtml
This article looks at the myriad local search ranking factors. Again, it uses a panel and a whole bunch of data to give the best possible overview of local search ranking factors.
3. http://www.distilled.net/blog/
If you want more general data and mathematical analysis of search, the folks at Distilled publish some great information.
4. SEOMoz Tools
You want data? Then the tools here will give you that. Granted, they're not perfect and they won't SEO your site for you, but they will give you some metrics to work with.
The blog at SEO Book pulls no punches and is often a great place to go for cutting analysis of what is right and wrong in search.
But, psst, come here, want to know the real secret?
Hard work - that's it, plain and simple. SEO is a closed box. The engines don't publish their algorithms, and even if they did give us a comprehensive overview of every area of search, it would likely be out of date before we finished reading it.
I kind of feel your pain: you want exact answers, specific things you can do to succeed, but SEO just doesn't work like that. There are way too many variables, and for every industry, every site, every country, every city, things can be a little different.
If you want to "examine SEO", then you need to do it from your own perspective. If you want analysis, it has to be done within the context of whatever you are trying to achieve. There are plenty of great SEOs on this board, and if you want some help understanding what you can do to rank better, then post some more details and I am sure you will get some help; I will certainly take a look.
I hope that helps a little, even if it was not exactly what you may have been looking for.
Marcus
-
Search engines such as Google don't reveal how they rank websites, and they modify their methods continuously. One of their most important goals is to avoid manipulation.
As a result, nobody knows exactly how Google ranks websites, so SEOs must be comfortable dealing with constant uncertainty.
A few people do scientific studies on how rankings work, but much of that information is kept secret to maintain a competitive advantage, and it has limited value over time because search engines change their methods.
In my opinion there are four good sources of information....
1. Basic search optimization guides, such as the Beginner's Guide to SEO.
2. Surveys of web professionals, such as the Search Engine Ranking Factors.
3. Forums such as this one, where people ask questions and share ideas.
4. Personal records that SEOs keep about the changes they made to their websites and the results that occurred.
The bottom line is that you will rarely know EXACTLY what to do. But you must draw on information from all four sources to place your best bet.
The primary caution: don't be suckered into accepting the answer you "want to hear", because usually the "most difficult to pull off" answer is the one that works best.