What do you think of SearchMetrics' claim that there are no longer universal ranking factors?
-
I agree that Google's machine learning/AI means that Google is using a more dynamic set of factors to match searcher intent to content, but this claim feels like an overstatement:
"Let's be quite clear: Except for important technical standards, there are no longer any specific factors or benchmark values that are universally valid for all online marketers and SEOs. Instead, there are different ranking factors for every single industry, or even every single search query. And these now change continuously."

Keyword-relevant content, backlinks, etc. still seem to be ranking factors across pretty much all queries/industries. For example, I can't think of a single industry where it would be a good idea to try to rank for [keyword] without including [keyword] in the visible text of the page. Also, websites that rank without any backlinks are incredibly rare (unheard of for competitive terms).
Doubtless some factors change (eg Google may favor webpages with images for a query like "best hairstyle for men" but not for another query), but other factors still seem to apply to all queries (or at least 95%+).
Thoughts?
-
Were they referencing RankBrain in their article? The statement sounds similar to explanations of what RankBrain is and how it impacts search. It does seem like a bit of hyperbole, but I see their point and agree with it to a certain extent. The purpose of a machine learner is to continuously improve without human intervention, so that gains are made while you sleep. It's my understanding that RankBrain does this based on feedback from users. It's a natural fit for the complexity of search, and would result in a continuously changing algorithm.
I do see a lot of websites ranking without backlinks. Try any local home services query - they're mostly propped up by citations, which are a little different from your standard backlink.
-
Agreed, I also see their point to some extent. I think Google's ranking factors are much more dynamic than they used to be. Google's rankings are also becoming far more intuitive and less metrics-driven (eg keyword density). SEO studies are increasingly having trouble explaining Google's algorithm. For example, we all know that social shares and engagement metrics correlate strongly with Google rankings, but nobody is quite sure what the mechanism behind that correlation is.
"Likewise, if you're a local plumber and the top results have 1 or 2 referring domains but great content, ranking is going to take more focus on quality onsite than the car hire example."
Or, maybe they are ranking in spite of not having links, and if you get great content + 5 links you'll be #1...hard to say!
"what it takes to rank in each one will require different strengths and weaknesses"
Agreed, because Google is getting close to actually measuring what the searcher wants. i.e. Google has some way of knowing (through user interaction data, maybe?) that a person searching for "hair styles 2016" wants a photo-heavy article, but a person searching for "barack obama policies" wants a long form text article. Yet, IMO, keyword in text and backlinks will be important factors in both cases.
-
I wouldn't say that I strictly agree with it but I do see their point.
The way I look at it is quite similar, though from a slightly different angle. For any given vertical, where you rank is entirely relative to the other sites presented in that query.
For example, if you're in the car hire industry and all of your competitors have incredible link profiles and passable onsite factors, then in your industry your link profile is going to be an important ranking factor.
Likewise, if you're a local plumber and the top results have 1 or 2 referring domains but great content, ranking is going to take more focus on quality onsite than the car hire example.
Now, obviously the approach to outranking another site shouldn't be to just copy what they do; you should be exploiting their weak points. But no amount of great content is going to push your car hire company above a competitor with 2,000 legitimately quality referring domains!
What this all means is that while Google may not be directly measuring each vertical differently, what it takes to rank in each one will require different strengths and weaknesses. This is conjecture so take from it what you will; it's mostly just my 2c and viewpoint on the whole thing.
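To make the "ranking is relative" point concrete, here's a deliberately toy sketch (this is not Google's algorithm; the factor names, weights, and numbers are all made-up assumptions). In a vertical where competitors hold thousands of referring domains, the link term dominates the comparison; in a vertical where everyone has one or two, the content term decides it:

```python
# Toy model only: two invented factors with arbitrary weights,
# to illustrate that the deciding factor depends on the competition.

def toy_score(site, weights):
    """Hypothetical weighted sum of made-up ranking signals."""
    return (weights["links"] * site["referring_domains"]
            + weights["content"] * site["content_quality"])

weights = {"links": 1.0, "content": 50.0}  # arbitrary assumption

# "Car hire"-style vertical: the competitor has a huge link profile.
car_hire = [
    {"name": "you",        "referring_domains": 50,   "content_quality": 10},
    {"name": "competitor", "referring_domains": 2000, "content_quality": 6},
]

# "Local plumber"-style vertical: 1-2 referring domains each.
plumber = [
    {"name": "you",        "referring_domains": 1, "content_quality": 9},
    {"name": "competitor", "referring_domains": 2, "content_quality": 5},
]

for vertical in (car_hire, plumber):
    ranked = sorted(vertical, key=lambda s: toy_score(s, weights),
                    reverse=True)
    print([s["name"] for s in ranked])
# Car hire: the 2,000-domain competitor wins despite weaker content.
# Plumber: better content wins because nobody has a link advantage.
```

Same weights in both verticals, yet the winning strategy differs purely because of who else is in the results - which is the point above about different strengths and weaknesses per vertical.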
-
It's marketing hyperbole.