Can a page be 100% topically relevant to a search query?
-
Today's YouMoz post, Accidental SEO Tests: When On-Page Optimization Ceases to Matter, explores the theory that there is an on-page optimization saturation point, "beyond which further on-page optimization no longer improves your ability to rank" for the keywords/keyword topics you are targeting. In other words, you can optimize your page for search to the point that it is 100% topically relevant to the query and its intent.
Do you believe there exists such a thing as a page that is 100% topically relevant? What are your thoughts regarding there being an on-page optimization saturation point, beyond which further on-page optimization no longer improves your ability to rank? Let's discuss!
-
I consider a 100% match to be possible only in theory. In my modest opinion, the visitor determines the relevancy of the landing page, and it is Google's noble job to serve the visitor a page that fits their needs. But no page can fully satisfy everybody, because the same keyword can carry different search intents.
When your page achieves a high conversion rate, you've probably written a very relevant page. So let the visitor truly find what they are looking for, and Google will notice...
-
Well said, Russ, especially for a "mathy" answer. I am curious, though: would this "ideal document" you describe have a specific word count?
-
Warning, mathy answer follows. This is a generic description of what is going on, not exact, but hopefully understandable.
Yes, there is some theoretical page that is 100% topically relevant: a copy of the "ideal document" produced by the topical relevancy model. This would not look like a real page, though; it would look like a jumble of words in ideal relation and distance to one another. However, most topic models are built using sampling, and, more importantly, the comparative documents used to determine the confidence that your document's relevancy is non-random are also sampled. This means there is some margin of error (MoE).
As you and your competitors approach 100% topical relevancy, that margin of error likely covers the difference. If you are 99.98% relevant, and they are 99.45% relevant, but the MoE is 1%, then a topical relevancy system can't conclude with certainty that you are more relevant than they are.
At this point, the search model would need to rely on other metrics, like authority, rather than relevance to differentiate the two pages.
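To make the margin-of-error point concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the scores, the 1% MoE, and the comparison function are invented for illustration, and no real relevancy model works this simply.

```python
# Hypothetical illustration of the margin-of-error argument above.
# Scores and MoE are invented; real relevancy models are not public.

def compare_relevancy(score_a: float, score_b: float, moe: float) -> str:
    """Compare two topical relevancy scores given a sampling margin of error."""
    if abs(score_a - score_b) <= moe:
        # The gap is within the noise: relevance cannot break the tie,
        # so other signals (authority, links, etc.) must decide.
        return "tie -- fall back to other ranking signals"
    return "A is more relevant" if score_a > score_b else "B is more relevant"

# The numbers from the comment above: 99.98% vs. 99.45% with a 1% MoE.
print(compare_relevancy(0.9998, 0.9945, moe=0.01))  # tie -- fall back to other ranking signals
print(compare_relevancy(0.9998, 0.9800, moe=0.01))  # A is more relevant
```

In the first call the 0.53-point gap sits inside the 1% margin of error, so relevance alone can't rank the two pages; in the second, the gap is wide enough to decide.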
-
With the pace at which things are changing, and with machine learning thrown into the ranking factors, I would say it's close to impossible to maintain 100% topical relevancy for any meaningful period of time.
-
100% saturation is impossible to achieve while maintaining any semblance of value. Not only because any proper page inherently has navigation, internal linkage, and myriad other elements, but because to write content about a subject in that amount of detail, one would invariably need to write about sub-topics and related topics. It's just not feasible. But, and here's the kicker, you wouldn't want 100% saturation anyway.
Rich, dynamic content incorporates what is related to it. Strong pages link out to others, keeping visitors within their media cycle, if not moving them further down the funnel. Good content holds information that is both detailed and general to a topic. I would say the highest saturation point that still fits within strong SEO and content optimization is about 85-90% when taking all page content into account - and even that's pushing it, really.
-
I would agree to a point. At its heart, Google probably uses some form of numerical score for a page as it relates to a query. If a page is a perfect match, it scores 100%. I would also suggest that attaining a perfect score is a virtual impossibility.
The scoring system, however, is dynamic. The page may be perfect for a particular query only at a particular point in time.
- Google's algorithm changes daily. What's perfect today may not be perfect tomorrow.
- Semantic search must be dynamic. If Google discovers a new Proof Term or Relevant Term related to the query, and the page in question doesn't contain that term, the page is no longer perfect (see the sketch below).
These are only a couple of examples.
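As a sketch of that second bullet: imagine a page scored by how many of the model's known relevant terms it covers. The scoring formula and the terms here are deliberately toy inventions, not Google's semantic model; the point is only that the page never changes, yet its score drops the moment the model learns a new term.

```python
# Toy model of the "new Relevant Term" bullet above: score a page as the
# fraction of the topic model's relevant terms it contains. The terms and
# the formula are invented for illustration only.

def relevancy(page_terms: set, model_terms: set) -> float:
    return len(page_terms & model_terms) / len(model_terms)

page = {"espresso", "grind", "pressure", "crema"}
model_today = {"espresso", "grind", "pressure", "crema"}
print(relevancy(page, model_today))     # 1.0 -- a "perfect" page today

# The model later discovers a new Relevant Term the page lacks:
model_tomorrow = model_today | {"extraction"}
print(relevancy(page, model_tomorrow))  # 0.8 -- no longer perfect tomorrow
```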
For practical purposes, the testing, research, and so on needed to achieve a perfect score delivers diminishing returns at some point. The effort required to push a page from 95% to 100% isn't worth the return, especially since Google's algorithm is a secret.
Sometimes good is good enough.