Can a page be 100% topically relevant to a search query?
-
Today's YouMoz post, Accidental SEO Tests: When On-Page Optimization Ceases to Matter, explores the theory that there is an on-page optimization saturation point, "beyond which further on-page optimization no longer improves your ability to rank" for the keywords/keyword topics you are targeting. In other words, you can optimize your page for search to the point that it is 100% topically relevant to the query and its intent.
Do you believe there exists such a thing as a page that is 100% topically relevant? What are your thoughts regarding there being an on-page optimization saturation point, beyond which further on-page optimization no longer improves your ability to rank? Let's discuss!
-
I consider a 100% match theoretically possible at best. In my modest opinion, the visitor determines the relevancy of the landing page, and it is Google's noble job to serve the visitor a page that fits his needs. But then no page can fully satisfy everybody, because different visitors bring different search intentions to the same keyword.
When you achieve a high conversion rate on your page, you've probably written a very relevant page. So let the visitor truly find what he is looking for, and Google will notice....
-
Well said, Russ, especially for a "mathy" answer. I am curious, though, would this "ideal document" you describe have a specific word count?
-
Warning, mathy answer follows. This is a generic description of what is going on, not exact, but hopefully understandable.
Yes, there is some theoretical page that is 100% topically relevant if you had a copy of the "ideal document" produced by the topical relevancy model. This would not look like a real page, though. It would look like a jumble of words in ideal relation and distance to one another. However, most topic models are built using sampling, and, more importantly, the comparative documents used to determine the confidence level that your document's relevancy is non-random are also sampled. This means that there is some MoE (Margin of Error).
As you and your competitors approach 100% topical relevancy, that Margin of Error likely covers the difference. If you are 99.98% relevant and they are 99.45% relevant, but the MoE is 1%, then a topical relevancy system can't conclude with certainty that you are more relevant than they are.
At this point, the search model would need to rely on other metrics, like authority, over relevance to differentiate the two pages.
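To make that concrete, here is a minimal sketch of the tie-breaking logic described above. This is purely illustrative of the statistical argument, not Google's actual model; the score names, the authority values, and the flat 1% margin of error are all invented for the example.

```python
# Illustrative sketch: two pages whose relevancy scores fall within the
# margin of error are statistically indistinguishable, so the ranker
# falls back to a secondary signal (here, "authority").

def effectively_tied(score_a: float, score_b: float, margin_of_error: float) -> bool:
    """True when the score gap is smaller than the margin of error,
    i.e. neither page can be called more relevant with confidence."""
    return abs(score_a - score_b) < margin_of_error

def rank(page_a: dict, page_b: dict, margin_of_error: float) -> dict:
    """Prefer the more relevant page; break statistical ties on authority."""
    if effectively_tied(page_a["relevancy"], page_b["relevancy"], margin_of_error):
        key = "authority"
    else:
        key = "relevancy"
    return max(page_a, page_b, key=lambda p: p[key])

# The 99.98% vs. 99.45% example from the post, with a 1% MoE:
you   = {"name": "you",   "relevancy": 0.9998, "authority": 40}
rival = {"name": "rival", "relevancy": 0.9945, "authority": 55}
winner = rank(you, rival, margin_of_error=0.01)
# The relevancy gap (0.0053) is inside the MoE (0.01), so authority
# decides and the rival page wins despite being "less relevant".
```

The point of the sketch is just that past a certain saturation level, further relevancy gains disappear into the noise and other signals take over.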
-
With the pace at which things are changing, and with machine learning thrown into the ranking factors, I would say it's close to impossible to hold 100% topical relevancy for any meaningful period of time.
-
100% saturation is impossible to achieve while maintaining any semblance of value. Not only because any proper page inherently has navigation, internal linkage, and myriad other elements, but because to write content about a subject in that amount of detail, one would invariably need to write about sub-topics and related topics. It's just not feasible. But, and here's the kicker, you wouldn't want 100% saturation anyway.
Rich, dynamic content incorporates that which is related to it. Strong pages link out to others and keep visitors within their media cycle, if not funneling them further down. Good content holds information that is both detailed and general to its topic. I would say, at most, the highest saturation point that still remains within strong SEO and content optimization is about 85-90% when taking into account all page content, and even that's pushing it, really.
-
I would agree to a point. At its heart, Google probably uses some form of numerical score for a page as it relates to a query. If a page is a perfect match, it scores 100%. I would also suggest that attaining a perfect score is a virtual impossibility.
The scoring system, however, is dynamic. The page may be perfect for a particular query only at a particular point in time.
- Google's algorithm changes daily. What's perfect today may not be perfect tomorrow.
- Semantic search must be dynamic. If Google discovers a new Proof Term or Relevant Term related to the query, and the page in question doesn't contain that term, the page is no longer perfect.
These are only a couple of examples.
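The second bullet can be sketched in a few lines: if relevancy is modeled as coverage of the engine's known set of relevant terms, then the engine learning one new term is enough to knock a "perfect" page below 100%. The coverage metric and the term names here are invented for illustration, not how Google actually scores pages.

```python
# Hypothetical sketch: relevancy as the fraction of the query's known
# relevant terms that a page covers. Discovering one new relevant term
# makes yesterday's perfect page imperfect today.

def coverage(page_terms: set, relevant_terms: set) -> float:
    """Fraction of the query's known relevant terms the page covers."""
    if not relevant_terms:
        return 0.0
    return len(page_terms & relevant_terms) / len(relevant_terms)

page  = {"espresso", "grind", "tamp", "extraction"}
terms = {"espresso", "grind", "tamp", "extraction"}
print(coverage(page, terms))   # 1.0 -- a "perfect" score today

terms.add("pre-infusion")      # the engine learns a new relevant term
print(coverage(page, terms))   # 0.8 -- no longer perfect tomorrow
```

The page didn't change at all; the scoring target moved underneath it, which is the sense in which a perfect score is only perfect at a particular point in time.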
For practical purposes, the testing, research, etc. required to achieve a perfect score delivers diminishing returns at some point. The effort required to push a page from 95% to 100% isn't worth it, especially since Google's algorithm is a secret.
Sometimes good is good enough.