Can a page be 100% topically relevant to a search query?
-
Today's YouMoz post, Accidental SEO Tests: When On-Page Optimization Ceases to Matter, explores the theory that there is an on-page optimization saturation point, "beyond which further on-page optimization no longer improves your ability to rank" for the keywords/keyword topics you are targeting. In other words, you can optimize your page for search to the point that it is 100% topically relevant to the query and its intent.
Do you believe there exists such a thing as a page that is 100% topically relevant? What are your thoughts regarding there being an on-page optimization saturation point, beyond which further on-page optimization no longer improves your ability to rank? Let's discuss!
-
I consider a 100% match to be possible only in theory. In my modest opinion, the visitor determines the relevancy of the landing page, and it is Google's noble job to serve the visitor a page that fits their needs. But in that case, no page can fully satisfy everybody, because different visitors bring different search intentions to the same keyword.
When you achieve a high conversion rate on your page, you've probably written a very relevant page. So let the visitor truly find what they are looking for, and Google will notice.
-
Well said, Russ, especially for a "mathy" answer. I am curious, though, would this "ideal document" you describe have a specific word count?
-
Warning: mathy answer follows. This is a generic description of what is going on, not exact, but hopefully understandable.
Yes, there is some theoretical page that is 100% topically relevant, if you had a copy of the "ideal document" produced by the topical relevancy model. This would not look like a real page, though. It would look like a jumble of words in ideal relation and distance to one another. However, most topic models are built using sampling and, more importantly, the comparative documents used to determine the confidence level that your document's relevancy is non-random are also sampled. This means there is some MoE (Margin of Error).
As you and your competitors approach 100% topical relevancy, that Margin of Error likely covers the difference. If you are 99.98% relevant and they are 99.45% relevant, but the MoE is 1%, then a topical relevancy system can't conclude with certainty that you are more relevant than they are.
At this point, the search model would need to rely on other metrics, like authority, over relevance to differentiate the two pages.
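To make the MoE idea concrete, here is a minimal, hypothetical sketch; nothing like Google's real model. It scores two pages against a sampled "ideal" term distribution using cosine similarity, then treats the scores as a tie whenever they fall within the sampling margin of error. The topic vector, the pages, and the 1% MoE are all illustrative assumptions.

```python
from collections import Counter
from math import sqrt

def cosine_similarity(doc_counts, topic_counts):
    # Cosine similarity between a page's term counts and the topic's
    # "ideal" term distribution. Real topic models are far richer; this
    # only illustrates that relevance reduces to a numeric score.
    shared = set(doc_counts) & set(topic_counts)
    dot = sum(doc_counts[t] * topic_counts[t] for t in shared)
    norm_doc = sqrt(sum(c * c for c in doc_counts.values()))
    norm_topic = sqrt(sum(c * c for c in topic_counts.values()))
    return dot / (norm_doc * norm_topic) if norm_doc and norm_topic else 0.0

def effectively_tied(score_a, score_b, moe):
    # If two scores differ by less than the sampling margin of error,
    # the model cannot call one page more relevant than the other.
    return abs(score_a - score_b) < moe

# Hypothetical "ideal document" term distribution for one topic.
topic = Counter({"espresso": 5, "grind": 3, "crema": 2, "tamp": 2})
page_a = Counter({"espresso": 5, "grind": 3, "crema": 2, "tamp": 1})
page_b = Counter({"espresso": 4, "grind": 3, "crema": 2, "tamp": 2})

score_a = cosine_similarity(page_a, topic)  # ~0.988
score_b = cosine_similarity(page_b, topic)  # ~0.994

MOE = 0.01  # assumed 1% margin of error from sampling
if effectively_tied(score_a, score_b, MOE):
    print("Within the MoE: fall back to other signals, e.g. authority.")
```

Run as written, both pages land within the assumed 1% MoE of each other, so the sketch has to fall back to non-relevancy signals like authority to break the tie.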
-
With the pace at which things are changing, and with machine learning thrown into the ranking factors, I would say it's close to impossible to maintain 100% topical relevancy for any meaningful period of time.
-
100% saturation is impossible to achieve while maintaining any semblance of value. Not only because any proper page inherently has navigation, internal linkage, and myriad other elements, but because to write content about a subject in that amount of detail, one would invariably need to write about sub-topics and related topics. It's just not feasible. But, and here's the kicker, you wouldn't want 100% saturation anyway.
Rich, dynamic content incorporates that which is related to it. Strong pages link out to others and keep visitors moving through the site's content, ideally funneling them deeper. Good content covers a topic with information that is both detailed and general. I would say, at most, the highest saturation point that still remains within strong SEO and content optimization is about 85-90% when taking into account all page content - and even that's pushing it, really.
-
I would agree to a point. At its heart, Google probably uses some form of numerical score for a page as it relates to a query. If a page is a perfect match, it scores 100%. I would also suggest that attaining a perfect score is a virtual impossibility.
The scoring system, however, is dynamic. The page may be perfect for a particular query only at a particular point in time.
- Google's algorithm changes daily. What's perfect today may not be perfect tomorrow.
- Semantic search must be dynamic. If Google discovers a new Proof Term or Relevant Term related to the query, and the page in question doesn't contain that term, the page is no longer perfect.
These are only a couple of examples.
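To illustrate that second point with a toy example (purely hypothetical; Google's actual terms and weights are unknown): score a page as the weighted share of the query's known Relevant Terms it contains, then let the model discover a new term the page lacks, and a once "perfect" page slips below 100% overnight.

```python
def relevance_score(page_terms, relevant_terms):
    # Score = weighted fraction of the model's Relevant Terms the page
    # contains. A stand-in for whatever numeric score Google computes.
    total = sum(relevant_terms.values())
    covered = sum(w for term, w in relevant_terms.items() if term in page_terms)
    return covered / total if total else 0.0

page = {"mortgage", "rate", "fixed", "apr"}

# Terms (with assumed weights) the model associates with the query today.
model_today = {"mortgage": 3.0, "rate": 2.0, "fixed": 1.0, "apr": 1.0}
print(f"Today:    {relevance_score(page, model_today):.0%}")     # 100%

# Tomorrow the model discovers a new Relevant Term the page lacks.
model_tomorrow = {**model_today, "amortization": 1.5}
print(f"Tomorrow: {relevance_score(page, model_tomorrow):.0%}")  # 82%
```

The page didn't change at all; the model did, and the "perfect" score was gone.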
For practical purposes, the testing, research, and so on required to achieve a perfect score delivers diminishing returns at some point. The effort needed to push a page from 95% to 100% simply isn't worth it, especially since Google's algorithm is a secret.
Sometimes good is good enough.