Are clicks the ultimate factor that keeps a page locked in its position?
-
Hi all,
We know many factors contribute to making a page rank at a top position, somewhere in the top 5 results. I have seen some of our pages suddenly spike to those positions and get locked there. They have been receiving clicks too. Will they be dropped if they don't get the expected clicks? I think many factors contribute to making a page rank higher, but clicks are the one factor that makes a page consistently rank at its best position. What do you say?
Thanks
-
Hi,
Yes, you are right: CTR (click-through rate) is an important factor in SEO too. If users click on your site's link in the SERPs, Google will take it as a signal that your page is relevant, and you will get a better rank.
Thanks
Related Questions
-
Sizable decrease in number of pages indexed, but no drop in clicks, impressions, or ranking
Hi everyone, I've run into a worrying phenomenon in GSC and I'm wondering if anyone has come across something similar. Since August, I have seen a steady decline in the number of pages indexed from my site, from 1.3 million down to about 800,000 in two months. Interestingly, my clicks/impressions continue to increase gradually (at the same pace they have been for months) and I see no other negative side effects resulting from this drop in coverage. In total I have 1.2 million URLs that fall into one of three categories: "Crawled - currently not indexed", "Crawl anomaly", and "Discovered - currently not indexed". Some other notes - all of my valid, error, and excluded pages are https://www. , so I don't believe there is an issue with different versions of the same site being submitted. Also, my rankings have not changed, so I tentatively believe that this is unrelated to the Medic Update. If anyone else has experienced this or has any insight into the problem, I would love to know. Thanks!
Algorithm Updates | | Jason-Reid0 -
Noindex follow on checkout pages in 2017
Hi,
My website really consists of 2 separate sites. Product site:
• Website with product pages.
• These product pages have SEO-optimised content. Booking engine & checkout site:
• When a user clicks 'Book' on one of the product pages on the aforementioned product site, they go to a separate website which is a booking engine and checkout.
• These pages are not quality, SEO-optimised content; they only perform the function of booking and buying. Q1) Should I set 'noindex, follow' via the meta tag on all pages of the 'Booking engine and checkout' site? Q2) Should I add anything to the 'Book' buttons on the product site? I am hoping all this will somehow help concentrate the SEO juice onto the product site's pages by declaring the booking engine and checkout site's pages to be 'not of any content value'.
Algorithm Updates | | DGAU0 -
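For what it's worth, here is a minimal sketch of what Q1 and Q2 could look like in markup. The robots directive is the standard one; the button URL and product ID are made-up examples, not anything from the poster's site.

```html
<!-- Q1: in the <head> of every page on the booking engine / checkout site.
     "noindex" keeps the page out of Google's index, while "follow" still
     lets crawlers follow (and pass equity through) the page's links. -->
<meta name="robots" content="noindex, follow">

<!-- Q2: a hypothetical 'Book' button on a product page. rel="nofollow"
     hints that no link equity should be passed to the checkout domain. -->
<a href="https://booking.example.com/checkout?product=123" rel="nofollow">Book</a>
```

Whether nofollow on the buttons is worthwhile is debatable; the noindex, follow tag on the checkout pages is the piece that most directly matches the goal described above.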
Google Adding / Manipulating Page Meta Titles?
We have a client who is experiencing some heavy Google modification to the title tags being displayed on the search engine. It is adding "- 0 Reviews" to an ecommerce site. Obviously a bad start. There were no instances of these keywords anywhere on any of these pages, header tag or otherwise (on only a handful of the affected pages there was a single commented-out image with an alt tag "0 reviews" - but it was commented out and has since been removed). We have attempted to rewrite the title multiple times; Google will modify the title but still include the non-relevant addition. Has anyone ever experienced anything like this?
Algorithm Updates | | Spindle0 -
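One thing worth auditing in a case like the one above: Google sometimes pulls review counts from structured data rather than visible text. This is a hypothetical example (made-up product name and values) of the kind of platform-generated markup that would advertise zero reviews to search engines even when nothing appears on the page:

```html
<!-- Hypothetical: if an ecommerce template emits review markup like this
     even when no reviews exist, it could explain a "- 0 Reviews" suffix.
     Removing or conditionally suppressing it would be the fix to test. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "0",
    "reviewCount": "0"
  }
}
</script>
```

This is only a plausible source to check, not a confirmed cause; the same audit applies to microformat or RDFa review markup the platform may inject.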
Do we take an SEO hit for having multiple URLs on an infinite-scroll page vs. a site with many pages/URLs? If we do take a hit, quantify the hit we would suffer.
We are redesigning a preschool website which has over 100 pages. We are looking at 2 options and want to make sure we meet the best user experience and SEO. Option 1 is to condense the site into perhaps 10 pages and window-shade the content. For instance, on the curriculum page there would be an overview, and each age-group program would open via window shade. Option 2 is to have an overview, and then each age program links to its own page. Do we lose out on SEO if there are not unique URLs? Or is there a way, using meta tags or other programming, to have the same effect?
Algorithm Updates | | jgodwin0 -
Page details in Google Search
I noticed this morning a drop in the SERPs for a couple of my main keywords. And even though this is a little annoying, the more pressing matter is that Google is not displaying the meta title I have specified for the majority of my site's pages, despite one being specified and knowing my site has them in place. Could this sudden change to not using my specified title be the cause of the drop, and why would Google not be displaying the titles in the first place, when they are there to be used? The title currently being displayed in the SERPs is not anything that has been specified in the past or from the latest crawl. Any insight would be appreciated. Tim
Algorithm Updates | | TimHolmes0 -
Does Schema.org markup create a conflict with Power Reviews' standard microformat markup for e-commerce product pages?
Does anyone have experience implementing Schema.org markup on e-commerce websites that are already using Power Reviews (now Bazaar)? In Google's documentation they say that it's generally not a good idea to use two types of semantic markup for the same item (reviews in this case), but I wouldn't think that there would be a problem marking up other items on the page with Schema, such as price, stock status, etc. Anyone care to provide some insight? Also, on a related topic, have you all noticed that Google has really dialed back the frequency with which they display rich snippets for product searches? A few weeks ago the site that I'm referring to had hundreds of products that were displaying snippets; now it seems that only about 10% (roughly) of them are still showing. Thanks, everybody.
Algorithm Updates | | BrianCC0 -
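A sketch of the approach described in the question above: mark up only the non-review product facts with Schema.org (shown here as JSON-LD), and leave the review data to the existing Power Reviews microformat so the same item isn't marked up twice. The product name, price, and values are made-up examples:

```html
<!-- Schema.org Product markup covering price and stock status only;
     no aggregateRating/review properties here, since the page's review
     markup is already handled by the Power Reviews microformat. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

This keeps the two vocabularies on disjoint properties, which is the scenario Google's duplicate-markup caution is meant to avoid.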
When did Google include display results per page into their ranking algorithm?
It looks like the change took place approx. 1-2 weeks ago. Example: A search for "business credit cards" with search settings at "never show instant results" and "50 results per page", the SERP has a total of 5 different domains in the top 10 (4 domains have multiple results). With the slider set at "10 results per page", there are 9 different domains with only 1 having multiple results. I haven't seen any mention of this change, did I just miss it? Are they becoming that blatant about forcing as many page views as possible for the sake of serving more ads?
Algorithm Updates | | BrianCC0 -
Server Down for Few Hours went from Page 1 to Page 6?
We were on Page 1 - our server went down for about 4-6 hours and then we dropped to page 6. Would the server being down for this amount of time affect our position? Any advice would be much appreciated.
Algorithm Updates | | webdesigncwd0