Are clicks the ultimate factor that keeps a page in its position?
-
Hi all,
We know many factors contribute to making a page rank at a top position, like somewhere in the top 5 results. I have seen some of our pages suddenly spike to those positions and lock in there. They have been receiving clicks too. Will they drop if they don't get the expected clicks? I think many factors contribute to making a page rank higher, but clicks are the one factor that keeps a page consistently at its best position. What do you say?
Thanks
-
Hi,
Yes, you are right: CTR (click-through rate) is an important factor in SEO too. If users click on your site's link in the SERPs, Google will infer that your page is relevant, and you will get a better rank.
Thanks
Related Questions
-
Do orphan pages take away link juice?
Hi, just wondering about this: do orphan pages take away any link juice? We have been creating a lot of them lately, only to link to them from external sites as landing pages on our site. So we are not linking to them from any part of our own website; we are just linking from other websites. Also, will they get any link juice if they are linked from our own blog posts? Thanks
Algorithm Updates | vtmoz
Need only tens of pages indexed out of hundreds: is robots.txt okay for Google?
Hi all, We have 2 subdomains with hundreds of pages, of which we only need 50 important pages to get indexed. Unfortunately, the CMS of these subdomains is very old and doesn't support deploying a "noindex" tag at the page level. So we are planning to block the entire sites via robots.txt and allow only the 50 pages we need. But we are not sure if this is the right approach, as Google has been suggesting we rely mostly on "noindex" rather than robots.txt. Please advise whether we can proceed with the robots.txt file. Thanks
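For what it's worth, a minimal sketch of the allow-list approach described above (the paths are hypothetical; Google's crawler follows the most specific matching rule, so an `Allow` for a path overrides the broader `Disallow: /`):

```
# robots.txt — block everything, then allow specific pages (hypothetical paths)
User-agent: *
Disallow: /
Allow: /important-page-1/
Allow: /important-page-2/
Allow: /products/featured/
```

One caveat consistent with Google's guidance mentioned above: robots.txt blocks crawling, not indexing, so a blocked URL can still appear in the index (without a snippet) if other sites link to it.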
Algorithm Updates | vtmoz
Duplicate website pages indexed, rankings dropped: does Google check the duplicate-domain association?
Hi all, Our duplicate website, which is used for testing new optimisations, got indexed, and we dropped in rankings. But I am not sure this is the exact reason, as it happened before too, and I didn't notice much of a drop in rankings then. I have also been told in the past that duplication would not really impact the original website, only the duplicate; I think that rule applies to third-party websites. But if our own domain has exact duplicate content, will Google know we own the duplicate website through some other association, like shared IP addresses, servers, etc.? I wonder how Google treats duplicate content on third-party domains versus our own domains. Thanks
Algorithm Updates | vtmoz
More or fewer pages linked from the homepage? Linking third-level pages from the homepage.
Hi Moz community, With the idea of preserving link juice, many websites stopped linking too many pages from the homepage. We even removed our third-level pages from our homepage, and we didn't notice much change in rankings. Recently I have read some SEO articles where experts suggested linking low-level pages from the homepage, which signals to Google that we value and prioritise those pages, not just the homepage and the very next level. This apparently helps with internal linking as well. Is this true? Should we link such low-level pages from the homepage? Which approach actually works? Thanks
Algorithm Updates | vtmoz
Should plural keyword variations get their own targeted pages?
I am in the middle of changing a website from targeting a single keyword on all pages to having each page target its own keyword/phrase. However, I'm a little conflicted about whether plural forms and other suffix (-ing) variations are different enough to deserve their own pages. SERPs show different results for each keyword searched. Also, relevancy reports score some of the keywords differently and some the same. Is it better to use these as secondary and tertiary keywords on the same page as the main keyword? See the example below:

OPTION A (use each for a different page):
Page 1 - Construction Fence
Page 2 - Construction Fences
Page 3 - Construction Fencing
Page 4 - Construction Site Fence
Page 5 - Construction Site Fences
Page 6 - Construction Site Fencing
...

OPTION B (use as variations on the same page):
Page 1 - Construction Fence, Construction Fences, Construction Fencing
Page 2 - Construction Site Fence, Construction Site Fences, Construction Site Fencing
...

Any help is greatly appreciated. Thanks!
Algorithm Updates | pac-cooper
Doorway Algorithm Update Affecting Location-Based Pages?
Hi all, I read this article concerning the doorway algorithm update - http://searchengineland.com/google-to-launch-new-doorway-page-penalty-algorithm-216974

This quote is what got my attention: "How do you know if your web pages are classified as a 'doorway page'? Google said to ask yourself these questions:
- Is the purpose to optimize for search engines and funnel visitors into the actual usable or relevant portion of your site, or are they an integral part of your site's user experience?
- Are the pages intended to rank on generic terms, yet the content presented on the page is very specific?
- Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site for the purpose of capturing more search traffic?
- Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality?
- Do these pages exist as an 'island'? Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?"

We utilize location-based pages for ourselves and a few clients too.

Example case: we attempt to rank for "keyword city/state" combinations. The keyword is often the same, such as "AC Repair" or "Physical Therapy", paired with a city/state combination such as "Tulsa, OK" or "Seattle, WA". The goal is to rank locally for those terms (NAP is applicable in some circumstances).

Does the above case classify as a doorway page? According to that definition, it does. However, these are businesses that service those areas. Some don't have a physical address there, but they do service the area (whether it be AC repair or website design).

Please advise me as to what a doorway page is exactly and whether my practice is in line. Thanks, Cole
Algorithm Updates | ColeLusby
301'ing an old (2000), high-PR domain with millions of pages indexed
Hi, I have an old (2000) domain with very high PR and 20M+ pages indexed by Google which... got AdSense banned. The domain has taken a few hits over the years from Penguin/Panda, but has come out pretty well compared to many competitors. The problem is that it was AdSense banned for invalid activity in the big AdSense account ban of 2012. No, I still have no idea what the issue was. I'd like to start using a new domain if I can safely get Google to pass along the PR and indexing love, so I can run AdSense and AdX. What are your initial thoughts? Am I out of my mind to try?
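For reference, a minimal sketch of a domain-wide 301 in Apache (.htaccess on the old domain), assuming the old site stays on a host where mod_rewrite is available; `newdomain.com` is a placeholder:

```
# .htaccess on the old domain — permanently redirect every URL
# to the same path on the new domain (newdomain.com is hypothetical)
RewriteEngine On
RewriteRule ^(.*)$ https://newdomain.com/$1 [R=301,L]
```

Mapping each old URL to its equivalent path (rather than everything to the new homepage) is generally what lets the indexed pages carry over.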
Algorithm Updates | comfortsteve
What's the correct format when you disavow a single page? With or without www?
Hi y'all. I can't seem to find an article on disavowing a single page. Do I use A, B, or submit both A and B? Example: A. http://disavowexample.com B. http://www.disavowexample.com Which one does Google prefer? I know for some pages I just find the canonical URL (which shows www), but I wanted your expert advice! Thanks
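For context, Google's disavow file is a plain-text list with one entry per line: a full URL disavows links from that exact page (so the www and non-www forms are distinct entries), while a `domain:` prefix disavows links from the whole domain. A minimal sketch using the example domain from the question:

```
# disavow.txt — one entry per line, lines starting with # are comments
# Disavow links from a single page (exact URL, so www vs non-www matters):
http://www.disavowexample.com/
http://disavowexample.com/
# Or disavow links from the entire domain in one line:
domain:disavowexample.com
```

When in doubt about which host variant carries the link, listing both URL forms (or falling back to `domain:`) covers it.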
Algorithm Updates | Shawn124