Anyone suspect that a site's total page count affects SEO?
-
I've been trying to figure out why so many websites rank higher than mine despite seemingly having far worse links. I've spent a lot of time researching and have read through all the general advice about what could be hurting my site's SEO, from page speed to H1 tags to broken links, plus all the usual on-page optimization advice, so the issue isn't obvious.
Looking at my competitors, they all seem to have far more pages on their sites than mine does. My site currently has 20 pages or so, while most of my competitors are well into the hundreds, so I'm wondering whether this could be part of the issue. I know Google has never officially said that page count matters, but does anyone suspect that it does, and that competing sites with more total pages might have an SEO advantage?
-
John Mueller of Google recently discussed this topic...
https://www.seroundtable.com/google-higher-page-count-seo-26633.html
Amazon has millions and millions of pages. They can be beaten for commercial queries by a 20-page site. If your 20 pages are bursting with some of the internet's best information on your topic, and visitors engage with your site and ask for it by name, then you can beat Amazon with 20 pages.
Related Questions
-
How long for google to de-index old pages on my site?
I launched my redesigned website 4 days ago. I submitted a new sitemap and also submitted the site for indexing in Search Console (Google Webmasters). When I Google my site, my new Open Graph settings come up correctly, but a lot of my old pages are definitely still indexed in Google. How long will it take for Google to drop, or "de-index", my old pages? Because of the way I restructured my website, a lot of the items are no longer available on my site; this is on purpose. I'm a graphic designer, and with the new change I removed many old portfolio items, as well as any references to web design, since I will no longer be offering that service. My site is the following: http://studio35design.com
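One quick way to track progress is to confirm what the removed URLs now return, since Google generally drops a page only after recrawling it and seeing that it is gone. A minimal sketch, assuming you have a list of the old URLs (the paths below are hypothetical placeholders, not real pages from the site):

from urllib.request import Request, urlopen
from urllib.error import HTTPError

# Hypothetical examples of removed pages - substitute the real old URLs.
old_urls = [
    "http://studio35design.com/old-portfolio-item",
    "http://studio35design.com/web-design",
]

for url in old_urls:
    try:
        status = urlopen(Request(url, method="HEAD")).status
    except HTTPError as e:
        status = e.code  # a 404 or 410 here is what lets Google drop the URL on its next crawl
    print(url, status)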
Algorithm Updates | rubennunez
-
How local is local SEO?
If I manage to get a client ranked for a localised organic search term at county level, for example "keyword - West Midlands" or "keyword - Hertfordshire", how highly will the website rank for the cities and districts within that county? I am going to give this a go, but I was wondering if anyone else has had any experience with this?
Algorithm Updates | Adnan.Hassan.Khan
-
How big is the effect of having your site hosted in the country you're targeting?
Other than having a ccTLD domain and assigning your target country in Google Webmaster Tools' "geotargeting" feature, how big is the effect of having your site hosted in the country you're targeting? Is it really necessary, or is it just a small signal? Thanks in advance! 🙂
Algorithm Updates | esiow2013
-
Test site is live on Google but it duplicates existing site...
Hello - my developer has just put up a test site that duplicates my existing site and is live on Google (the main URL is www.mydomain.com and he's put the test copy at www.mydomain.com/test/). "I've added /test/ to the disallowed URLs in robots.txt" is how he put it. So all of the site's URLs are replicated and live on Google with /test/ added, so that he can block them in robots.txt. In all other ways the test site duplicates all content, etc. (until I get around to making some tweaks next week, that is). Is this a bad idea, or should I be OK? The last thing I want is a duplicate content or some other Google penalty just because I'm tweaking an existing website! Thanks in advance, Luke
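For what it's worth, here is a minimal sketch of how that block could be verified from outside, assuming the rule the developer added looks like "Disallow: /test/" (the exact rule isn't quoted in the thread) and using the placeholder domain from the question:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.mydomain.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

# The /test/ copy should be blocked for crawlers while the real pages stay crawlable.
print(rp.can_fetch("Googlebot", "https://www.mydomain.com/test/some-page"))  # expect False
print(rp.can_fetch("Googlebot", "https://www.mydomain.com/some-page"))       # expect True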
Algorithm Updates | McTaggart
-
How Do I Optimize for Google's Video Search?
Hi everyone, I am looking at https://developers.google.com/webmasters/videosearch/schema and I don't fully understand it. Could someone please explain, step by step, what I have to do to optimize for Google video search? I.e. step 1 do this, step 2 do this. Thank you!
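By way of illustration only, a hedged sketch of schema.org VideoObject data expressed as JSON-LD (the linked page may show the same properties as on-page microdata instead); every value below is a made-up placeholder and the exact property set should be checked against that documentation:

import json

video = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Example video title",
    "description": "One or two sentences describing what the video covers.",
    "thumbnailUrl": "https://www.example.com/thumbnails/video.jpg",
    "uploadDate": "2013-05-01",
    "contentUrl": "https://www.example.com/videos/video.mp4",
    "duration": "PT2M30S",  # ISO 8601: 2 minutes 30 seconds
}

# The output would be embedded in the video's page inside a
# <script type="application/ld+json"> element.
print(json.dumps(video, indent=2))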
Algorithm Updates | jhinchcliffe
-
Ideas on why Pages Per Visit Dropped?
Week over week, our pages per visit continue to drop. Any ideas on where to look to diagnose this?
Algorithm Updates | Aggie
-
SEO Budgets, the million dollar question???
Hi All, I am currently looking to revamp my SEO strategy in line with Google's latest Panda and Penguin updates, and looking to appoint a new agency. With SEO changing so much over the years and so many players in the marketplace quoting all sorts, I simply need to determine: 1) the kind of money I need to be spending on my SEO, 2) what I should be getting for the money at different budget levels, and what I need to be focusing on in priority order (a top ten of sorts), and 3) whether I should be looking to increase or decrease my spend over the long term. I am only a small business with a turnover of about 50-80k and need to really cement my strategy so it works long term but also shows a steady return. I have one guy quoting $99 a month, one £250 and one £750, so you can probably see my problem. Thanks in advance.
Algorithm Updates | etsgroup
-
Why does Google say they have more URLs indexed for my site than they really do?
When I do a site search on Google (i.e. site:www.mysite.com), Google reports "About 7,500 results" -- but when I click through to the end of the results and choose to include omitted results, Google really has only 210 results for my site. I had an issue months back with a large number of URLs being indexed because of query strings and some other non-optimized technicalities - at that time I could see that Google really had indexed all of those URLs - but I've since implemented canonical URLs and fixed most (if not all) of my technical issues in order to get our index count down. At first I thought it would just be a matter of time for them to reconcile this, perhaps they were looking at cached data or something, but it's been months and the "About 7,500 results" figure just won't change even though the actual number of pages indexed keeps dropping! Does anyone know why Google would still be reporting a high index count that doesn't reflect what is currently indexed? Thanks!
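As an aside on the canonical fix mentioned above, a minimal sketch for spot-checking which canonical URL a page actually serves (the domain and query string below are placeholders, not real pages):

from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag found."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

def canonical_of(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

# A query-string variant should report the clean URL as its canonical.
print(canonical_of("https://www.mysite.com/some-page?ref=old-campaign"))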
Algorithm Updates | CassisGroup