Can increasing website pages decrease domain authority?
-
Hello Mozzers!
Say there is a website with 100 pages and a domain authority of 25. If the number of pages on this website increases to 10,000, can that decrease its domain authority or affect it in any way?
-
I certainly think that gradually adding pages and focusing on quality will help. The problem is that the devil really is in the details. The size of your current site, the type of pages you currently index, your link profile, the type of niche/industry you're in... all of these things matter, to some degree. So, what works for one site might be a problem for another.
Easing into it is definitely going to mitigate your risks, and I think focusing on the most high-impact pages while leaving the other filters/sorts/etc. out of the index is a good idea. Whether this strategy is going to provide real value over time is the bigger question. Ultimately, I think internal search pages have been devalued a lot, even on reputable sites. I've been through this with a former client - they have a perfectly legitimate business model and provide good value to users, but Google sees them as a directory, and much of their index is necessarily search results. Over time, even though they've never been penalized, they've seen a steady decline, because it takes more than that to rank now.
-
Thank you for your response, Peter. I have been thinking of an alternative, and this is what I have come up with:
1. We noindex, follow all our attribute filters.
2. We slowly add landing pages using filter combinations manually, one at a time, with good content, and build their PA over time.
3. These new pages would contain **index, follow** tags and would also be listed in our sitemap.xml file.
This way, we won't have numerous automated landing pages, but rather a few targeted landing pages with links and content.
Your opinion on this approach would be much appreciated.
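For reference, the tag setup described in steps 1-3 (a sketch assuming standard HTML pages, with the comments marking which step each tag belongs to) would sit in each page's head:

```html
<!-- Step 1: automated attribute-filter page - keep it out of the index,
     but let crawlers follow its links to products and other pages -->
<meta name="robots" content="noindex, follow">

<!-- Step 3: hand-built landing page - indexable, and also listed in sitemap.xml -->
<meta name="robots" content="index, follow">
```

Worth noting: `index, follow` is the default behavior and technically redundant, so for the curated landing pages it's the sitemap.xml listing and internal links that actually do the work of getting them discovered.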
-
Generally speaking, Google's view of internal search pages has dimmed over time, and they tend to treat them as thin content. It used to be common practice to use those pages to rank for category and sub-category terms, but Panda has changed a lot of that.
That's not to say it never works, or that if you add enough unique content, you can't create value. Given that your DA is low, though, and it sounds like all the new content you'd be rolling out would effectively be search results within your own site, I'd be cautious.
-
Once I get to the point where I would not do something on my own sites, I am unable to give further advice.
Try it and see what happens if you think this is a good thing to do. I don't.
-
What do you think about adding noindex, nofollow to all the layered navigation filters and slowly adding one landing page at a time with good content, building their page authority? Since the filters will be noindex, nofollow, these landing pages can be submitted to Google through the XML sitemap. Would this be a better strategy, in your opinion?
How would you handle this if it were your project?
-
I have no opinion on this. It might work, it might not. I would not do this on any of my sites.
I think you are starting with a small amount of seed content and spreading it very thin through many pages. I can't imagine how this would produce a good experience for users.
-
Thank you for your valuable input, guys. This is really helping me clarify some concepts. I would like to refer to the real-life case that this discussion is about.
Our strategy is to use our layered navigation filters to create numerous landing pages. As you can imagine, a combination of these filters can create many, many pages. We are using the following tactics to make them search engine friendly:
1. Using nofollow, noindex tags in the header if more than one option is selected for the same filter, to control crawl depth.
2. Using nofollow, noindex tags in the header if more than two different filters are selected, to control crawl depth.
3. Adding unique content to these pages.
4. Creating a unique meta title/description for each page.
5. Using rel next/prev for pagination, to consolidate link equity and preferably index the first page in the series.
As you can imagine, even with the nofollow, noindex condition on pages with more than two filter combinations, we get many, many pages. After reading your comments, I am wondering whether this strategy will work for us, especially since we are a new site with 25 DA. Do you think it will? Would you suggest an alternative?
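For what it's worth, the rel next/prev markup from point 5 (a sketch with a hypothetical URL) goes in the head of each page in the paginated series:

```html
<!-- Page 2 of a hypothetical paginated filter result -->
<link rel="prev" href="https://www.example.com/shoes?page=2">
<link rel="next" href="https://www.example.com/shoes?page=4">
```

The first page in the series carries only a rel="next" link (there is no previous page), which is part of why it tends to be the page that gets indexed.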
-
Yeah, one thing I think is critically important is to try to divorce yourself from your own creation and think in terms of what Google finds valuable. We all think our sites are the greatest and every page we create is a masterpiece, even when we'd ignore or trash the same kind of page on someone else's site. When you're talking about a 100X increase, brutal honesty with yourself is very important.
-
I have added large numbers of pages to websites and the result of that has often been a decrease in rankings even if the content was golden. Why? As Dr. Pete says... the authority and link value of your site gets spread out into a larger number of pages.
If you add an enormous number of pages and have a puny number of links delivering spider activity to your site, Google will start to forget your deep pages if they are spidered infrequently. If you add 10,000 new pages, then you had better have a few dozen permanent links of about PR4 or PR5 hitting nodes located deep within that mass of 10,000 pages. That will force a constant stream of spiders deep into those pages, and they will have to chew their way out to escape, indexing pages as they go. Remove those links and the stream of spiders stops, and Google might forget those pages if they are on a site of less than moderate strength.
The only way you get your rankings back after adding a huge mass of pages is if your content is engaged with, shared, and linked to enough to earn it back. Every page on your site adds a bit of weight; it has to be supported with authority.
Huge powerful sites, even those with lots of very high quality, highly engaged content, can be hit by Panda. I had a bunch of republished and thin pages on one of my sites and it lost rankings in a Panda update. I deleted lots of those pages and noindexed others and rankings came back.
-
I'd have to ask about how we specifically measure DA in this case, but the broader answer is "Yes", it can absolutely decrease your authority. There was a time when more pages just meant more opportunities to rank, but that time is long gone, IMO. Even before Panda, there was an increasing risk of dilution - your authority (and even specifically your PageRank) can only spread so thin. If your link profile is relatively weak and you expand by 100X, each page is going to get less and less authority. You have more theoretical opportunities to rank, but each opportunity has a much smaller chance. Of course, it's more complex than that, but that's the bottom line.
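The dilution point can be made concrete with some toy arithmetic (my illustration, not how PageRank is actually computed): if you treat link equity as a fixed pool split evenly across indexed pages, a 100X page increase leaves each page with 1% of its former share.

```python
def equity_per_page(total_equity: float, page_count: int) -> float:
    """Toy model: a fixed pool of link equity divided evenly across pages."""
    return total_equity / page_count

# Hypothetical site: same link profile, before and after a 100X expansion.
before = equity_per_page(1000.0, 100)     # 10.0 units per page
after = equity_per_page(1000.0, 10_000)   # 0.1 units per page
print(f"each page keeps {after / before:.0%} of its former equity")
```

Real link equity flows through the internal link graph rather than dividing evenly, so deep pages typically fare even worse than this flat split suggests.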
After Panda, the calculation changed a lot. Now, you're not only diluting your content, but if it's thin enough, you risk Google taking action that could harm your entire site. So, EGOL's simple question is critically important. Also, note that "unique" does not mean valuable in Google's eyes (or search users'). It's easy to string words together to create something unique, but if that's not adding value, it may still be seen as "thin".
-
We are using our layered navigation in Magento to create landing pages. These landing pages contain filtered products with unique content. I was wondering whether having many pages can cause dilution of domain authority.
-
If the number of pages on this website increases to 10,000 can that decrease its domain authority or affect it in any way?
Are you adding gold or crap? Gold? Crap?