Can increasing website pages decrease domain authority?
-
Hello Mozzers!
Say there is a website with 100 pages and a domain authority of 25. If the number of pages on this website increases to 10,000, can that decrease its domain authority or affect it in any way?
-
I certainly think that gradually adding pages and focusing on quality will help. The problem is that the devil really is in the details. The size of your current site, the type of pages you currently index, your link profile, the type of niche/industry you're in... all of these things matter, to some degree. So, what works for one site might be a problem for another.
Easing into it is definitely going to mitigate your risks, and I think focusing on the most high-impact pages while leaving the other filters/sorts/etc. out of the index is a good idea. Whether this strategy is going to provide real value over time is the bigger question. Ultimately, I think internal search pages have been devalued a lot, even on reputable sites. I've been through this with a former client: they have a perfectly legitimate business model and provide good value to users, but Google sees them as a directory, and much of their index is necessarily search results. Over time, even though they've never been penalized, they've just seen a steady decline, because it takes more than that to rank now.
-
Thank you for your response, Peter. I have been thinking of an alternative, and this is what I have come up with:
1. We noindex, follow all our attribute filter pages
2. We slowly add landing pages built from filter combinations, manually and one at a time, with good content, and build their PA over time.
3. These new pages would contain **index, follow** tags and would also be listed in our sitemap.xml file
This way we won't have numerous automated landing pages, but rather a few targeted landing pages with links and content (sketched below).
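To make that concrete, here is a rough sketch of what I have in mind: the robots tags for each page type, plus a sitemap entry (the URL is a placeholder, not our actual site):

```html
<!-- Attribute-filter pages: kept out of the index, but internal link equity still flows -->
<meta name="robots" content="noindex, follow">

<!-- Hand-built landing pages: indexable and followed -->
<meta name="robots" content="index, follow">

<!-- sitemap.xml: only the curated landing pages are listed (placeholder URL) -->
<url>
  <loc>https://www.example.com/landing/red-leather-sofas</loc>
</url>
```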
Your opinion on this approach would be much appreciated.
-
Generally speaking, Google's view of internal search pages has dimmed over time, and they tend to treat them as thin content. It used to be common practice to use those pages to rank for category and sub-category terms, but Panda has changed a lot of that.
That's not to say it never works, or that if you add enough unique content, you can't create value. Given that your DA is low, though, and it sounds like all the new content you'd be rolling out would effectively be search results within your own site, I'd be cautious.
-
Once I get to the point where I would not do something on my own sites, I am unable to give further advice.
If you think this is a good thing to do, try it and see what happens. I don't.
-
What do you think about adding noindex, nofollow to all the layered navigation filters and slowly adding one landing page at a time, with good content, and building their page authority? Since the filters will be noindex, nofollow, these landing pages can be submitted to Google through the XML sitemap. Would this be a better strategy in your opinion?
How would you handle this if it were your project?
-
I have no opinion on this. It might work, it might not. I would not do this on any of my sites.
I think you are starting with a small amount of seed content and spreading it very thin through many pages. I can't imagine how this would produce a good experience for users.
-
Thank you for your valuable input, guys. This is really helping me clarify some concepts. I would like to refer to the real-life case that this discussion is about.
Our strategy is to use our layered navigation filters to create numerous landing pages. As you can imagine, combinations of these filters can create many, many pages. We are using the following tactics to make them search-engine friendly:
1. Using noindex, nofollow tags in the head if more than one option is selected for the same filter, to control crawl depth
2. Using noindex, nofollow tags in the head if more than two different filters are selected, to control crawl depth
3. Adding unique content to these pages
4. Creating a unique meta title/description for each page
5. Using rel="next"/rel="prev" for pagination, to consolidate link equity and preferably get the first page in the series indexed (see the sketch after this list)
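For clarity, here is a rough sketch of what the head of these pages would contain under those rules (the URLs and title are placeholders, not our actual Magento output):

```html
<!-- Page 2 of an allowed combination (e.g. color=red + material=leather):
     indexable, with rel="prev"/"next" tying the paginated series together -->
<title>Red Leather Sofas | Example Store</title>
<link rel="prev" href="https://www.example.com/sofas?color=red&amp;material=leather">
<link rel="next" href="https://www.example.com/sofas?color=red&amp;material=leather&amp;page=3">

<!-- A blocked combination (two values of one filter, or more than two filters):
     kept out of the index entirely -->
<meta name="robots" content="noindex, nofollow">
```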
As you can imagine, even with more-than-two-filter combinations set to noindex, nofollow, we still get many, many pages. After reading your comments, I am wondering whether this strategy will work for us, especially since we are a new site with a DA of 25. Do you think this strategy will work for us? Do you suggest an alternative?
-
Yeah, one thing I think is critically important is to try to divorce yourself from your own creation and think in terms of what Google finds valuable. We all think our sites are the greatest and every page we create is a masterpiece, even when we'd ignore or trash the same kind of page on someone else's site. When you're talking about a 100X increase, brutal honesty with yourself is very important.
-
I have added large numbers of pages to websites, and the result has often been a decrease in rankings, even when the content was golden. Why? As Dr. Pete says, the authority and link value of your site get spread across a larger number of pages.
If you add an enormous number of pages and have a puny number of links delivering spider activity to your site, Google will start to forget your deep pages if they are spidered infrequently. If you add 10,000 new pages, then you'd better have a few dozen permanent links of about PR4 or PR5 hitting nodes located deep within that mass of 10,000 pages. That will force a constant stream of spiders deep into those pages, and they will have to chew their way out to escape, indexing pages as they go. Remove those links and the stream of spiders stops, and Google might forget those pages if they are on a site of less than moderate strength.
The only way you get your rankings back after adding a huge mass of pages is if your content is engaged with, shared, and linked to enough to earn them back. Every page on your site adds a bit of weight; it has to be supported with authority.
Huge, powerful sites, even those with lots of very high-quality, highly engaged content, can be hit by Panda. I had a bunch of republished and thin pages on one of my sites, and it lost rankings in a Panda update. I deleted many of those pages, noindexed others, and the rankings came back.
-
I'd have to ask how we specifically measure DA in this case, but the broader answer is "Yes", it can absolutely decrease your authority. There was a time when more pages just meant more opportunities to rank, but that time is long gone, IMO. Even before Panda, there was an increasing risk of dilution: your authority (and specifically your PageRank) can only spread so thin. If your link profile is relatively weak and you expand by 100X, each page is going to get less and less authority. You have more theoretical opportunities to rank, but each opportunity has a much smaller chance. Of course, it's more complex than that, but that's the bottom line.
After Panda, the calculation changed a lot. Now you're not only diluting your content, but if it's thin enough, you risk Google taking action that could harm your entire site. So EGOL's simple question is critically important. Also, note that "unique" does not mean valuable in Google's eyes (or in the eyes of search users). It's easy to string words together to create something unique, but if that's not adding value, it may still be seen as "thin".
-
We are using our layered navigation in Magento to create landing pages. These landing pages contain filtered products with unique content. I was wondering: can having this many pages dilute domain authority?
-
> If the number of pages on this website increases to 10,000, can that decrease its domain authority or affect it in any way?
Are you adding gold or crap? Gold? Crap?