Can increasing website pages decrease domain authority?
-
Hello Mozzers!
Say there is a website with 100 pages and a domain authority of 25. If the number of pages on this website increases to 10,000, can that decrease its domain authority or affect it in any way?
-
I certainly think that gradually adding pages and focusing on quality will help. The problem is that the devil really is in the details. The size of your current site, the type of pages you currently index, your link profile, the type of niche/industry you're in... all of these things matter, to some degree. So, what works for one site might be a problem for another.
Easing into it is definitely going to mitigate your risks, and I think focusing on the most high-impact pages while leaving the other filters/sorts/etc. out of the index is a good idea. Whether this strategy is going to provide real value over time is the bigger question. Ultimately, I think internal search pages have been devalued a lot, even on reputable sites. I've been through this with a former client - they have a perfectly legitimate business model and provide good value to users, but Google sees them as a directory, and much of their index is necessarily search results. Over time, even though they've never been penalized, they've just seen a steady decline, because it takes more than that to rank now.
-
Thank you for your response, Peter. I have been thinking of an alternative, and this is what I have come up with:
1. We apply noindex, follow to all our attribute filters.
2. We slowly add landing pages built from filter combinations, manually, one at a time, with good content, and build their PA over time.
3. These new pages would carry **index, follow** tags and would also be listed in our sitemap.xml file.
This way we won't have numerous automated landing pages, but rather a few targeted landing pages with links and content (a rough sketch of the tags involved is below).
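For concreteness, here's a minimal sketch of what that would mean in the page head - the URLs are made-up placeholders, not our actual paths:

```html
<!-- Automated attribute-filter page (hypothetical URL /shoes?color=red):
     kept out of the index, but its links are still followed, so link
     equity keeps flowing through to the products -->
<meta name="robots" content="noindex, follow">

<!-- Hand-built landing page (hypothetical URL /red-running-shoes):
     index, follow is the default, but stated explicitly for contrast -->
<meta name="robots" content="index, follow">
```

Only the curated landing pages would then get `<url>` entries in sitemap.xml; the noindexed filter pages stay out of it entirely.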
Your opinion on this approach would be much appreciated.
-
Generally speaking, Google's view of internal search pages has dimmed over time, and they tend to treat them as thin content. It used to be common practice to use those pages to rank for category and sub-category terms, but Panda changed a lot of that.
That's not to say it never works, or that if you add enough unique content, you can't create value. Given that your DA is low, though, and it sounds like all the new content you'd be rolling out would effectively be search results within your own site, I'd be cautious.
-
Once I get to the point where I would not do something on my own sites, I am unable to give further advice.
Try it and see what happens if you think this is a good thing to do. I don't.
-
What do you think about adding noindex, nofollow to all the layered navigation filters and slowly adding one landing page at a time with good content, building their page authority? Since the filters will be noindex, nofollow, these landing pages can be submitted to Google through the XML sitemap. Would this be a better strategy in your opinion?
How would you handle this if it were your project?
-
I have no opinion on this. It might work, it might not. I would not do this on any of my sites.
I think you are starting with a small amount of seed content and spreading it very thin through many pages. I can't imagine how this would produce a good experience for users.
-
Thank you for your valuable input, guys. This is really helping me to clarify some concepts. I would like to refer to the real-life case that this discussion is about.
Our strategy is to use our layered navigation filters to create numerous landing pages. As you can imagine, combinations of these filters can create many, many pages. We are using the following tactics to make them search-engine friendly:
1. Using noindex, nofollow tags in the head if more than one option is selected for the same filter, to control crawl depth
2. Using noindex, nofollow tags in the head if more than two different filters are selected, to control crawl depth
3. Adding unique content to these pages
4. Creating a unique meta title/description for each page
5. Using rel="next"/rel="prev" for pagination, to consolidate link equity and preferably have the first page in the series be the one indexed
As you can imagine, even with pages combining more than two filters set to noindex, nofollow, we still get many, many pages. After reading your comments, I am wondering whether this strategy will work for us, especially since we are a new site with a DA of 25. Do you think it will? Would you suggest an alternative?
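For reference, here is a rough sketch of what tactics 1, 2, and 5 above translate to in the page head - the URLs and parameter names are invented for illustration, not our actual setup:

```html
<!-- Single-filter page (hypothetical URL /shoes?color=red), page 2 of a
     paginated series: indexable, with rel next/prev consolidating the series -->
<meta name="robots" content="index, follow">
<link rel="prev" href="http://www.example.com/shoes?color=red&amp;p=1">
<link rel="next" href="http://www.example.com/shoes?color=red&amp;p=3">

<!-- Page combining three different filters (hypothetical URL
     /shoes?color=red&size=9&brand=acme): over the two-filter cutoff
     from tactic 2, so it is set to noindex, nofollow -->
<meta name="robots" content="noindex, nofollow">
```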
-
Yeah, one thing I think is critically important is to try to divorce yourself from your own creation and think in terms of what Google finds valuable. We all think our sites are the greatest and every page we create is a masterpiece, even when we'd ignore or trash the same kind of page on someone else's site. When you're talking about a 100X increase, brutal honesty with yourself is very important.
-
I have added large numbers of pages to websites, and the result has often been a decrease in rankings even when the content was golden. Why? As Dr. Pete says... the authority and link value of your site gets spread out across a larger number of pages.
If you add an enormous number of pages and have a puny number of links delivering spider activity to your site, Google will start to forget your deep pages if they are spidered infrequently. If you add 10,000 new pages, then you'd better have a few dozen permanent links of about PR4 or PR5 hitting nodes located deep within that mass of 10,000 pages. That will force a constant stream of spiders deep into those pages, and they will have to chew their way out to escape, indexing pages as they go. Remove those links and the stream of spiders stops, and Google might forget those pages if they are on a site of less than moderate strength.
The only way you get your rankings back after adding a huge mass of pages is if your content is engaged, shared, and linked enough to earn it back. Every page on your site adds a bit of weight, and it has to be supported with authority.
Huge, powerful sites, even those with lots of very high-quality, highly engaged content, can be hit by Panda. I had a bunch of republished and thin pages on one of my sites, and it lost rankings in a Panda update. I deleted lots of those pages, noindexed others, and rankings came back.
-
I'd have to ask how we specifically measure DA in this case, but the broader answer is "Yes" - it can absolutely decrease your authority. There was a time when more pages just meant more opportunities to rank, but that time is long gone, IMO. Even before Panda, there was an increasing risk of dilution - your authority (and even specifically your PageRank) can only spread so thin. If your link profile is relatively weak and you expand by 100X, each page is going to get less and less authority. You have more theoretical opportunities to rank, but each opportunity has a much smaller chance. Of course, it's more complex than that, but that's the bottom line.
After Panda, the calculation changed a lot. Now, you're not only diluting your content, but if it's thin enough, you risk Google taking action that could harm your entire site. So, EGOL's simple question is critically important. Also, note that "unique" doesn't mean valuable in Google's eyes (or search users'). It's easy to string words together to create something unique, but if it's not adding value, it may still be seen as "thin".
-
We are using our layered navigation in Magento to create landing pages. These landing pages contain filtered products with unique content. I was wondering if having many pages can cause dilution of domain authority?
-
If the number of pages on this website increases to 10,000 can that decrease its domain authority or affect it in any way?
Are you adding gold or crap? Gold? Crap?