Does adding lots of new content to a site at one time actually hurt you?
-
A client I spoke with today commented that he didn't want all of the new content we'd been working on added to the site at once, for fear that he would be penalized for flooding the site with new content. I don't have any strong data to confirm or refute that claim. Is there any truth to it?
-
I agree with all the colleagues above; I can't see how your website would be penalized just because lots of pages were uploaded at the same time.
However, adding too many pages too quickly may flag a site for a manual review. That said, this generally means adding something on the order of hundreds of thousands of links in a single night. Here is the related video from Matt Cutts:
Hope you find this useful!
-
It is a real estate site, and the content is a directory of the various condos available in their community. The pages are all unique and contain real, valuable content, so I don't think there will be any issues with content quality.
New content and blog posts are added to the site regularly. I think the client's concern comes from an old notion that content added infrequently but en masse may be seen as spammy.
-
I agree with Jesse. Earlier this year we added a new data-driven section to our website that included (believe it or not) 83,000 pages, all unique in content since the information is highly technical in nature. No associated penalties have resulted from this.
-
I agree with Jesse for the most part. I think the key is: what kind of content are we talking about? Adding tons of low-value, thin-content pages to a site all at once (or even gradually) is probably going to diminish the authority of the existing content. I do think that adding thousands of pages with no page authority to a site that contains pages with a decent amount of authority could, theoretically, dilute the authority of the existing pages, depending on the site architecture, internal linking, and the ratio of existing pages to new pages. However, I would expect this to be only temporary, and if the new content is of great quality, there should be nothing to worry about long term.
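To make that dilution idea concrete, here is a rough back-of-the-envelope sketch. It assumes a drastically simplified model in which a fixed pool of internal link equity is split evenly across every page on the site; real search engines do nothing this simple, and the page counts and equity values are purely illustrative, not taken from any actual site.

```python
# Toy illustration of the "authority dilution" argument above.
# Assumption: a fixed pool of internal link equity is shared evenly
# across all pages. This is a simplification for illustration only.

def equity_per_page(total_equity: float, page_count: int) -> float:
    """Share of internal link equity each page receives under an even split."""
    return total_equity / page_count

existing_pages = 100
new_pages = 5_000          # e.g., a large directory section added all at once
total_equity = 1.0         # normalized pool of internal link equity

before = equity_per_page(total_equity, existing_pages)
after = equity_per_page(total_equity, existing_pages + new_pages)

print(f"Per-page share before: {before:.5f}")
print(f"Per-page share after:  {after:.5f}")
print(f"Dilution factor: {before / after:.0f}x")
```

Under this toy model the per-page share shrinks sharply right after the new pages go live, which is the temporary effect described above; as the new pages accumulate their own links and authority, the pool grows and the effect washes out.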
-
Thanks, Jesse; that was my thought exactly. If anything, I see incrementally adding the content as the negative option, since it would lead to a less-than-complete user experience.
-
No truth to that whatsoever. That's weird paranoia.
If there were some sort of problem with the content itself, maybe. But there is no penalty for adding all of the new content at once.
I've done total site overhauls plenty of times, and they get indexed quickly with no penalties (although I will say the speed of indexing seems to be in flux, but I digress).
Don't let the client worry about this. Think about any website that initially launches: why would Google penalize that?
Hope this helps. Paranoia is often the toughest challenge when it comes to dealing with clients/site owners.