Does adding lots of new content on a site at one time actually hurt you?
-
When speaking with a client today, he commented that he didn't want all of the new content we'd been working on to be added to the site at once, for fear that he would get penalized for flooding the site with new content. I don't have any strong data to confirm or refute the claim. Is there any truth to it?
-
I agree with the colleagues above; I can't see how your website would be penalised for lots of pages being uploaded at the same time.
However, adding too many pages too quickly may flag a site to be reviewed manually. That generally means adding hundreds of thousands of links in a night, though. Here is the related video from Matt Cutts:
Hope you find this useful!
-
It is a real estate site, and the content is a directory of the various condos available in their community. The pages are all unique and have real, valuable content, so I don't think there will be any issues with content quality.
There is new content and blogging that occurs regularly on the site. I think the client's concern comes from an old notion that if content is only added infrequently, but in bulk, it may be seen as spammy.
-
I agree with Jesse. Earlier this year we added a new data-driven section to our website that included (believe it or not) 83,000 pages, all unique in content since the information is highly technical in nature. No associated penalties have resulted from this.
-
I agree with Jesse for the most part. I think the key question is: what kind of content are we talking about? Adding tons of low-value, thin content pages to a site all at once (or even gradually) is probably going to diminish the authority of existing content. I also think that adding thousands of pages with no page authority to a site whose existing pages have a decent amount of authority could, theoretically, dilute the authority of those pages, depending on site architecture, internal linking, and the ratio of existing pages to new pages. However, I would expect any such effect to be temporary, and if the new content is high quality, there should be nothing to worry about long term.
-
Thanks Jesse, that was my thought exactly. If anything, I see incrementally adding the content as a negative thing, since it will lead to a less than complete user experience.
-
No truth to that whatsoever. That's weird paranoia.
If there were some sort of problem with the content itself, maybe. But there is no penalty simply for adding a lot of new content at once.
I've done total site overhauls plenty of times and they get indexed quickly with no penalties (although I will say the speed of this seems to be in flux, but I digress).
Don't let the client worry about this. Think about any website that initially launches: why would Google penalize that?
Hope this helps. Paranoia is often the toughest challenge when it comes to dealing with clients/site owners.
Related Questions
-
Google Drop Following Negative Article in New York Times
I have two sites that were mentioned in a negative article in The New York Times a couple of weeks ago. They saw a good increase in traffic, but on the sixth both of them saw sudden, unexplained Google drops. Both saw their average position in Search Console roughly double overnight. I run similar websites that have seen no such drops. The only thing these two have in common is being mentioned in the same negative article. Normally I would expect a mention from a major news outlet to make the sites more authoritative in Google's eyes. Is this a coincidence or a possible manual penalty? They still rank number one for their respective brand names, but everything else has suffered. Did Google make any recent algorithm changes, or do you think someone at Google may have read the article and decided the sites needed to be demoted?
Algorithm Updates | PostAlmostAnything
-
How to check which sites are performing well in Google organic?
Hi all, Is there any tool that can check which sites are performing well in Google organic search? Any site ... Is it possible via Alexa? My concern is mainly for a UK ecommerce site... Thanks!
Algorithm Updates | pragnesh9639
-
Time taken for Google algorithm updates to show their effect in the Middle East?
Hello everyone, Just a quick question. Can anyone give me a safe estimate of how much time it could take for a Google Algorithm Update to show its effect in the Middle East after roll out? Maybe you guys can direct me to a post to read through and learn more about it myself. Your input will be highly appreciated. Regards, Talha
Algorithm Updates | MTalhaImtiaz
-
Duplicate Content
I was just using a program (Copyscape) to see if the content on a client's website has been copied. I was surprised that the content on the site was showing as 70% duplicated, and the same content appears on a few other sites with varying percentages duplicated (ranging from 35%-80%). I have been informed that the content on the client's site is original and was written by the client. My question is, does Google know or understand that the client's website content was created as the original and that the other sites have copied it word-for-word and placed it on their sites? Does he need to re-write the content to make it original? I just want to make sure before I tell him to re-write all the content on the site. I'm well aware that duplicate content is bad, but I'm just curious whether it's hurting the client's site even though they originally created the content. Thanks for your input.
Algorithm Updates | Kdruckenbrod
-
What is the point of XML site maps?
Given how Google uses PageRank to pass link juice from one page to the next, if Google can only find a page in an XML sitemap, that page will have no link juice and will appear very low in search results, if at all. The priority field in XML sitemaps also seems pretty much irrelevant to me: Google determines the priority of a page based on the number of inbound links to it, and if your site is designed properly the most important pages will have the most links. The changefreq field could maybe be useful if you have existing pages that are updated regularly, though it seems to me Google tends to crawl sites often enough that it isn't useful. Plus, for most of the web the significant content of an existing page doesn't change regularly; instead, new pages are added with new content. This leaves the lastmod field as potentially useful. If Google starts each crawl of your site by grabbing the sitemap and then crawls the pages whose lastmod date is newer than its last crawl of the site, their crawling could be much more efficient. The sitemap would not need to contain every single page of the site, just the ones that have changed recently. From what I've seen, most sitemap generation tools don't do a great job with the fields other than loc. If Google can't trust the priority, changefreq, or lastmod fields, they won't put any weight on them. It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it). So, what's the point of XML sitemaps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
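To make the lastmod idea above concrete, here is a rough sketch of the kind of incremental sitemap generation the question describes: only pages changed since a cutoff are listed, each with an accurate lastmod. The URLs, dates, and helper function are hypothetical illustrations, not the output of any particular sitemap tool.

```python
from datetime import datetime, timezone
from xml.sax.saxutils import escape

# Hypothetical page records: (URL, last-modified time). On a real site these
# would come from the CMS or database rather than a hard-coded list.
pages = [
    ("https://www.example.com/", datetime(2013, 4, 2, tzinfo=timezone.utc)),
    ("https://www.example.com/blog/new-post", datetime(2013, 5, 20, tzinfo=timezone.utc)),
    ("https://www.example.com/about", datetime(2012, 1, 15, tzinfo=timezone.utc)),
]

def build_incremental_sitemap(pages, changed_since):
    """Build a sitemap listing only pages modified after `changed_since`,
    each with an accurate <lastmod> value - the one field the question
    argues is worth maintaining."""
    entries = []
    for url, modified in pages:
        if modified > changed_since:
            entries.append(
                "  <url>\n"
                f"    <loc>{escape(url)}</loc>\n"
                f"    <lastmod>{modified.date().isoformat()}</lastmod>\n"
                "  </url>"
            )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries) +
        "\n</urlset>"
    )

# Only pages changed since the cutoff date end up in the file.
print(build_incremental_sitemap(pages, datetime(2013, 5, 1, tzinfo=timezone.utc)))
```

Whether maintaining something like this is worth the effort is, of course, exactly what the question is asking.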
Algorithm Updates | pasware
-
Is a New Visits ratio of 39% a really bad thing?
I do a lot of work for a large estate agency based almost solely in London. They get a considerable amount of traffic, and all other stats, on the whole, are always positive. The only thing that is decreasing regularly is the percentage of new traffic. My understanding of user behaviour for this market is that no one in their right mind would make an enquiry or arrange a booking without a) looking at the property at least twice themselves (once before the enquiry and once before a viewing) and b) more than likely showing a partner. Plus, the site is well laid out and useful, so I believe users are favouring our site over the comparison sites. So, questions: Should I be panicking? What is the most efficient way of increasing new visits? Things to note: the HTML titles throughout the site are a bit of a mess - keyword-rich but too long and inconsistent. Could this be a contributing factor to the CTR? Also, in the past month we appeared in over 4k different queries, but our non-branded impressions are down 22%. Could more concise, less keyword-stuffed HTML titles help this? Do I need to look at the page titles to ensure that they contain the exact phrases that are in decline? Any help will be greatly appreciated!
Algorithm Updates | SoundinTheory
-
SEO results are down. Is my "All in One SEO Pack" to blame?
My website www.noobtraveler.com has shown a dip of 40% since Penguin's last update in November. I also transferred hosting at that time, but I was wondering if I'm over-optimizing with the All in One SEO Pack. I would appreciate it if someone could do a quick sweep and share their thoughts. Thanks!
Algorithm Updates | Noobtraveler
-
Duplicate content penalisation?
Hi, We are pulling in content snippets from our product blog to our category listing pages on our ecommerce site to provide fresh, relevant content, which is working really well. What I am wondering is whether we are going to get penalised for duplicate content, as both our blog and our ecommerce site are on the same IP address? If so, would moving the blog to a separate server and/or a separate domain name be a wise move? Thanks very much
Algorithm Updates | libertybathrooms