Does adding lots of new content on a site at one time actually hurt you?
-
When speaking with a client today, he commented that he didn't want all of the new content we'd been working on to be added to the site at once, for fear that he would get penalized for flooding the site with new content. I don't have any strong data to confirm or refute the claim. Is there any truth to it?
-
I agree with all the colleagues above; I can't see how your website would be penalised just because lots of pages were uploaded at the same time.
However, adding too many pages too quickly may flag a site to be reviewed manually. That would only come into play, though, if you were adding something on the order of hundreds of thousands of links in a night. Here is the related video from Matt Cutts:
Hope you find this useful!
-
It is a real estate site, and the content is a directory of the various condos available in their community. The pages are all unique and have real, valuable content, so I don't think there will be any issues with content quality.
There is new content and blogging added to the site regularly. I think the client's concern comes from the old idea that adding content infrequently but en masse may be seen as spammy.
-
I agree with Jesse. Earlier this year we added a new data-driven section to our website that included (believe it or not) 83,000 pages, all unique in content since the information is highly technical in nature. No associated penalties have resulted from this.
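For anyone planning a similar batch launch, here is a minimal sketch (placeholder domain and paths, not our actual site) of how a set of URLs that size could be split across several sitemap files plus a sitemap index, since the sitemap protocol allows at most 50,000 URLs per file:

```python
# Minimal sketch: splitting a large batch of new URLs across several
# sitemap files plus a sitemap index. The sitemap protocol allows at
# most 50,000 URLs per file; the domain and paths here are placeholders.
import math
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50_000
BASE = "https://www.example.com"

# In practice these would come from the database backing the new section.
urls = [f"{BASE}/listings/item-{i}" for i in range(83_000)]

def sitemap_xml(batch):
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in batch)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>\n")

n_files = math.ceil(len(urls) / MAX_URLS_PER_SITEMAP)
for i in range(n_files):
    batch = urls[i * MAX_URLS_PER_SITEMAP:(i + 1) * MAX_URLS_PER_SITEMAP]
    with open(f"sitemap-{i + 1}.xml", "w") as f:
        f.write(sitemap_xml(batch))

# Index file listing the individual sitemap files.
index_entries = "\n".join(
    f"  <sitemap><loc>{BASE}/sitemap-{i + 1}.xml</loc></sitemap>"
    for i in range(n_files))
with open("sitemap-index.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{index_entries}\n"
            "</sitemapindex>\n")
```

Submitting one index file like this lets crawlers discover the whole batch at once, which is exactly the scenario the question is asking about.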
-
I agree with Jesse for the most part. I think the key question is: what kind of content are we talking about? Adding tons of low-value, thin content pages to a site all at once (or even gradually) is probably going to diminish the authority of existing content. I do think that adding thousands of pages with no page authority to a site that contains pages with a decent amount of authority could, theoretically, dilute the authority of the existing pages, depending on site architecture, internal linking, and the ratio of existing pages to new ones. However, I would expect this to be only temporary; if the new content is of great quality, there should be nothing to worry about long term.
-
Thanks Jesse, that was my thought exactly. If anything, I see incrementally adding the content as a negative, since it will leave users with a less-than-complete experience in the meantime.
-
No truth to that whatsoever. That's weird paranoia.
If there were some sort of problem with the content itself, maybe. But there's no penalty simply for adding a lot of new content at once.
I've done total site overhauls plenty of times, and they get indexed quickly with no penalties (although I will say indexing speed seems to be in flux lately, but I digress).
Don't let the client worry about this. Think about any website that initially launches: why would Google penalize that?
Hope this helps. Paranoia is often the toughest challenge when it comes to dealing with clients/site owners.
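One practical way to ease that paranoia is to show the client how quickly Googlebot is actually crawling the new pages. A minimal sketch, assuming the common Apache/Nginx "combined" access-log format and a hypothetical /new-section/ path prefix:

```python
# Minimal sketch: counting daily Googlebot requests to a new section,
# assuming the Apache/Nginx "combined" log format. The log file name
# and the /new-section/ path prefix are placeholders.
import re
from collections import Counter
from datetime import datetime

LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] "GET (?P<path>\S+)[^"]*" '
    r'\d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

crawls_per_day = Counter()
with open("access.log") as log:
    for line in log:
        m = LOG_LINE.match(line)
        if (m and "Googlebot" in m.group("agent")
                and m.group("path").startswith("/new-section/")):
            crawls_per_day[m.group("day")] += 1

# Print crawl counts in chronological order.
for day, hits in sorted(crawls_per_day.items(),
                        key=lambda kv: datetime.strptime(kv[0], "%d/%b/%Y")):
    print(day, hits)
```

Matching on the user-agent string alone is only a rough signal, since it can be spoofed (a stricter check verifies the requesting IP via reverse DNS), but it's usually enough to show a client that a big content drop is being crawled normally.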