Penalty for adding too much content too quickly?
-
Hi there,
We released around 4,000 pieces of new content, all of which ranked on the first page and did well. We had a database of ~400,000 pieces, so we released the entire remaining library (~396,000 pages) over a couple of days.
The pages have been indexed, but they are not ranking. The initial batch is still ranking, as are a handful (literally a handful) of the new 396,000. By "not ranking" I mean not ranking anywhere - I've checked as deep as page 20 - whereas the initial batch ranks for competitive terms on page 1.
Does Google penalise you for releasing such a volume of content in such a short space of time? If so, should we deindex all of that content and re-release it in slow batches? And finally, if that is the course of action we should take, are there any good articles on deindexing content at scale?
Thanks so much for any help you are able to provide.
Steve
-
Thanks for replying. The site is getinspired365 dot com.
We saw a spike of 11,000, then 29,000, then back down to a steady ~1,500.
Yes, we have structured our sitemap so that there are 7 sitemaps: one for authors (15,000 URLs), five for our quotes (40,000 each), and one for our topics (2,000). Looking at it, around 90% has successfully been indexed. This was done around 2 months ago and, as I say, it has pretty much all been indexed, but it is not ranking - at all. However, our first batch of content is ranking, and ranking really well. It is as though this new content has some sort of penalty and is therefore not ranking in Google, but I am not sure 1. what the penalty is and 2. how to fix it. I want to deindex the entire site and start again, adding the content back in much smaller batches, but I am not sure how best to do that.
thanks
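On the "deindex and re-add in smaller batches" idea: one mechanism often used for this (a sketch only, assuming your pages are templated so the robots directive can be set per URL - the batch size, start date, and schedule below are hypothetical, not a recommendation from this thread) is to serve a `noindex` robots directive on everything, then lift it one batch at a time on a schedule:

```python
from datetime import date

# Hypothetical staged-release schedule: every page starts noindexed,
# and one batch of URLs has its noindex lifted each week.
BATCH_SIZE = 5_000
LAUNCH = date(2018, 1, 1)  # hypothetical start date of the re-release

def batch_number(url_index, batch_size=BATCH_SIZE):
    """Which release batch a page belongs to (0-based)."""
    return url_index // batch_size

def is_released(url_index, today, launch=LAUNCH):
    """A batch is released once its week has arrived."""
    weeks_elapsed = (today - launch).days // 7
    return batch_number(url_index) <= weeks_elapsed

def robots_directive(url_index, today):
    # Unreleased pages keep noindex (so Google drops or ignores them);
    # released pages switch to index,follow.
    return "index,follow" if is_released(url_index, today) else "noindex,follow"

print(robots_directive(0, date(2018, 1, 1)))        # batch 0, week 0 -> index,follow
print(robots_directive(12_000, date(2018, 1, 8)))   # batch 2, week 1 -> noindex,follow
```

The directive would go into each page's robots meta tag or an `X-Robots-Tag` response header; the important part is that the schedule is deterministic per URL, so a page never flips back and forth between crawls.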
-
I doubt it. Can you share a link?
Did you publish an updated sitemap?
Do you see a spike in "Pages crawled per day" in Google WMT/Search Console, under Crawl -> Crawl Stats?
400k is a lot; it may take some time to crawl all of them.
Did you structure your sitemap as a tree? If you did, adding the 400k new pages under a sub-node of the sitemap, you can check in Crawl -> Sitemaps how many of those pages are already indexed, and whether the figure is growing on a daily/weekly basis.
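To illustrate the tree idea with a minimal sketch (the domain and file names are hypothetical): the sitemaps protocol caps each sitemap file at 50,000 URLs, so a library this size needs a sitemap index pointing at child sitemaps, and each child can then be monitored separately for indexing progress:

```python
import xml.etree.ElementTree as ET

SM_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # per-file URL cap in the sitemaps protocol

def chunk(urls, size=MAX_URLS):
    """Split a flat URL list into sitemap-sized batches."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def build_child_sitemap(urls):
    root = ET.Element("urlset", xmlns=SM_NS)
    for u in urls:
        ET.SubElement(ET.SubElement(root, "url"), "loc").text = u
    return ET.tostring(root, encoding="unicode")

def build_sitemap_index(child_locs):
    root = ET.Element("sitemapindex", xmlns=SM_NS)
    for c in child_locs:
        ET.SubElement(ET.SubElement(root, "sitemap"), "loc").text = c
    return ET.tostring(root, encoding="unicode")

# Hypothetical example: the 396,000 new quote URLs split into children
quote_urls = [f"https://example.com/quotes/{i}" for i in range(396_000)]
children = list(chunk(quote_urls))
print(len(children))  # 8 child sitemaps of up to 50k URLs each
```

Each child sitemap shows its own "submitted vs indexed" count in Search Console, which is what makes the growing-or-not check possible per node.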