Did Not Adding Fresh Content Daily Get Me Penalized?
-
One of my websites used to post around 1,000 words of articles on 4-5 days a week (roughly 12 x 300-word articles per week). This went on for 3 months. Then we suddenly stopped adding content for a flat 15 days because our content writer was unavailable, and a major drop took place. We have since been adding the same amount of quality content, but the rankings don't seem to be improving. Is it a penalty?
-
If you go the "quality content" route and can only produce a small number of new articles per month, the best thing to do is to produce only "evergreen" content that can be recycled out to the front page. That will give the appearance of activity and diversity - at least to visitors who don't have a long memory or haven't been visiting your site for very long.
Also, if you have a page of "news" where you link to articles on other websites about industry trends or interesting topics, that can develop a following of thousands of people who visit your site frequently just to check that page - or who subscribe to your feed.
-
It will certainly get picked up by site crawlers for audit purposes, but would Google object to it? It depends what it is, where it is and how necessary it is. If it's a line of spam (for example) just to add in keywords, then this might cause you issues, but it depends on the quality of the rest of the page and content.
-Andy
-
No problem sir!
-
Mine isn't a blog post as such. It's a requirement of the website that I introduce 3 new pages. The content is originally written - no spinning or rewriting. One more thing I would like to add: all my pages have one line in common (15-20 words) out of the 300 words. Could that cause a duplicate content issue?
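If you want to put a rough number on that overlap, you can compare two of the pages directly with word shingles. A minimal sketch, assuming you have plain-text copies of two descriptions saved locally (the file names are just placeholders):

```python
# Rough sketch: estimate how much two product pages overlap, using
# 5-word shingles and Jaccard similarity. The file names are placeholders --
# point them at plain-text copies of two of your 300-word descriptions.

def shingles(text, size=5):
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard_overlap(text_a, text_b, size=5):
    a, b = shingles(text_a, size), shingles(text_b, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

if __name__ == "__main__":
    page_a = open("page_a.txt", encoding="utf-8").read()
    page_b = open("page_b.txt", encoding="utf-8").read()
    print(f"Shingle overlap: {jaccard_overlap(page_a, page_b):.1%}")
```

A shared 15-20 word line out of roughly 300 words should only register as a few percent here, which is a very different situation from pages that are largely identical.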
-
That's the one - thanks Patrick
-Andy
-
Hi there
Just a quick side note - the post Andy is referencing above is located here.
Hope this helps! Good luck!
-
Looks like Phantom 2 has affected my site.
-
Well, check your analytics to see when the drop happened, and then look here on Moz to see if it coincided with anything.
-Andy
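One way to make that check systematic is to pull a daily export of organic sessions and flag sharp week-over-week drops that land near a known update date. A minimal sketch, assuming a CSV export with "date" and "sessions" columns (the update dates are placeholders to fill in from a published algorithm change history):

```python
# Rough sketch: flag sharp week-over-week traffic drops and note whether
# they fall near a known algorithm update. Assumes a CSV of daily organic
# sessions with "date" (YYYY-MM-DD) and "sessions" columns; the update
# dates below are placeholders -- fill them in from a published change history.
import csv
from datetime import date, timedelta

UPDATE_DATES = [date(2015, 5, 3)]   # placeholder, e.g. the "Phantom 2" window

def weekly_totals(rows):
    totals = {}
    for row in rows:
        d = date.fromisoformat(row["date"])
        week_start = d - timedelta(days=d.weekday())
        totals[week_start] = totals.get(week_start, 0) + int(row["sessions"])
    return sorted(totals.items())

with open("organic_sessions.csv", encoding="utf-8") as f:
    weeks = weekly_totals(csv.DictReader(f))

for (_, prev), (week, curr) in zip(weeks, weeks[1:]):
    if prev and (prev - curr) / prev > 0.25:          # >25% week-over-week drop
        near_update = any(abs((week - u).days) <= 7 for u in UPDATE_DATES)
        flag = " (near a known update)" if near_update else ""
        print(f"{week}: sessions fell {prev} -> {curr}{flag}")
```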
-
I truly understand, but short articles are a requirement of the website because we are writing product descriptions. So we make sure we have at least 300 words of good content written on each page.
-
Hi Jawahar,
The best way to spot a penalty is to look at your analytics and see if any drops coincide with any algorithm updates.
I would echo a point that EGOL made earlier on another post, that daily content in this manner is probably not as beneficial as posting one very high quality article once a week. Of course, it depends what you are writing about, but shorter articles like this wouldn't generally do as much for you.
-Andy