Content update on 24hr schedule
-
Hello!
I have a website with over 1300 landing pages for specific products. These individual pages update on a 24hr cycle through our API. The API pulls reviews/ratings from other sources and then writes/updates that content onto each page.
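To give a rough idea, here's a minimal sketch of what that nightly job does; the endpoint URLs, field names, and CMS call are simplified placeholders rather than our actual stack:

```python
# Sketch of the nightly refresh: pull ratings/reviews from an external source
# and write them onto the matching landing page. All URLs/fields are placeholders.
import requests

REVIEW_SOURCE = "https://api.example-reviews.com/v1/products/{product_id}/summary"
CMS_PAGE = "https://www.example.com/api/pages/{page_id}"

def refresh_page(product_id: str, page_id: str, cms_token: str) -> None:
    # 1. Pull the latest ratings/reviews for this product.
    resp = requests.get(REVIEW_SOURCE.format(product_id=product_id), timeout=30)
    resp.raise_for_status()
    data = resp.json()

    # 2. Write the refreshed review block back onto the landing page.
    payload = {
        "rating_value": data.get("average_rating"),
        "review_count": data.get("review_count"),
        "latest_reviews": data.get("reviews", [])[:10],
    }
    requests.put(
        CMS_PAGE.format(page_id=page_id),
        json=payload,
        headers={"Authorization": f"Bearer {cms_token}"},
        timeout=30,
    ).raise_for_status()
```

Something along these lines runs against each of the 1300 pages once every 24 hours.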
-
Is that 'bad"? Can that be viewed as spammy or dangerous in the eyes of google? (My first thought is no, its fine)
-
Is there such a thing as "too much content"? For example, if we are adding roughly 20 articles to our site a week, is that OK? (I know news websites add much more than that on a daily basis, but I figured I would ask.)
-
On that note, would it be better to stagger our posting? For example, 20 articles each week for a total of 80, or all 80 articles once a month? (I feel like trickle posting is probably preferable, but I figured I would ask.)
-
Are there any negatives to having an API write/update content? Should we also have 800+ words of static content on each page?
Thank you, Mozzers!
-
-
When you say 1300 landing pages are coming online every night, that doesn't mean 1300 new pages are being created, does it? Based on the rest of your comment, I'm taking it to mean that 1300 pages which were already live and accessible to Google are being updated, and the content is changing where appropriate.
In terms of the specific situation I describe above, that should be fine; there shouldn't be a problem with having a system for keeping your site up to date. However, each of the things below, if true, would be a problem:
-
You are adding 1300 new pages to your site every night
-
This would be a huge increase for most sites, particularly if it were happening every night, but as I say above, I don't think this is the case.
-
You are actually scraping key information to include on your site
-
You mention an API, so it may be that users are submitting this content to your site for you to use. But if you are scraping the descriptions from some sites and the reviews from others, that is what would be viewed as spammy, and it seems like the biggest point of risk I've seen in this thread.
-
-
Something else occurred to me. Our API rewrites EVERYTHING every night, so technically 1300 landing pages are coming online EVERY night, even though the content isn't really changing. Is that a problem?
To sorta explain: this is a review site for other websites/apps. Our API scrapes the description from the app/site, as well as ratings from app stores etc., and then publishes that onto our page. So generally the content isn't really changing, it's just updating. Thoughts on that?
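To illustrate the "updating vs. actually changing" distinction, here's a rough sketch (the publish step is a hypothetical stand-in, not our real code) of how a nightly run can tell whether the scraped description/ratings actually differ from what's already on the page:

```python
# Sketch: fingerprint the scraped data so a nightly run only counts as a
# "change" when the description/ratings really differ from what's published.
import hashlib
import json

def content_fingerprint(data: dict) -> str:
    """Stable hash of the scraped description/ratings payload."""
    return hashlib.sha256(json.dumps(data, sort_keys=True).encode("utf-8")).hexdigest()

def publish(page_id: str, data: dict) -> None:
    # Placeholder for the real CMS update step.
    print(f"Pushing refreshed content to page {page_id}")

def refresh_if_changed(page_id: str, scraped: dict, previous_fingerprint: str) -> str:
    new_fingerprint = content_fingerprint(scraped)
    if new_fingerprint != previous_fingerprint:
        publish(page_id, scraped)  # the page only really "changes" on this branch
    # Return the fingerprint either way so tomorrow's run has something to compare against.
    return new_fingerprint
```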
-
Thank you!!! That's great info.
-
Hi,
As Robin said below, I'm suggesting you think about the frequency that would be best for your users/readers/clients. In the end, Google is just another reader.
Hope it helps.
Best of luck.
GR
-
Hi, I think you've already got a couple of good answers here, but just to throw in my thoughts: to me, this all comes down to how much value you're getting for the volume of content you're creating.
It sounds to me like you have 1.3k product landing pages, and you're producing 80 articles a month, plus maybe you're indexing the review pages too?
I think frequency here becomes secondary to how much each of these things is adding. If you are indexing the review pages for specific products, those pages could just be diluting your site equity. Unless they are performing a valuable function, I'd consider canonicalising them to the product pages (see the quick sketch below for what that tag looks like). As the others have said, having product pages that regularly update with new reviews shouldn't be a problem, but with all the content you're adding to the site, you could be relying on Google indexing these changes far more quickly than it actually does.
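For reference, canonicalising a review page just means its URL declares the product page as the preferred version in its head. A tiny sketch, with made-up URLs:

```python
# Sketch: the canonical tag a review page would emit, pointing at its product page.
# Both URLs are made-up examples.
def canonical_tag(product_url: str) -> str:
    return f'<link rel="canonical" href="{product_url}" />'

# e.g. the <head> of https://www.example.com/products/widget-pro/reviews would include:
print(canonical_tag("https://www.example.com/products/widget-pro"))
# -> <link rel="canonical" href="https://www.example.com/products/widget-pro" />
```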
If you're adding a large number of articles every month, are those articles cannibalising other pages, or each other? The way I'd try to gauge whether it's too much is whether the pages are getting traffic, whether you're seeing a lot of flip-flopping in the keywords you're targeting, and whether you're starting to get issues with Google indexing all of your pages. Similar to the review pages, if the articles are providing value to your readers, getting you links, or getting you a decent amount of traffic, then grand; if they aren't generating much, I'd consider producing less, or removing/redirecting poorly performing articles after a little while to preserve site equity and help focus Google's crawl.
On the note of posting frequency, I would agree with Gaston that it's about what's right for your readers. If a lot of article-worthy content comes out at the same time, I'd post about it then and there; if this is just content you're coming up with and adding, and timing doesn't matter, spreading it throughout the month makes sense in terms of staying fresh, getting the articles indexed and, honestly, not having to rush deadlines or delay releases.
-
Yeah, so basically we are bumping up all the static content on our review pages, and the reviews are updating daily. And to clarify, when you say it "wouldn't work in your favor", you mean we aren't getting any benefit from the content, but it isn't negatively impacting us, correct?
-
Thank you very much! Can you clarify number 3?
-
1. No, not really. It mostly depends on the percentage of content that isn't yours and can be viewed somewhere else. If reviews are 90% of the page and they're original content from another site, that won't work in your favor, though. But in this case, I'm assuming you're working around that.
2. No.
3. I would say No.
4. It depends; as long as you're not creating duplicate content at scale, you should be fine.
-
Hi there!
- No, not at all. There is no issue there, as long as the changes make sense.
- Nope, there is no such thing as "too much content".
- Think of Google as another of your readers/clients. Which frequency would be better for them?
- No, there aren't any negatives as long as you keep the content coherent and don't create duplicate content.
Hope it helps.
Best of luck.
GR