Content update on 24hr schedule
-
Hello!
I have a website with over 1300 landing pages for specific products. These individual pages update on a 24-hour cycle through our API. Our API pulls reviews/ratings from other sources and then writes/updates that content onto the page.
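To illustrate, the nightly job is roughly this (a simplified sketch - `fetch_rating`, the page dicts, and the product IDs are stand-ins for our real API and CMS, not the actual system):

```python
def refresh_pages(landing_pages, fetch_rating):
    """Pull the latest rating for each product and update the stored page
    content in place. Returns how many pages actually changed."""
    changed = 0
    for page in landing_pages:
        new_rating = fetch_rating(page["product_id"])
        if new_rating != page.get("rating"):
            page["rating"] = new_rating
            changed += 1
    return changed

# Hypothetical stored landing-page data
pages = [{"product_id": "app-1", "rating": 4.2},
         {"product_id": "app-2", "rating": 3.9}]

# Stand-in for the real reviews API call
ratings = {"app-1": 4.2, "app-2": 4.1}

updated = refresh_pages(pages, lambda pid: ratings[pid])
print(updated)  # prints 1 - only app-2's rating moved
```

So even though the job runs against all 1300 pages every night, on most nights only a fraction of the content actually changes.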
-
Is that "bad"? Can it be viewed as spammy or dangerous in the eyes of Google? (My first thought is no, it's fine.)
-
Is there such a thing as "too much content"? For example, if we are adding roughly 20 articles to our site a week, is that OK? (I know news websites add much more than that on a daily basis, but I just figured I would ask.)
-
On that note, would it be better to stagger our posting? For example, 20 articles each week for a total of 80 articles, or 80 articles all at once each month? (I feel like trickle posting is probably preferable, but I figured I would ask.)
-
Are there any negatives to the process of an API writing/updating content? Should we have 800+ words of static content on each page?
Thank you all mozzers!
-
-
When you say 1300 landing pages are coming online every night, that doesn't mean 1300 new pages are being created, does it? Based on the rest of your comment, I'm taking it to mean that 1300 pages, which were already live and accessible to Google, are being updated and the content is changing where appropriate.
In terms of the specific situation I describe above, that should be fine - there shouldn't be a problem with having a system for keeping your site up to date. However, each of the below things, if true, would be a problem:
-
You are adding 1300 new pages to your site every night
-
This would be a huge increase for most sites, particularly if it were happening every night, but as I say above, I don't think this is the case.
-
You are actually scraping key information to include on your site
-
You mention an API, so it may be that users are submitting this content to your site for you to use, but if you are scraping the descriptions from some sites and the reviews from others, that is what would be viewed as spammy, and it seems like the biggest point of risk I've seen in this thread.
-
-
Something else occurred to me. Our API rewrites EVERYTHING every night. So technically 1300 landing pages are coming online EVERY night, and the content isn't really changing. Is that a problem?
To sorta explain: this is a review site for other websites/apps. Our API scrapes the description from the app/site, as well as ratings from app stores etc., and then publishes that onto our page. So generally the content isn't really changing, it's just updating. Thoughts on that?
-
Thank you!!! That's great info.
-
Hi,
As Robin said below, I'm suggesting you think about the frequency that would be better for your users/readers/clients. In the end, Google is just another reader.
Hope it helps.
Best luck.
GR -
Hi, I think you've already got a couple of good answers here, but just to throw in my thoughts: to me this would all come down to how much value you're getting for the volume of content you're creating.
It sounds to me like you have 1.3k product landing pages, and you're producing 80 articles a month, plus maybe you're indexing the review pages too?
I think frequency here becomes secondary to how much each of these things is adding. If you are indexing the review pages for specific products, those pages could just be diluting your site equity. Unless they are performing a valuable function, I'd consider canonicalising them to the product pages. As the others have said, having product pages that regularly update with new reviews shouldn't be a problem, but with all the content you're adding to the site, you could be relying on Google indexing these changes far more quickly than it actually does.
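For clarity, canonicalising a review page to its product page just means adding a tag like this in the review page's `<head>` (the URLs here are made up for illustration):

```html
<!-- On the standalone review page, e.g. example.com/product-x/reviews -->
<!-- Tells Google the product page is the preferred version to index -->
<link rel="canonical" href="https://www.example.com/product-x/" />
```

That way the review pages can still exist for users, but their equity is consolidated into the product pages rather than competing with them.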
If you're adding a large number of articles every month - are those articles cannibalising other pages, or each other? The way I'd try to gauge whether it's too much is whether the pages are getting traffic, whether you're seeing a lot of flip-flopping in the keywords you're targeting, and whether you're starting to have issues with Google indexing all of your pages. Similar to the review pages, if the articles are providing value to your readers, getting you links, or getting you a decent amount of traffic, then grand; if they aren't generating much, I'd consider producing less, or removing/redirecting poorly performing articles after a little while to preserve site equity and help focus Google's crawl.
On the note of posting frequency, I would agree with Gaston that it's about what's right for your readers. If a lot of article-worthy content comes out at the same time, I'd post about it then and there; if this is just content you're coming up with and adding, and timing doesn't matter, spreading it throughout the month makes sense in terms of staying fresh, getting the articles indexed, and honestly not having to rush deadlines or delay releases.
-
Yeah, so basically we are bumping up all the static content on our review pages. The reviews are updating daily. And to clarify, when you say "wouldn't work in your favor", you mean we aren't getting any benefit from the content, but it isn't negatively impacting us, correct?
-
Thank you very much! Can you clarify number 3?
-
1. No, not really. It mostly depends on the percentage of content that isn't yours and can be viewed somewhere else. If reviews are 90% of the page and they're original content from another site, that won't work in your favor. But in this case, I'm assuming you're working around that.
2. No.
3. I would say No.
4. It depends, as long as you're not creating duplicate content at scale you should be fine.
-
Hi there!
- Not at all. There is no issue there, as long as the changes make sense.
- Nope, there is no such thing as "too much content".
- Think of Google as another of your readers/clients. Which frequency would be better for them?
- No, there aren't any negatives, as long as you keep the content coherent and don't create duplicate content.
Hope it helps.
Best luck.
GR