Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Is it better to drip feed content?
-
Hi All,
I've assembled a collection of 5 closely related articles, each about 700 words, to be published by linking to them from one of my pages, and I would appreciate some advice on the rollout of these articles.
Background: My site is a listings-based site and the majority of the content is published on my competitors' sites too. This is because advertisers aim to spread their adverts wide in the hope of generating more responses. The page I'm targeting ranks 11th, but I would like to link it to some new articles and guides to beef it up a bit. My main focus is to rank better for the page that links to these articles, so I write an introduction to each article/guide, which serves as my unique content.
Question: Is it better to drip feed the new articles onto the site, or would it be best to get as much unique content up as quickly as possible to increase the ratio of unique content vs. external duplicate content on the page that links to these articles?
Thank you in advance.
-
Good luck. Like I said, this is just me being silly. I pray to my Google shrine twice a day and this is what it tells me.
All at once or drip feed, either way your content gets up there!
-
Thanks guys for your help. I think I'm going to publish it all at once. I was originally in agreement with Bill, but after doing a bit of reading it's probably safe to say that the search engines prioritise good content over content age. I've noticed blogs having slightly inflated PR because of their regular content, but it's unlikely I'll be able to keep up regular posts, so any benefit derived from drip feeding would fall away when I run out of articles. If it doesn't work I'm calling my lawyer on you guys, hehe, kidding :)))))
-
I don't think there is any right or wrong answer to this question. More of a preference.
For me, I like to drip my content.
In my own silly mind, it looks more natural to the search engines rather than dumping a bunch of content on your site.
I also think it keeps the search engines coming back to your site as you post content through the months and years, rather than all at one time.
Mind you. I have no scientific basis for this... just my own anal retentivity. LOL
-
When I have new content I can't wait to get it indexed. So even if I am not promoting it yet on the homepage I will put up links to it on relevant pages just to get spiders into it.
Five articles is no worry.
-
There is no advantage to holding back from a search engine perspective. The only reason I can think of to hold back relates to promotion opportunities for the articles. You could publish one article each week, tweet it and otherwise generate interest around the weekly article. If that is not of interest to you, then I would publish all five articles.
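If the weekly-release option appeals, the cadence is simple to plan in advance. A minimal Python sketch (the function name and dates are illustrative, not something from this thread):

```python
from datetime import date, timedelta

def drip_schedule(start, n_articles, interval_days=7):
    """Return one publish date per article, spaced interval_days apart."""
    return [start + timedelta(days=i * interval_days) for i in range(n_articles)]

# Five articles released weekly, starting 1 January 2024:
for i, d in enumerate(drip_schedule(date(2024, 1, 1), 5), start=1):
    print(f"Article {i}: {d.isoformat()}")
```

Publishing all at once, the other option discussed here, simply collapses that list to a single date.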
Related Questions
-
Google ranking content for phrases that don't exist on-page
I am experiencing an issue with negative keywords, but the “negative” keyword in question isn’t truly negative and is required within the content – the problem is that Google is ranking pages for inaccurate phrases that don’t exist on the page. To explain, this product page (as one of many examples) - https://www.scamblermusic.com/albums/royalty-free-rock-music/ - is optimised for “Royalty free rock music” and it gets a Moz grade of 100. “Royalty free” is the most accurate description of the music (I optimised for “royalty free” instead of “royalty-free” (including a hyphen) because of improved search volume), and there is just one reference to the term “copyrighted” towards the foot of the page – this term is relevant because I need to make the point that the music is licensed, not sold, and the licensee pays for the right to use the music but does not own it (as it remains copyrighted). It turns out however that I appear to need to treat “copyrighted” almost as a negative term because Google isn’t accurately ranking the content. Despite excellent optimisation for “Royalty free rock music” and only one single reference of “copyrighted” within the copy, I am seeing this page (and other album genres) wrongly rank for the following search terms:
“Free rock music”
“Copyright free rock music”
“Uncopyrighted rock music”
“Non copyrighted rock music”
I understand that pages might rank for “free rock music” because it is part of the “Royalty free rock music” optimisation; what I can’t get my head around is why the page (and similar product pages) are ranking for “Copyright free”, “Uncopyrighted music” and “Non copyrighted music”. “Uncopyrighted” and “Non copyrighted” don’t exist anywhere within the copy or source code – why would Google consider it helpful to rank a page for a search term that doesn’t exist as a complete phrase within the content? By the same logic the page should also wrongly rank for “Skylark rock music” or “Pretzel rock music”, as the words “Skylark” and “Pretzel” also feature just once within the content and therefore should generate completely inaccurate results too. To me this demonstrates just how poor Google is when it comes to understanding relevant content and optimization - it's taking part of an optimized term and combining it with just one other single-use word and then inappropriately ranking the page for that completely made up phrase. It’s one thing to misinterpret one reference of the term “copyrighted” and something else entirely to rank a page for completely made up terms such as “Uncopyrighted” and “Non copyrighted”. It almost makes me think that I’ve got a better chance of accurately ranking content if I buy a goat, shove a cigar up its backside, and sacrifice it in the name of the great god Google! Any advice (about wrongly attributed negative keywords, not goat sacrifice) would be most welcome.
On-Page Optimization | JCN-SBWD
Best practices for publishing sponsored content
Hello, Our website hosts sponsored content from different brands. Should we be listing the sponsor either on the frontend and/or through markup? Would either way have any sort of an impact? The content itself is already clearly marked as 'sponsored content', but we were more interested in listing the specific sponsor. Also, we're assuming the outbound links would need to be marked rel="sponsored", but are there any other best practices we should be implementing? Any insight would be appreciated. Thank you in advance. Best,
On-Page Optimization | Ben-R
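On the markup point in that question: rel="sponsored" is Google's documented attribute for paid or sponsored links, and a visible front-end disclosure can sit alongside it. A minimal sketch, with a hypothetical sponsor name and URL:

```html
<!-- Visible disclosure for readers plus the machine-readable link attribute. -->
<!-- Brand name and URL are hypothetical placeholders. -->
<article>
  <p><em>Sponsored content, presented by Example Brand.</em></p>
  <p>
    Read more at
    <a href="https://sponsor.example.com/" rel="sponsored">Example Brand</a>.
  </p>
</article>
```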
What to do with "trendy" content that is no longer relevant?
Hi all, My company is in the fashion/jewelry industry and we regularly create short content describing the latest trends in jewelry. We do not include any sort of date reference on the content, which means that a searcher who gets to our site has no way of knowing if this is a trend from 2008 or 2016. Does anyone have any experience with the best way to handle this? I want to remain relevant for our customers. It seems like a big disservice to our customers to show them a "trend" which trended 5 years ago. Is there a benefit to keeping this content around or would it be better to cycle it off the site after 6 months or so? Thanks for any advice or experience you have! R.
On-Page Optimization | FireMountainGems
Duplicate page titles and Content in Woocommerce
Hi Guys, I'm new to Moz and really liking it so far!
I run an eCommerce site on WordPress + WooCommerce and of course use Yoast for SEO optimisation. I've got a question about my first crawl report, which showed over 600 issues! 😐 I've read that this is something that happens more often (http://moz.com/blog/setup-wordpress-for-seo-success). Most of them are categorized under:
1. Duplicate Page Titles or;
2. Duplicate Page Content.
Duplicate Page Titles: these are almost only product category pages and product tags. Is this problem being solved by giving them the right SEO SERP? I see that a lot of categories don't have a proper SEO SERP set up in Yoast! Do I need to add this to clear the issue, or do I need to change the actual title? And how about the product tags? Another point (a bit more off-topic): I've read here (http://moz.com/community/q/yoast-seo-plugin-to-index-or-not-to-index-categories) that it's advised to noindex/follow categories and tags, but isn't that a weird thing to do for an eCommerce site?
Duplicate Page Content: same goes here, almost only product categories and product tags are displayed as duplicate page content. When I check the results I can click on a blue button, for example "+ 17 duplicates", which shows me (in this case) 17 URLs, but they are not related to the first in any way, so I'm not sure where to start here? Thanks for taking the time to help out!
Joost
On-Page Optimization | jeeyer
When writing content for a website what is the optimal copy length?
My site is currently in the midst of a redesign and I'd like us to compile some recommendations on the length of copy for a page to rank well, but I can't seem to find any up-to-date articles on this. Does anyone have any suggestions, comments, or feedback? Thank you.
On-Page Optimization | PorshaAndrea
URL Path. What is better for SEO
Hello Moz people, Is it better for SEO to have a URL path like this: flowersite.com/anniversary_flowers/dozen_roses OR flowersite.com/dozen_roses? Is it better to have the full trail of pages in the URL?
On-Page Optimization | CKerr
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and beside that I also have some secondary websites which have the same content as the primary website. This leads to duplicate content errors. Because there are many URLs with duplicate content, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | JohnHuynh
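For the robots.txt approach asked about there, a blanket disallow placed on each secondary site (never the primary) would look like the sketch below. One caveat worth knowing: robots.txt blocks crawling, not indexing, so URLs that are already indexed can linger in results; a canonical tag or a noindex directive is often recommended instead for true duplicate-content cleanup.

```
# robots.txt, served from the root of each SECONDARY site only.
# Do not deploy this on the primary site.
User-agent: *
Disallow: /
```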
Best practice for franchise sites with duplicated content
I know that duplicated content is a touchy subject, but I work with multiple franchise groups and each franchisee wants their own site; however, almost all of the sites use the same content. I want to make sure that Google sees each one of these sites as unique and does not penalize them for the following issues:
- All sites are hosted on the same server, and therefore share the same IP address
- All sites use generally the same content across their product pages (which are very, very important pages) - templated content approved by corporate
- Almost all sites have the same design (a few of the groups we work with have multiple design options)
Any suggestions would be greatly appreciated. Thanks again, Aaron
On-Page Optimization | Shipyard_Agency