Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Is it better to drip-feed content?
-
Hi All,
I've assembled a collection of five closely related articles, each about 700 words, to be published by linking to them from one of my pages, and I would appreciate some advice on the roll-out of these articles.
Background: My site is a listings-based site, and the majority of the content is published on my competitors' sites too. This is because advertisers aim to spread their adverts wide in the hope of generating more responses. The page I'm targeting ranks 11th, but I would like to link it to some new articles and guides to beef it up a bit. My main focus is to rank better for the page that links to these articles, and as a result I write up an introduction to each article/guide, which serves as my unique content.
Question: Is it better to drip-feed the new articles onto the site, or would it be best to get as much unique content up as quickly as possible to increase the ratio of unique content vs. external duplicate content on the page that links to these articles?
Thank you in advance.
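For reference, one rough way to put a number on that unique vs. duplicate ratio is to compare the page copy against the syndicated advert text. A minimal Python sketch with hypothetical texts - difflib overlap is only a crude proxy for how a search engine would measure duplication:

```python
import difflib

def unique_ratio(page_text: str, syndicated_texts: list[str]) -> float:
    """Rough share of the page's words that do NOT overlap with syndicated copy."""
    words = page_text.split()
    covered = set()  # indices of words matched against any syndicated text
    for duplicate in syndicated_texts:
        matcher = difflib.SequenceMatcher(None, words, duplicate.split())
        for block in matcher.get_matching_blocks():
            covered.update(range(block.a, block.a + block.size))
    return 1 - len(covered) / max(len(words), 1)

# Hypothetical example: a unique introduction followed by a shared advert.
page = "Our unique intro to this listing. " * 10 + "Advert text shared with competitors. " * 10
advert = "Advert text shared with competitors. " * 10
print(f"Unique share: {unique_ratio(page, [advert]):.0%}")  # roughly 50%
```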
-
Good luck. Like I said, this is just me being silly. I pray to my Google shrine twice a day and this is what it tells me.
All at once or drip-feed, either way your content gets up there!
-
Thanks guys for your help. I think I'm going to publish it all at once. I was originally in agreement with Bill, but after doing a bit of reading it's probably safe to say that the search engines prioritise good content over content age. I've noticed blogs having slightly inflated PR because of their regular content, but it's unlikely I'll be able to keep up regular posts, and as a result any benefit derived from drip-feeding would fall away when I run out of articles. If it doesn't work I'm calling my lawyer on you guys, hehe, kidding :)))))
-
I don't think there is any right or wrong answer to this question. It's more a matter of preference.
For me, I like to drip my content.
In my own silly mind, it looks more natural to the search engines than dumping a bunch of content on your site all at once.
I also think it keeps the search engines coming back to your site as you post content through the months and years, rather than all at one time.
Mind you, I have no scientific basis for this... just my own anal retentivity. LOL
-
When I have new content, I can't wait to get it indexed, so even if I'm not promoting it on the homepage yet, I will put up links to it from relevant pages just to get spiders into it.
Five articles is no worry.
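If you want to help spiders find new articles quickly, listing them in an XML sitemap is another option. A minimal sketch with hypothetical URLs, using only Python's standard library:

```python
from datetime import date
from xml.etree import ElementTree as ET

# Hypothetical URLs for the five new articles.
new_articles = [f"https://www.example.com/guides/article-{i}" for i in range(1, 6)]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in new_articles:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url
    ET.SubElement(entry, "lastmod").text = date.today().isoformat()

# Writes sitemap.xml; reference it from robots.txt or submit it in Search Console.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```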
-
There is no advantage to holding back from a search engine perspective. The only reason I can think of to hold back relates to promotion opportunities for the articles. You could publish one article each week, tweet it and otherwise generate interest around the weekly article. If that is not of interest to you, then I would publish all five articles.
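If the weekly route appeals, the schedule itself is trivial to generate. A sketch with hypothetical titles and a hypothetical start date:

```python
from datetime import date, timedelta

articles = [f"Article {i}" for i in range(1, 6)]  # hypothetical titles
start = date(2025, 1, 6)                          # hypothetical first publish date

for week, title in enumerate(articles):
    publish_on = start + timedelta(weeks=week)
    print(f"{publish_on}: publish and promote '{title}'")
```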
Related Questions
-
Should I redirect or add content to 47 pages?
We have an insurance agency website with 47 pages that have duplicate/low-content warnings. What's the best way to handle this? Am I right in thinking I have two options: either add new content or redirect the pages? Thanks in advance 🙂
On-Page Optimization | laurentjb
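If redirecting, it is worth verifying afterwards that each old URL resolves in a single 301 hop. A minimal sketch with hypothetical URLs, using the third-party requests package:

```python
import requests

# Hypothetical old URLs that were redirected to consolidated pages.
old_urls = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

for url in old_urls:
    response = requests.get(url, allow_redirects=True, timeout=10)
    hops = [(r.status_code, r.url) for r in response.history]
    print(url, "->", response.url, "via", hops or "no redirect")
```
-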
Google ranking content for phrases that don't exist on-page
I am experiencing an issue with negative keywords, but the “negative” keyword in question isn’t truly negative and is required within the content – the problem is that Google is ranking pages for inaccurate phrases that don’t exist on the page. To explain, this product page (as one of many examples) - https://www.scamblermusic.com/albums/royalty-free-rock-music/ - is optimised for “Royalty free rock music” and it gets a Moz grade of 100. “Royalty free” is the most accurate description of the music (I optimised for “royalty free” instead of “royalty-free” (including a hyphen) because of improved search volume), and there is just one reference to the term “copyrighted” towards the foot of the page – this term is relevant because I need to make the point that the music is licensed, not sold, and the licensee pays for the right to use the music but does not own it (as it remains copyrighted). It turns out however that I appear to need to treat “copyrighted” almost as a negative term because Google isn’t accurately ranking the content. Despite excellent optimisation for “Royalty free rock music” and only one single reference to “copyrighted” within the copy, I am seeing this page (and other album genres) wrongly rank for the following search terms:
“free rock music”
“Copyright free rock music”
“Uncopyrighted rock music”
“Non copyrighted rock music”
I understand that pages might rank for “free rock music” because it is part of the “Royalty free rock music” optimisation; what I can’t get my head around is why the page (and similar product pages) is ranking for “Copyright free”, “Uncopyrighted music” and “Non copyrighted music”. “Uncopyrighted” and “Non copyrighted” don’t exist anywhere within the copy or source code – why would Google consider it helpful to rank a page for a search term that doesn’t exist as a complete phrase within the content? By the same logic the page should also wrongly rank for “Skylark rock music” or “Pretzel rock music”, as the words “Skylark” and “Pretzel” also feature just once within the content and should therefore generate completely inaccurate results too. To me this demonstrates just how poor Google is when it comes to understanding relevant content and optimisation - it's taking part of an optimised term, combining it with just one other single-use word, and then inappropriately ranking the page for that completely made-up phrase. It's one thing to misinterpret one reference to the term “copyrighted” and something else entirely to rank a page for completely made-up terms such as “Uncopyrighted” and “Non copyrighted”. It almost makes me think that I've got a better chance of accurately ranking content if I buy a goat, shove a cigar up its backside, and sacrifice it in the name of the great god Google! Any advice (about wrongly attributed negative keywords, not goat sacrifice) would be most welcome.
On-Page Optimization | JCN-SBWD
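One quick sanity check is to fetch the page and count how often each phrase appears in the visible text. A minimal sketch using the third-party requests and beautifulsoup4 packages, with the URL taken from the question:

```python
import requests
from bs4 import BeautifulSoup

url = "https://www.scamblermusic.com/albums/royalty-free-rock-music/"
phrases = ["copyrighted", "uncopyrighted", "non copyrighted", "copyright free"]

html = requests.get(url, timeout=10).text
visible_text = BeautifulSoup(html, "html.parser").get_text(" ").lower()

# Note: these are substring counts, so "copyrighted" also matches
# inside "uncopyrighted" - a hint at how partial-term overlap happens.
for phrase in phrases:
    print(f"{phrase!r}: {visible_text.count(phrase)} occurrence(s)")
```
-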
How good or bad is having a blog feed (or feeds) on the homepage?
Hello everyone, I was wondering if I can get some different opinions about having a blog feed on the homepage (image, title, excerpt). I have several feeds on mine, which I do not believe hurts and which I think has helped my rankings, but I wanted some superior SEO brains to weigh in. https://www.brightvessel.com Is it good for SEO? When would it be bad? How many posts would be considered too much? My blog has the most recent posts, which share some of the same feeds, which is making me question the duplicated content. https://www.brightvessel.com/blog/ Thanks! Judd
On-Page Optimization | brightvessel
-
Maximum page size for better SEO results?
Does page size really affect results in search engines? And what is the maximum in this case?
On-Page Optimization | Eslam-yosef
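For anyone who wants to measure it, raw HTML weight is easy to check - note this ignores images, scripts, and CSS. A minimal sketch with a hypothetical URL, using the third-party requests package:

```python
import requests

url = "https://www.example.com/"  # hypothetical URL
response = requests.get(url, timeout=10)

size_kb = len(response.content) / 1024  # bytes of HTML only
print(f"HTML weight: {size_kb:.1f} KB")
```
-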
Duplicate Content - Blog Rewriting
I have a client who has requested a rewrite of 250 blog articles for his IT company. The blogs are dispersed on a variety of platforms: his own website's blog, a business innovation website, and an IT website. He wants to have each article optimised with keyword phrases and then posted onto his new website thrice weekly. All of this is in an effort to attract some potential customers to his new site and also to establish his company as a leader in its field. To what extent would I need to rewrite each article so as to avoid duplicating the content? Would there even be an issue if I did not rewrite the articles and merely optimised them with keywords? Would the articles need to be completely taken down by all current publishers? Any advice would be greatly appreciated.
On-Page Optimization | StoryScout
-
Internal Linking - in content vs navigation menu
I'd like to get some thoughts on whether navigation menus or in-content links are best for internal linking, from an SEO standpoint. A few thoughts to get started with: For sites with a lot of content, you can have a navigation menu linking to your higher-level pages, then in-content links to deeper pages on your site. For smaller sites, this is not an option, as the navigation menu will probably link to all your important pages. You could add in-content links, but Google only counts the first link on the page, so the in-content links would be ignored if you'd already linked to the page in your top nav menu. I can think of several possible reasons navigation menu links could be less desirable than in-content links from a Google perspective. (They are sitewide boilerplate content without context.) If you set up your navigation structure based on what is best for the user, small sites don't have much wiggle room to optimize internal link structure, as all their money pages will be linked to from the top nav menu. Do you think Google prefers in-content links to navigation menu links? If so, how do you get around the fact that for many sites, all their money pages are being linked to from their main navigation menu?
On-Page Optimization | AdamThompson
-
Is content aggregation good SEO?
I didn't see this topic specifically addressed here: what's the current thinking on using content aggregation for SEO purposes? I'll use flavors.me as an example. Flavors.me lets you set up a domain that pulls in content from a variety of services (Twitter, YouTube, Flickr, RSS, etc.). There's also a limited ability to publish unique content as well. So let's say that we've got MyDomain.com set up, and most of the content is being drawn in from other services. So there's blog posts from WordPress.com, videos from YouTube, a photo gallery from Flickr, etc. How would Google look at this scenario? Is MyDomain.com simply scraped content from the other (more authoritative) sources? Is the aggregated content perceived to "belong" to MyDomain.com or not? And most importantly, if you're aggregating a lot of content related to Topic X, will this content aggregation help MyDomain.com rank for Topic X? Looking forward to the community's thoughts. Thanks!
On-Page Optimization | GOODSIR
-
Percentage of duplicate content allowable
Can you have ANY duplicate content on a page, or will the page get penalized by Google? For example, if you used a paragraph of Wikipedia content for a definition/description of a medical term but wrapped it in unique content, is that OK, or will that land you in the Google/Panda doghouse? If some level of duplicate content is allowable, is there a general rule-of-thumb ratio of unique to duplicate content? Thanks!
On-Page Optimization | sportstvjobs
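No official ratio has ever been published, but you can estimate overlap yourself with a shingle comparison. A minimal sketch with hypothetical texts:

```python
def shingles(text: str, n: int = 5) -> set[tuple[str, ...]]:
    """All overlapping n-word sequences in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def duplicate_share(page: str, source: str) -> float:
    """Share of the page's shingles that also appear in the source."""
    page_shingles = shingles(page)
    if not page_shingles:
        return 0.0
    return len(page_shingles & shingles(source)) / len(page_shingles)

# Hypothetical example: a borrowed definition wrapped in unique commentary.
borrowed = "a medical term is a word or phrase used to precisely describe the human body and its conditions"
page = "Patients ask us about this all the time. " + borrowed + " Below we explain what it means in plain English."
print(f"Duplicate share: {duplicate_share(page, borrowed):.0%}")
```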