Content Publishing Volume/Timing
-
I am working with a company whose bi-monthly print magazine has several years' worth of back issues. We're building a digital platform, and the majority of articles from the print mag - tips, how-tos, reviews, recipes, interviews, etc. - will be published online.
Much of the content is not date-sensitive except for the occasional news article. Some content is semi-date-sensitive, such as articles focusing on seasonality (e.g. winter activities vs. summer activities).
My concern is whether, once we're ready to go live, we should publish ALL of the historical content at once - and if so, whether each piece should be back-dated even where the date isn't relevant - or whether we should create a publishing schedule and release the older but non-time-sensitive content (e.g. a drink recipe) gradually over time. Going forward, all newly created content will be published around each print issue's release.
Are there pitfalls I should avoid in terms of pushing out so much back content at once?
-
Converting all of those articles will take time.
I would design the site architecture and templates first, then publish each article as soon as it is ready. This will get the articles flowing out into the search engines and get the money flowing in.
-
Hi Andrew,
I would definitely avoid throwing everything at Google all at once. That won't give any individual article time to gain traction, and it will severely limit your chances of sharing everything through social channels.
There isn't a magic timescale over which you should publish it all, but with that much content you should be looking at months rather than days or weeks.
Hold the season-sensitive articles back until their seasons to maximise the impact they can have.
I would also update any articles that might contain outdated information, so review these before they go live.
-Andy
-
Personally, I would release it over a set period. That way your content appears to be added continuously rather than as one massive one-off dump.
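As a rough illustration of the staggered release the answers above describe - assuming, purely for the sake of example, that the articles live in a simple list of records with a title and an optional season label, and that a six-month window with a weekly drip is acceptable (none of which comes from the thread) - a short script could assign future publish dates along these lines:

```python
import math
import random
from datetime import date, timedelta

# Hypothetical season start dates (month, day); adjust for your market's calendar.
SEASON_STARTS = {"spring": (3, 1), "summer": (6, 1), "autumn": (9, 1), "winter": (12, 1)}

def schedule_backlog(articles, go_live, months=6):
    """Assign a future publish date to each article record.

    articles: list of dicts like {"title": "...", "season": None or "winter"}
    go_live:  datetime.date the site launches
    months:   window over which the evergreen backlog is spread
    """
    evergreen = [a for a in articles if not a.get("season")]
    seasonal = [a for a in articles if a.get("season")]
    scheduled = []

    # Evergreen pieces: drip them out evenly, week by week, instead of all at once.
    weeks = max(1, months * 4)
    per_week = math.ceil(len(evergreen) / weeks) if evergreen else 1
    for i, article in enumerate(evergreen):
        scheduled.append({**article, "publish_on": go_live + timedelta(weeks=i // per_week)})

    # Season-sensitive pieces: hold each one until the next occurrence of its season,
    # scattered over the first few weeks so they don't all land on the same day.
    for article in seasonal:
        month, day = SEASON_STARTS[article["season"]]
        year = go_live.year if (month, day) >= (go_live.month, go_live.day) else go_live.year + 1
        publish_on = date(year, month, day) + timedelta(days=random.randint(0, 21))
        scheduled.append({**article, "publish_on": publish_on})

    return sorted(scheduled, key=lambda a: a["publish_on"])
```

The output is just a list of dates; most CMSs can handle the actual publishing via scheduled, future-dated posts, and the window or weekly volume can be tuned to match how quickly the article conversion work mentioned in the first answer actually proceeds.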