Content Publishing Volume/Timing
-
I am working with a company that has a bi-monthly print magazine with several years' worth of back issues. We're building a digital platform, and the majority of articles from the print magazine - tips, how-tos, reviews, recipes, interviews, etc. - will be published online.
Much of the content is not date-sensitive except for the occasional news article. Some content is semi-date-sensitive, such as articles focusing on seasonality (e.g. winter activities vs. summer activities).
My concern is whether, once we go live, we should publish ALL of the historical content at once - and if so, whether each piece should be back-dated even where the date isn't relevant - or whether we should create a publishing schedule and release the older but not time-sensitive content (e.g. a drink recipe) over a period of time. Going forward, all newly created content will be published around each print issue's release.
Are there pitfalls I should avoid in pushing out so much back content at once?
-
Converting all of those articles will take time.
I would design the site architecture and templates first, then publish each article as soon as it is ready. This will get the articles flowing out into the search engines and get the money flowing in.
-
Hi Andrew,
I would definitely avoid throwing everything at Google all at once. That won't give any single article time to gain traction, and it will severely limit your chances to share everything through social channels.
There isn't a magic timescale to publish this over, but with that much content you should be looking at months rather than days or weeks.
Hold the season-sensitive articles until their seasons to maximise the impact they can have.
I would also update any articles that might contain outdated information, so review these before they go live.
-Andy
-
Personally I would release it over a set period. That way it looks as though your content is being added continuously rather than dumped in one massive batch.
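To make the "spread it over months" advice concrete, here is a minimal sketch of one way to generate a drip-feed schedule from a back catalogue, holding season-sensitive pieces until their season. The Article fields, the three-posts-per-week rate, and the season start dates are all illustrative assumptions, not anything a particular CMS requires.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

# Hypothetical article record - the field names are assumptions for illustration.
@dataclass
class Article:
    title: str
    season: Optional[str] = None  # "spring", "summer", "autumn", "winter", or None for evergreen

def schedule_backlog(articles, start, per_week=3):
    """Drip-feed evergreen articles; hold seasonal ones until their season starts."""
    season_starts = {  # rough Northern Hemisphere season openings (assumption)
        "spring": date(start.year, 3, 1),
        "summer": date(start.year, 6, 1),
        "autumn": date(start.year, 9, 1),
        "winter": date(start.year, 12, 1),
    }
    schedule = []

    # Evergreen content: a steady trickle of `per_week` posts per week.
    gap = timedelta(days=7) / per_week
    evergreen = [a for a in articles if a.season is None]
    for i, article in enumerate(evergreen):
        schedule.append((start + gap * i, article.title))

    # Seasonal content: publish at the start of its next season.
    for article in (a for a in articles if a.season is not None):
        when = season_starts.get(article.season, start)
        if when < start:
            when = when.replace(year=start.year + 1)
        schedule.append((when, article.title))

    return sorted(schedule)

if __name__ == "__main__":
    backlog = [
        Article("Classic mulled wine recipe", season="winter"),
        Article("Interview with the editor"),
        Article("Ten summer day trips", season="summer"),
        Article("How to sharpen kitchen knives"),
    ]
    for when, title in schedule_backlog(backlog, start=date(2015, 1, 5)):
        print(when.isoformat(), title)
```

The output is just a list of (date, title) pairs; how those map onto your CMS's scheduling feature, and whether you back-date anything, remains an editorial decision.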
Related Questions
-
Bulk redirect or only a few pages at a time
Intermediate & Advanced SEO | footsteps
Dear all, I would very much like to have your advice about whether or not to implement bulk 301 redirects. We have three retail websites with the same technical architecture: Netherlands-example.nl, Belgium-example.be and France-example.fr. All three websites are bilingual: Netherlands-example.nl/nl, Netherlands-example.nl/fr, Belgium-example.be/nl, Belgium-example.be/fr, France-example.fr/nl and France-example.fr/fr. We're going to do a CMS update and therefore have to implement a bulk of 301 redirects. Part 1: URLs in the Dutch language on the French site (France-example.fr/nl) will be redirected to Belgium (Belgium-example.be/nl) - about 8,000 redirects. Part 2: URLs in the French language on the Dutch site (Netherlands-example.nl/fr) will be redirected to Belgium (Belgium-example.be/fr) - also about 8,000 redirects. Question: what will be the best way to implement these redirects? Fully implement part 1 first (8,000 redirects) and then part 2 a couple of weeks/months later? Or would it be better to implement small batches of 200-500 every two weeks? I'd like to hear your opinion. Thanks in advance. Kind regards, Gerwin
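As an aside, a minimal sketch of what "small batches" could look like mechanically: it builds a one-to-one 301 mapping (same path, new host) and splits it into chunks. The sample URLs, the 500-per-batch size and the plain "source -> target" output are assumptions; the real redirect syntax depends on whichever web server or CMS ends up serving the 301s.

```python
from urllib.parse import urlsplit, urlunsplit

def build_redirects(old_urls, old_host, new_host):
    """Map each old URL to the same path on the new host (301 source -> target)."""
    redirects = []
    for url in old_urls:
        parts = urlsplit(url)
        if parts.netloc.lower() == old_host.lower():
            target = urlunsplit((parts.scheme, new_host, parts.path, parts.query, ""))
            redirects.append((url, target))
    return redirects

def batches(items, size):
    """Yield successive chunks, e.g. for rolling redirects out in stages."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

if __name__ == "__main__":
    # Hypothetical sample of the ~8,000 Dutch-language URLs on the French domain.
    old_urls = [
        "http://France-example.fr/nl/schoenen/sneakers",
        "http://France-example.fr/nl/schoenen/laarzen",
    ]
    redirects = build_redirects(old_urls, "France-example.fr", "Belgium-example.be")
    for n, batch in enumerate(batches(redirects, size=500), start=1):
        print(f"# batch {n}")
        for source, target in batch:
            # Generic "source -> target" output; translate into the redirect
            # syntax of whatever server or CMS will actually serve the 301s.
            print(f"{source} -> {target}")
```

Whether to push the whole mapping live at once or stage it is the question being asked; a script like this only makes either option mechanical.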
-
What is considered duplicate content?
Intermediate & Advanced SEO | A_Q
Hi, We are working on a product page for bespoke camper vans: http://www.broadlane.co.uk/campervans/vw-campers/bespoke-campers . At the moment there is only one page, but we are planning to add similar pages for other brands of camper vans. Each page will receive its own specifically targeted content; however, the 'Model choice' cart at the bottom (giving you the choice to select the internal structure of the van) will remain the same across all pages. Will this be considered duplicate content? And if so, what would be the ideal solution to limit penalty risk? A rel canonical tag seems wrong for this, as there is no original item as such. Would an iFrame around the 'Model choice' enable us to isolate the content from being indexed at the same time as the page? Thanks, Celine
-
Help with https:// redirects
Intermediate & Advanced SEO | Jay328
Hey there, I have a client who just moved from a self-hosted CMS to Adobe Catalyst (don't ask!). The problem: the URL indexed with Google is https://domain.com, and Adobe Catalyst does not support third-party SSL certificates or https domains. Now when people google them, https://domain.com shows up in search; HOWEVER, it does not have a trusted certificate and a pop-up window blocks the site. They are a mortgage company, so SSL is really not needed. What can I do to get Google to recognize the site at http vs. https? Would this be something in GWMT? Thanks!
-
Best practice for expandable content
Intermediate & Advanced SEO | Alexogilvie
We are in the middle of having new pages added to our website. On our website we will have an information section containing various details about a product; this information will be several paragraphs long. We want to show the first paragraph and have a 'read more' button to reveal the rest of the content, which is hidden. What's Google's view on this - is it bad for SEO?
-
Googleon/off tag - does it work?
Intermediate & Advanced SEO | AndersDK
Hi, I am currently working on a page where some of the content is repeated across all pages. Rewriting it to make it unique is not an option, I'm afraid. I came across a tag called googleon/off that will tell Google not to index a certain part of a given webpage, but will this ensure that it is not seen as duplicate content? https://developers.google.com/search-appliance/documentation/610/admin_crawl/Preparing
-
Penalized for Similar, But Not Duplicate, Content?
Intermediate & Advanced SEO | Stew222
I have multiple product landing pages that feature very similar, but not duplicate, content and am wondering if this would affect my rankings in a negative way. The main reason for the similar content is three-fold: (1) continuity of site structure across different products; (2) similar, or the same, product add-ons or support options (resulting in exactly the same additional tabs of content); (3) the product itself is very similar, with 3-4 key differences. Three examples of these similar pages are below, although I do have different meta data and keyword optimization across the pages: http://www.1099pro.com/prod1099pro.asp http://www.1099pro.com/prod1099proEnt.asp http://www.1099pro.com/prodW2pro.asp
-
Why are these pages considered duplicate content?
Intermediate & Advanced SEO | DownPour
I have a duplicate content warning in our PRO account (well, several really) but I can't figure out WHY these pages are considered duplicate content. They have different H1 headers, different sidebar links, and while a couple are relatively scant as far as content (so I might believe those could be seen as duplicate), the others seem to have a substantial amount of content that is different. It is a little perplexing. Can anyone help me figure this out? Here are some of the pages that are showing as duplicate: http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Seth+Green/?bioid=5554 http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Solomon+Northup/?bioid=11758 http://www.downpour.com/catalogsearch/advanced/byNarrator/?mediatype=audio+books&bioid=3665 http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Marcus+Rediker/?bioid=10145 http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Robin+Miles/?bioid=2075
-
How to Fix Duplicate Page Content?
Intermediate & Advanced SEO | lbohen
Our latest SEOmoz crawl reports 1138 instances of "duplicate page content." I have long been aware that our duplicate page content is likely a major reason Google has de-valued our Web store. Our duplicate page content is the result of the following: 1. We sell audio books and use the publisher's description (narrative) of the title. Google is likely recognizing the publisher as the owner/author of the description and our description as duplicate content. 2. Many audio book titles are published in more than one format (abridged, unabridged CD, and/or unabridged MP3) by the same publisher, so the basic description at our Web store would be the same for each format = more duplicate content at our Web store. Here are two examples (one abridged, one unabridged) of one title at our Web store: Kill Shot - abridged, Kill Shot - unabridged. How much would the body content of one of the above pages have to change so that a SEOmoz crawl does NOT say the content is duplicate?
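There is no published threshold for how different two pages must be, but a rough way to gauge how much two descriptions overlap is a shingle-based similarity check. Below is a minimal sketch, assuming you have plain-text copies of the two product descriptions; the five-word shingle size and any cut-off you apply are illustrative choices, not values used by SEOmoz or Google.

```python
import re

def shingles(text, size=5):
    """Break text into overlapping word 'shingles' of `size` words each."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + size]) for i in range(max(0, len(words) - size + 1))}

def overlap(text_a, text_b, size=5):
    """Jaccard similarity of the two shingle sets (0.0 = distinct, 1.0 = identical)."""
    a, b = shingles(text_a, size), shingles(text_b, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

if __name__ == "__main__":
    # Placeholder stand-ins for the abridged and unabridged product descriptions.
    abridged = "An audio edition of the bestselling thriller, read by a full cast."
    unabridged = "The unabridged audio edition of the bestselling thriller, read by a full cast."
    print(f"Shingle overlap: {overlap(abridged, unabridged):.0%}")
```

Rewriting each format's description in your own words, rather than reusing the publisher's copy verbatim, drives that overlap down and leaves the crawler less identical text to match on.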