Content Publishing Volume/Timing
-
I am working with a company that has a bi-monthly print magazine with several years' worth of back issues. We're working on building a digital platform, and the majority of articles from the print magazine - tips, how-tos, reviews, recipes, interviews, etc. - will be published online.
Much of the content is not date-sensitive except for the occasional news article. Some content is semi-date-sensitive, such as articles focusing on seasonality (e.g. winter activities vs. summer activities).
My concern is how to handle the launch. Once we're ready to go live, should we publish ALL historical content at once, and if so, should a back-date be applied to each piece (even when dating isn't relevant)? Or should we create a publishing schedule and release the content over time, even though it's older material that isn't necessarily time-sensitive (e.g. a drink recipe)? Going forward, all newly created content will be published around each print issue's release.
Are there pitfalls I should avoid in terms of pushing out so much back content at once?
-
Converting all of those articles will take time.
I would design the site architecture and template and then immediately publish each article as soon as it is ready. This will get the articles flowing out into the search engines and get the money flowing in.
-
Hi Andrew,
I would definitely avoid throwing everything at Google all at once. This won't give any article time to gain traction, and it will severely limit your chances to share everything through social channels.
There isn't a magic timescale over which you should publish this, but if there is that much content, you should be looking at months rather than days or weeks.
Leave the season-sensitive articles until those seasons to maximise the impact they can have.
I would also update any articles that might have outdated information, so review these before they go live.
-Andy
-
Personally, I would release it over a set period. This way your content appears to be continuously added, rather than arriving as one massive one-off dump.
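To make the drip-feed approach the answers above describe concrete, here is a minimal Python sketch of a publishing scheduler: evergreen pieces are released at a steady weekly rate, while season-sensitive pieces are held until the next occurrence of their target month. The function name, article data, and two-posts-per-week rate are all hypothetical, not anything from Moz or the original thread.

```python
from datetime import date, timedelta

def build_schedule(articles, start, per_week=2):
    """Spread a backlog of articles out over time.

    articles  -- list of (title, season_month) pairs; season_month is None
                 for evergreen pieces, or 1-12 for season-sensitive ones.
    start     -- date of the first publication slot.
    per_week  -- how many evergreen pieces to release per week.
    """
    gap = timedelta(days=max(1, 7 // per_week))  # days between evergreen posts
    schedule = []
    slot = start
    for title, month in articles:
        if month is None:
            # Evergreen: drip out at a steady rate from the start date.
            schedule.append((slot, title))
            slot += gap
        else:
            # Seasonal: hold until the next occurrence of the target month.
            year = start.year + (1 if month < start.month else 0)
            schedule.append((date(year, month, 1), title))
    return sorted(schedule)

backlog = [
    ("Classic Negroni Recipe", None),
    ("Winter Hiking Gear Review", 12),
    ("Summer Grilling How-To", 6),
    ("Interview: Founding Editor", None),
]
for when, title in build_schedule(backlog, date(2024, 3, 1)):
    print(when.isoformat(), title)
```

With a real backlog of several hundred articles, lowering `per_week` stretches the evergreen releases across the months-long window suggested above, and the seasonal branch keeps winter and summer pieces out of the queue until they can actually gain traction.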