Delay release of content, or fix after release?
-
I am in the midst of moving my site to a new platform. As part of that I am reviewing each and every article for SEO - titles, URLs, content, formatting/structure, and so on. I have about 200 articles to move across, and my eventual plan is to look at each article and update it for these factors.
I have all the old content moved across to the new server as-is (the old server is still the one to which my domain's DNS records point). At a high level I have two choices:
- Point DNS to the new server now, which will expose the same content (which isn't particularly SEO-friendly), and then work through each article, fixing the various elements to make it more SEO- and user-friendly.
- Go through each article, fixing content, structure, etc and THEN update DNS to point to the new server.
Obviously the second option adds time before I can switch across. I'd estimate it will take me a few weeks to get through the articles. Option 1 allows me to switch pretty soon and then start going through the articles and updating them.
An important point here is that the articles already have new (SEO-friendly) URLs and titles on the new server. I have 301 redirects in place pointing from the old URLs to the new ones. So it's "only" the content of each article that will be changing on the new server, rather than the URLs, etc.
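For reference, a minimal sketch of what those 301 redirects might look like in an Apache .htaccess file - the paths here are hypothetical, since the real old and new URLs will differ:

```apache
# Hypothetical old-to-new mappings; replace with the real URL pairs.
# Each rule 301s an old CMS path to its new SEO-friendly slug.
Redirect 301 /articles/2014/widget-review.html /widget-review/
Redirect 301 /articles/2014/another-post.html /another-post/
```

Note that mod_alias's `Redirect` matches the old path as a prefix, so each article generally needs its own line (or a `RedirectMatch` pattern, if the old URLs follow a predictable structure).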
So, I'd be interested in any suggestions on the best approach - move across to the new server now and then fix the content, or wait until all the content is done and then switch to the new server.
Thanks.
Mark
-
I would definitely at least clean up the article HTML and structure before launching the pages, since you don't want people who might land on them before they're updated to have a weird experience. As far as optimizing them for SEO, I think you could go ahead and make the pages live and roll out edits as you make them. Prioritizing the pages based on highest-traffic/best-converting first is the way to go. If switching your platform is going to make your site easier to crawl, you definitely want to do that sooner rather than later - plus, having the new pages live will allow them to start accumulating some links even before you make keyword-related changes.
In general with a major change like this I recommend changing as few other things as possible simultaneously. It's OK to make more gradual changes, and it gives Google fewer things to get used to at one time.
-
If search engines did not catch up with changes we make and improve our ranking for positive changes, there'd be little point to Search Engine Optimization.
If Google is already seeing your pages anyway and the move will only make them better (even if they are still not where you'd like them to be), then you can go ahead and move them if you like, as long as the move will not create a confusing situation for the people looking at the pages.
As you fix the pages to your satisfaction, wait for them to be crawled again or resubmit them using Fetch as Google to possibly get them crawled faster. [And as far as H2 tags, if that is your main worry, I wouldn't worry too much--they probably won't make much difference.]
-
Thank you for the response, Linda. So, this is a slightly tricky one because I don't have a specific deadline per se, but also want to build a plan that gets me over to the new server as soon as possible, without falling into a trap of the switchover date just "floating". Let me put it this way.
I have the following "phases" for each of the articles (as a reminder, I have around 200 such articles):
- Create all articles: Using the planned titles, categories and URLs but with no content.
- Move content across from old site to the new articles. Done with straight cut-and-paste (don't ask about importing - long story :)). This gets the data into WordPress posts as-is, but includes HTML markup from the old CMS, doesn't correctly use styles (some articles look pretty messy) and doesn't have a consistent use of H2 tags (H1 is the title). Most articles look "OK" but a) some are messy but readable for the human eye and b) the lack of H2 tags means there's no structure from an SEO-perspective.
- Clean up article HTML/structure. Review each article, cleaning up the HTML and ensuring the content still makes sense and reads well. HTML clean-up includes removing markup specific to the old CMS and making sure each article has structure through the use of H2 tags.
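Part of that clean-up step could be scripted. Here's a minimal sketch in Python using only the standard library - the patterns it targets (inline style/class attributes, empty paragraphs, bold-only paragraphs standing in for subheadings) are assumptions about what a cut-and-paste from an old CMS typically leaves behind, so treat it as a starting point and still review each article by hand:

```python
import re

def clean_legacy_html(html: str) -> str:
    """Strip assumed old-CMS leftovers and normalize headings (illustrative only)."""
    # Drop inline style/class attributes left over from the old CMS.
    html = re.sub(r'\s+(?:style|class)="[^"]*"', '', html)
    # Remove empty paragraphs that cut-and-paste tends to leave behind.
    html = re.sub(r'<p>\s*(?:&nbsp;)?\s*</p>', '', html)
    # Promote bold-only paragraphs to H2 subheadings for document structure.
    html = re.sub(r'<p><(?:b|strong)>(.*?)</(?:b|strong)></p>', r'<h2>\1</h2>', html)
    return html
```

A pass like this won't catch everything (and can't judge whether the content still reads well), but it reduces the per-article work to a quick visual check.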
- Review each article for SEO. I will be using the Yoast SEO plugin and making the changes it recommends. The keywords are already decided (the URLs and titles in step 1 reflect those decisions), so for each article I will be reviewing the rest of the content and making sure it looks acceptable from an SEO perspective.
I am currently done with step 2 (all articles moved across, albeit some looking somewhat untidy and without any document structure). I am starting to work through step 3 now, but this is a time-consuming process.
I guess what this all boils down to is whether search engines will "catch up" later if I switch across now and then revise the content for structure and SEO. The existing site is not good - so, as it stands, search engines don't look on the site kindly.
One option is to just bite the bullet and move across (I'd see benefits from the title and URL changes, with the associated 301 redirects in place) and subsequently do steps 3 and 4. I'd actually like to do that but ONLY if I can be confident the search engines will end up in the same place as they would if I just waited till step 4 is done.
Another option is to finish step 3, move to the new server and then start updating articles for SEO (step 4).
Thanks.
Mark
-
Why are you switching? If there is no reason to be in a rush, then I'd wait and make the change when everything is ready--a few weeks isn't that long.
If there is a particular reason for haste (like you were having technical problems with the old platform or a lot of your traffic is mobile and you want to make the April 21 Google deadline), then I think it depends on the state of the content.
If it is not perfect but still makes sense with the new titles and URLs, I'd do the update for your most important content and switch. If it is terrible, I'd wait. There is no point getting traffic for bad content.