Delay release of content or fix after release
-
I am in the midst of moving my site to a new platform. As part of that I am reviewing every article for SEO: titles, URLs, content, formatting/structure, and so on. I have about 200 articles to move across, and my eventual plan is to look at each article and update it for these factors.
I have all the old content moved across to the new server as-is (the old server is still the one to which my domain's DNS records point). At a high level I have two choices:
- Point DNS to the new server, which will expose the same content (which isn't particularly SEO-friendly), and then work through each article, fixing the various elements to make them more SEO- and user-friendly.
- Go through each article, fixing content, structure, etc., and THEN update DNS to point to the new server.
Obviously the second option adds time before I can switch across. I'd estimate it will take me a few weeks to get through the articles. Option 1 allows me to switch pretty soon and then start going through the articles and updating them.
An important point here is that the articles already have new (SEO-friendly) URLs and titles on the new server, and I have 301 redirects in place pointing from the old URLs to the new ones. So it's "only" the content of each article that will be changing on the new server, rather than the URLs, etc.
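For reference, a per-URL 301 mapping like the one described can be expressed in an Apache `.htaccess` file with the `Redirect` directive; the old and new paths below are hypothetical placeholders, not the poster's actual URLs:

```apache
# Hypothetical examples only - substitute the real old and new article paths.
# Each line permanently (301) redirects one old CMS path to its new
# SEO-friendly WordPress slug.
Redirect 301 /old-cms/post-123 /guides/seo-friendly-article-title/
Redirect 301 /old-cms/post-124 /guides/another-article-slug/
```

With 200 articles it is usually easier to generate these lines from a spreadsheet of old/new URL pairs than to write them by hand, and to spot-check a sample with `curl -I` to confirm each returns a 301 to the intended target.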
So, I'd be interested in any suggestions on the best approach - move across to the new server now and then fix content or wait till all the content is done and then switch to the new server.
Thanks.
Mark
-
I would definitely at least clean up the article HTML and structure before launching the pages, since you don't want people who might land on them before they're updated to have a weird experience. As far as optimizing them for SEO, I think you could go ahead and make the pages live and roll out edits as you make them. Prioritizing the pages based on highest-traffic/best-converting first is the way to go. If switching your platform is going to make your site easier to crawl, you definitely want to do that sooner rather than later - plus, having the new pages live will allow them to start accumulating some links even before you make keyword-related changes.
In general with a major change like this I recommend changing as few other things as possible simultaneously. It's OK to make more gradual changes, and it gives Google fewer things to get used to at one time.
-
If search engines did not catch up with changes we make and improve our ranking for positive changes, there'd be little point to Search Engine Optimization.
If Google is already seeing your pages anyway and the move will only make them better (even if they are still not where you'd like them to be), then you can go ahead and move them if you like, as long as the move will not create a confusing situation for the people looking at the pages.
As you fix the pages to your satisfaction, wait for them to be crawled again or resubmit them using Fetch as Google to possibly get them crawled faster. [And as far as H2 tags, if that is your main worry, I wouldn't worry too much--they probably won't make much difference.]
-
Thank you for the response, Linda. So, this is a slightly tricky one because I don't have a specific deadline per se, but also want to build a plan that gets me over to the new server as soon as possible, without falling into a trap of the switchover date just "floating". Let me put it this way.
I have the following "phases" for each of the articles (as reminder, I have around 200 such articles):
- Create all articles: Using the planned titles, categories and URLs but with no content.
- Move content across from old site to the new articles. Done with straight cut-and-paste (don't ask about importing - long story :)). This gets the data into WordPress posts as-is, but it includes HTML markup from the old CMS, doesn't correctly use styles (some articles look pretty messy), and doesn't have a consistent use of H2 tags (H1 is the title). Most articles look "OK", but a) some are messy though still readable to the human eye, and b) the lack of H2 tags means there's no structure from an SEO perspective.
- Clean up article HTML/structure. Review each article, cleaning up the HTML and ensuring the content still makes sense and reads well. HTML clean-up includes removing markup specific to the old CMS and making sure each article has structure through the use of H2 tags.
- Review each article for SEO. I will be using the Yoast SEO plugin and making the changes it recommends. The keywords are already decided (the URLs and titles in step 1 reflect those decisions), so for each article I will be reviewing the rest of the content and making sure it looks acceptable from an SEO perspective.
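With 200 articles, step 3 is easier to prioritize if you first flag which posts lack any H2 structure. A minimal sketch of such an audit, using only Python's standard-library HTML parser (the article slugs and bodies below are hypothetical stand-ins for real WordPress post content):

```python
# Hypothetical audit: flag posts whose body HTML contains no H2 subheadings,
# so they can be prioritized for the structure clean-up pass.
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Counts <h2> tags encountered while parsing an HTML fragment."""
    def __init__(self):
        super().__init__()
        self.h2_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self.h2_count += 1

def needs_structure(html_body: str) -> bool:
    """Return True if the article body has no H2 subheadings at all."""
    parser = HeadingCounter()
    parser.feed(html_body)
    return parser.h2_count == 0

# Placeholder data - in practice this would come from the WordPress database.
articles = {
    "clean-post": "<h2>Intro</h2><p>Text</p><h2>Details</h2><p>More</p>",
    "messy-post": "<p>One long wall of text with no subheadings at all.</p>",
}
flagged = [slug for slug, body in articles.items() if needs_structure(body)]
print(flagged)  # -> ['messy-post']
```

Running a script like this against an export of the posts would give a worklist for step 3, which could then be ordered by traffic as suggested in the first answer.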
I am currently done with step 2 (all articles moved across, albeit some looking somewhat untidy and without any document structure). I am starting to work through step 3 now, but this is a time-consuming process.
I guess what this all boils down to is whether, if I switch across now, search engines will "catch up" later when I revise the content for structure and SEO. The existing site is not good, so as it stands search engines don't look on it kindly.
One option is to just bite the bullet and move across (I'd see benefits from the title and URL changes, with the associated 301 redirects in place) and subsequently do steps 3 and 4. I'd actually like to do that but ONLY if I can be confident the search engines will end up in the same place as they would if I just waited till step 4 is done.
Another option is to finish step 3, move to the new server and then start updating articles for SEO (step 4).
Thanks.
Mark
-
Why are you switching? If there is no reason to be in a rush, then I'd wait and make the change when everything is ready--a few weeks isn't that long.
If there is a particular reason for haste (like you were having technical problems with the old platform or a lot of your traffic is mobile and you want to make the April 21 Google deadline), then I think it depends on the state of the content.
If it is not perfect but still makes sense with the new titles and URLs, I'd do the update for your most important content and switch. If it is terrible, I'd wait. There is no point getting traffic for bad content.