Will changing content management systems affect rankings?
-
We're considering changing our content management system. This would probably change our URL structure (we'd keep the root domain name, but specific product pages and whatnot would have different full URLs). Will our rankings be affected if we use different URLs for current pages? I know we can do 401 redirects, but is there anything else I should consider?
Thanks,
Dan
-
Thank you very much.
-
Thanks for the response. My main concern was the loss of rankings. I thought the change in URLs might negatively affect rankings.
-
When you move to a new CMS, there could be changes in your:
-- page URLs
-- title tags
-- on-page optimization markup
-- persistent navigation
-- anchor text on persistent navigation links
-- category, page, tag, etc. treatment
-- other changes
If you are planning to move to a new CMS, your goal should be to kick your SEO up a notch instead of worrying about what you might lose. If you do fear a loss, that is a signal to study further, or to have a professional CMS and SEO expert review your plans for moving the site or even do the move for you. The price might not be as high as you think.
Don't move your site thinking that you are taking a risk. Move your site with a plan for upping your game; one quick way to baseline what you have today is sketched below.
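For example, here is a rough pre-move audit sketch (my own illustration, not something tied to any particular CMS). It records the status code and title tag for each of your current URLs, assuming Python with the third-party requests library and a hypothetical old_urls.txt file containing one URL per line; rerun it against the new URLs after the move and diff the output.

```python
# Pre-migration audit sketch: snapshot status codes and title tags for a
# list of URLs so the same check can be rerun after the CMS move.
# Assumes the third-party "requests" library (pip install requests) and a
# hypothetical "old_urls.txt" file with one URL per line.
import csv
import re
import requests

URL_LIST = "old_urls.txt"            # hypothetical input file
SNAPSHOT = "pre_move_snapshot.csv"   # output written here

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)

def fetch_title(html: str) -> str:
    """Pull the contents of the <title> tag, if any."""
    match = TITLE_RE.search(html)
    return match.group(1).strip() if match else ""

with open(URL_LIST) as f, open(SNAPSHOT, "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["url", "status", "title"])
    for url in (line.strip() for line in f if line.strip()):
        try:
            resp = requests.get(url, timeout=10)
            writer.writerow([url, resp.status_code, fetch_title(resp.text)])
        except requests.RequestException as exc:
            writer.writerow([url, "error", str(exc)])
```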
-
Hi Dan,
Changing your content management system is one of the most common site changes to affect search visibility. However, there are definitely ways to change your CMS while minimizing the chance of losing rankings. 301 redirects need to be used to point your old URLs to the new ones, and your top referring link sources should be analyzed and perhaps contacted about the new URLs so your inbound links aren't wasted. Pages which cannot be mapped need custom 404 error pages, preferably with a user-friendly message letting visitors know the status of the page. Once your new CMS is in place, pay close attention to your analytics for broken links and 404 errors so these can be dealt with immediately. Keeping all this in mind, and referring to TopRank's blog post for further understanding, will help keep your rankings secure while you update your website.
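To illustrate the redirect check (just a sketch, assuming Python with the requests library and a hypothetical redirect_map.csv of old_url,new_url pairs), you could confirm that every old URL returns a 301 pointing at the intended new page once the new CMS is live:

```python
# Redirect verification sketch: confirm each old URL returns a 301 whose
# Location header matches the expected new URL. Assumes a hypothetical
# "redirect_map.csv" with one old_url,new_url pair per row (absolute URLs,
# no header row) and the third-party "requests" library.
import csv
import requests

with open("redirect_map.csv", newline="") as f:
    for old_url, new_url in csv.reader(f):
        resp = requests.get(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        if resp.status_code == 301 and location == new_url:
            print(f"OK    {old_url} -> {new_url}")
        else:
            print(f"CHECK {old_url}: got {resp.status_code} -> {location or 'no Location header'}")
```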
-
I think you mean 301 redirects. The best way to avoid losing any rankings is to make sure each old URL has a 301 redirect to its new URL. Otherwise, all your links will just return an error.
Usually this is the best way to ensure link juice is passed to your new URLs. It is tedious, but it needs to be done.
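As one way to work through that tedium, here is a minimal sketch (assuming an Apache server with mod_alias and the same hypothetical redirect_map.csv of old-path,new-path pairs) that turns the URL map into "Redirect 301" lines you could paste into an .htaccess file; adapt for nginx or whatever your new CMS runs on.

```python
# Sketch: generate Apache "Redirect 301" directives from a hypothetical
# CSV of old-path,new-path pairs (e.g. "/old-product.php,/products/old").
# The output can be pasted into an .htaccess file on a server with
# mod_alias enabled.
import csv

with open("redirect_map.csv", newline="") as f, open("redirects.htaccess", "w") as out:
    for old_path, new_path in csv.reader(f):
        out.write(f"Redirect 301 {old_path} {new_path}\n")
```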