Website Redesign & Ensuring Minimal Traffic/Rankings Loss
-
Hi there,
We have undergone a website redesign (mycompany.com) and our site is ready to go live. However, the new website is built on a different platform, so, to avoid a large web developer expense, our blog pages will not be copied over.
Our intention is therefore to leave all the blog pages as they are (on the old design) but move them to the domain blog.mycompany.com, with 301 redirects inserted on mycompany.com for each blog post pointing to its corresponding URL on blog.mycompany.com. Is there anything else we should do to ensure minimal traffic/rankings are lost?
Thank you so much for your help.
-
Having performed upwards of 80 migrations without any real traffic loss lasting more than a week, I can say it's because I follow the rules very thoroughly. When you get to the bottom of the guide below, please use one of the crawlers mentioned, and
run a complete search and replace across the entire site where necessary, just to make sure everything's in place.
I don't know what type of website you're running, but if it is WordPress, or if you want to capture some extra traffic, I would make sure the blog sits in a subfolder rather than on a subdomain. If it is WordPress, you can do this on a managed host platform like Pagely, Servebolt, or Kinsta for around $50 a month.
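The site-wide search-and-replace check mentioned above can be scripted. Here is a minimal sketch in Python (the `site-export` directory and the legacy URL pattern are assumptions; adjust both to your own site) that scans exported HTML files for references to old blog URLs that still need updating:

```python
import re
from pathlib import Path

# Hypothetical pattern: legacy blog URLs on the main domain.
OLD_BLOG_PATTERN = re.compile(r"https?://(?:www\.)?mycompany\.com/blog/[^\s\"'<>]*")

def find_stale_links(site_root):
    """Return {file: [stale blog URLs]} for every HTML file under site_root."""
    stale = {}
    for path in Path(site_root).rglob("*.html"):
        matches = OLD_BLOG_PATTERN.findall(path.read_text(errors="ignore"))
        if matches:
            stale[str(path)] = sorted(set(matches))
    return stale

if __name__ == "__main__":
    for filename, urls in find_stale_links("./site-export").items():
        print(filename)
        for url in urls:
            print("  ", url)
```

Anything this flags is an internal link still pointing at the old blog location instead of blog.mycompany.com.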
Redirect mapping process
If you are lucky enough to work on a migration that doesn’t involve URL changes, you could skip this section. Otherwise, read on to find out why any legacy pages that won’t be available on the same URL after the migration should be redirected.
The redirect mapping file is a spreadsheet that includes the following two columns:
- Legacy site URL –> a page’s URL on the old site.
- New site URL –> a page’s URL on the new site.
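For illustration, a minimal mapping file with hypothetical URLs might look like this:

```text
Legacy site URL,New site URL
https://mycompany.com/blog/old-post,https://blog.mycompany.com/old-post
https://mycompany.com/blog/category/tips,https://blog.mycompany.com/category/tips
```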
When mapping (redirecting) a page from the old to the new site, always try mapping it to the most relevant corresponding page. In cases where a relevant page doesn’t exist, avoid redirecting the page to the homepage. First and foremost, redirecting users to irrelevant pages results in a very poor user experience. Google has stated that redirecting pages “en masse” to irrelevant pages will be treated as soft 404s and because of this won’t be passing any SEO value. If you can’t find an equivalent page on the new site, try mapping it to its parent category page.
Once the mapping is complete, the file will need to be sent to the development team to create the redirects, so that these can be tested before launching the new site. The implementation of redirects is another part in the site migration cycle where things can often go wrong.
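Testing the implemented redirects can be partly automated. The sketch below uses only the Python standard library (the mapping filename and column layout are assumptions matching the spreadsheet described above): it loads the mapping file and checks whether each legacy URL returns a 301 pointing at the expected destination.

```python
import csv
import urllib.error
import urllib.request

def load_mapping(path):
    """Read the two-column redirect mapping file into (legacy, new) pairs."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        return [(row[0], row[1]) for row in reader if len(row) >= 2]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # surface the 3xx response instead of following it

def check_redirects(pairs):
    """Yield (legacy URL, status, Location header, expected URL) per mapping row."""
    opener = urllib.request.build_opener(NoRedirect)
    for legacy, new in pairs:
        try:
            resp = opener.open(legacy)
            status, location = resp.status, resp.headers.get("Location", "")
        except urllib.error.HTTPError as e:
            status, location = e.code, e.headers.get("Location", "")
        yield legacy, status, location, new
```

Run against staging before launch: any row not returning a 301 to the mapped URL needs a second look.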
Increasing efficiencies during the redirect mapping process
Redirect mapping requires great attention to detail and needs to be carried out by experienced SEOs. The URL mapping on small sites could in theory be done by manually mapping each URL of the legacy site to a URL on the new site. But on large sites that consist of thousands or even hundreds of thousands of pages, manually mapping every single URL is practically impossible and automation needs to be introduced. Relying on certain common attributes between the legacy and new site can be a massive time-saver. Such attributes may include the page titles, H1 headings, or other unique page identifiers such as product codes, SKUs etc. Make sure the attributes you rely on for the redirect mapping are unique and not repeated across several pages; otherwise, you will end up with incorrect mapping.
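The attribute-based matching described above amounts to a simple join between the two crawls. A sketch, assuming you have exported each crawl as a URL-to-attribute lookup (the dict shapes here are hypothetical): it matches legacy URLs to new URLs on a shared attribute such as the H1, and refuses to match any attribute value that is repeated on either site, since repeated values would produce exactly the incorrect mapping warned about above.

```python
from collections import Counter

def map_by_attribute(legacy, new):
    """Match legacy URLs to new URLs via a shared unique attribute (e.g. H1).

    legacy, new: {url: attribute} dicts from each site's crawl export.
    Returns (mapping, unmatched): mapping is {legacy_url: new_url}, and
    unmatched lists legacy URLs that still need manual mapping.
    """
    # Attributes repeated on either site can't be trusted as identifiers.
    dupes = {a for a, n in Counter(legacy.values()).items() if n > 1}
    dupes |= {a for a, n in Counter(new.values()).items() if n > 1}
    new_by_attr = {a: url for url, a in new.items() if a not in dupes}

    mapping, unmatched = {}, []
    for url, attr in legacy.items():
        if attr in new_by_attr:
            mapping[url] = new_by_attr[attr]
        else:
            unmatched.append(url)
    return mapping, unmatched
```

The unmatched list is where the manual, experienced-SEO work then concentrates.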
Pro tip: Make sure the URL structure of the new site is 100% finalized on staging before you start working on the redirect mapping.
https://moz.com/blog/website-migration-guide
Appendix: Useful tools
Crawlers
- Screaming Frog: The SEO Swiss army knife, ideal for crawling small- and medium-sized websites.
- Sitebulb: Very intuitive crawler application with a neat user interface, nicely organized reports, and many useful data visualizations.
- Deep Crawl: Cloud-based crawler that can crawl staging sites, allows for comparisons between different crawls, and copes well with large websites.
- Botify: Another powerful cloud-based crawler supported by exceptional server log file analysis capabilities that can be very insightful in terms of understanding how search engines crawl the site.
- On-Crawl: Crawler and server log analyzer for enterprise SEO audits with many handy features to identify crawl budget, content quality, and performance issues.
Handy Chrome add-ons
- Web developer: A collection of developer tools including easy ways to enable/disable JavaScript, CSS, images, etc.
- User agent switcher: Switch between different user agents including Googlebot, mobile, and other agents.
- Ayima Redirect Path: A great header and redirect checker.
- SEO Meta in 1 click: An on-page meta attributes, headers, and links inspector.
- Scraper: An easy way to scrape website data into a spreadsheet.
Site monitoring tools
- Uptime Robot: Free website uptime monitoring.
- Robotto: Free robots.txt monitoring tool.
- Pingdom Tools: Monitors site uptime and page speed from real users (RUM service).
- SEO Radar: Monitors all critical SEO elements and fires alerts when these change.
- UltraDNS Tools: DNS lookup and monitoring utilities, useful when the migration involves a change to DNS.
Site performance tools
- New Relic: By far the most comprehensive site performance and monitoring tool listed here. The price is very steep, though; it's my favorite tool, but that doesn't mean it's required.
- PageSpeed Insights: Measures page performance for mobile and desktop devices. It checks to see if a page has applied common performance best practices and provides a score, which ranges from 0 to 100 points.
- Lighthouse: Handy Chrome extension for performance, accessibility, and Progressive Web App audits. Can also be run from the command line or as a Node module.
- Webpagetest.org: Very detailed page tests from various locations, connections, and devices, including detailed waterfall charts.
- DareBoost: Very helpful and accurate as well, covering everything you need to know.
Structured data testing tools
- Google’s structured data testing tool & Google’s structured data testing tool Chrome extension
- Bing’s markup validator
- Yandex structured data testing tool
- Google’s rich results testing tool
Mobile testing tools
Backlink data sources
I hope this helps,
Tom
-
You may want to:
1. Update your mycompany.com sitemap
2. Create an additional sitemap for your blog that sits on blog.mycompany.com
3. List both sitemaps or sitemap index files in your root robots.txt file
4. "Submit" the sitemaps to Google through Google Search Console. (I say "submit" because you really just point them to the URL. Their crawlers should find it regardless, however, this might make the discovery process swifter.)
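For illustration, with both sitemaps in place, step 3 in the root robots.txt on mycompany.com might look like this (hypothetical filenames):

```text
User-agent: *
Allow: /

Sitemap: https://mycompany.com/sitemap.xml
Sitemap: https://blog.mycompany.com/sitemap.xml
```

Note that blog.mycompany.com will serve its own robots.txt separately; listing the blog sitemap here relies on robots.txt allowing cross-host Sitemap references.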
In Google Search Console, you'll need to make sure you have claimed ownership (& verified) at the domain level. This will include your domain and new subdomain. It's up to you if you want to also claim ownership at the URL-prefix property so that blog.mycompany.com is broken out separately and can have the new blog sitemap added there. https://support.google.com/webmasters/answer/34592
Hope this helps!