Website Redesign & Ensuring Minimal Traffic/Rankings Loss
-
Hi there,
We have undergone a website redesign (mycompany.com) and our site is ready to go live. However, the new website is built on a different platform, so, to avoid a large web developer expense, none of our blog pages will be copied over.
Our intention is therefore to leave all the blog pages as they are (on the old web design) but move them to the subdomain blog.mycompany.com, with 301 redirects inserted on mycompany.com for each blog post pointing to the corresponding blog.mycompany.com URL. Is there anything else we should do to ensure minimal traffic/rankings are lost?
Thank you so much for your help.
-
Having performed maybe upwards of 80 migrations without any real traffic loss lasting more than a week, I can say it is because I follow the rules very thoroughly. When you get to the bottom of this, please use one of the crawlers mentioned, and run a complete search and replace across the entire site when necessary, just to make sure everything's in place.
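That site-wide search can be scripted rather than eyeballed. Below is a minimal sketch, assuming you have a local export of the site's files in a folder; the legacy URL pattern is a placeholder for your own domain and blog paths:

```python
import os
import re

# Hypothetical legacy blog URL pattern; adjust to your own domain/paths.
LEGACY_PATTERN = re.compile(r"https?://(?:www\.)?mycompany\.com/blog/[\w\-/]*")

def find_legacy_references(root):
    """Walk an exported copy of the site and list files that still link
    to the legacy blog URLs (these should now point at the
    blog.mycompany.com subdomain)."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith((".html", ".htm", ".css", ".js")):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="ignore") as fh:
                for match in LEGACY_PATTERN.findall(fh.read()):
                    hits.append((path, match))
    return hits
```

Anything it reports should either be rewritten to the new blog.mycompany.com URL or knowingly left to the 301s, with the understanding that each internal hit then costs an extra redirect hop.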
I don’t know what type of website you’re running, but if it is WordPress, or if you want to get some extra traffic, I would make sure that the blog sits in a subfolder rather than on a subdomain. If it is WordPress, you can do this on a managed host platform like Pagely, Servebolt, or Kinsta for around $50 a month.
Redirect mapping process
If you are lucky enough to work on a migration that doesn’t involve URL changes, you could skip this section. Otherwise, read on to find out why any legacy pages that won’t be available on the same URL after the migration should be redirected.
The redirect mapping file is a spreadsheet that includes the following two columns:
- Legacy site URL –> a page’s URL on the old site.
- New site URL –> a page’s URL on the new site.
When mapping (redirecting) a page from the old to the new site, always try mapping it to the most relevant corresponding page. In cases where a relevant page doesn’t exist, avoid redirecting the page to the homepage. First and foremost, redirecting users to irrelevant pages results in a very poor user experience. Google has stated that redirecting pages “en masse” to irrelevant pages will be treated as soft 404s and because of this won’t be passing any SEO value. If you can’t find an equivalent page on the new site, try mapping it to its parent category page.
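A quick way to sanity-check a mapping file along these lines is to load it and flag any rows that fall back to the homepage. This is a sketch assuming a simple two-column CSV (legacy URL, new URL), not a prescribed format:

```python
import csv
from urllib.parse import urlparse

def load_redirect_map(path):
    """Read a two-column redirect mapping CSV and flag rows that point
    at a homepage, which Google may treat as soft 404s when done en
    masse."""
    mapping, homepage_hits = {}, []
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.reader(fh):
            if len(row) != 2:
                continue  # skip blank or malformed rows
            legacy, new = row
            mapping[legacy] = new
            # A target whose path is empty or "/" is a homepage redirect.
            if urlparse(new).path in ("", "/"):
                homepage_hits.append(legacy)
    return mapping, homepage_hits
```

Rows flagged here are candidates for remapping to a parent category page instead of the homepage.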
Once the mapping is complete, the file will need to be sent to the development team to create the redirects, so that these can be tested before launching the new site. The implementation of redirects is another part in the site migration cycle where things can often go wrong.
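Once the redirects are live on staging, each row of the mapping file can be verified automatically instead of spot-checked by hand. Here is a rough standard-library sketch; the single-hop, exact-301 expectation is an assumption, so adjust it if your server legitimately chains redirects:

```python
import http.client
from urllib.parse import urlparse

def check_redirect(legacy_url, expected_url):
    """Request the legacy URL WITHOUT following redirects and verify it
    answers with a 301 whose Location header matches the mapping."""
    parts = urlparse(legacy_url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    try:
        conn.request("GET", parts.path or "/")
        resp = conn.getresponse()
        location = resp.getheader("Location", "")
        return resp.status == 301 and location == expected_url
    finally:
        conn.close()
```

Looping this over the whole mapping file before launch catches the classic failure modes: 302s where 301s were specified, redirect chains, and rows the developers missed.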
Increasing efficiencies during the redirect mapping process
Redirect mapping requires great attention to detail and needs to be carried out by experienced SEOs. The URL mapping on small sites could in theory be done by manually mapping each URL of the legacy site to a URL on the new site. But on large sites that consist of thousands or even hundreds of thousands of pages, manually mapping every single URL is practically impossible and automation needs to be introduced. Relying on certain common attributes between the legacy and new site can be a massive time-saver. Such attributes may include the page titles, H1 headings, or other unique page identifiers such as product codes, SKUs etc. Make sure the attributes you rely on for the redirect mapping are unique and not repeated across several pages; otherwise, you will end up with incorrect mapping.
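The attribute-based matching described above can be sketched in a few lines. The crawl-export format here is hypothetical (pairs of URL and attribute, e.g. an H1 or SKU pulled from crawls of each site); the important part is that non-unique attributes are excluded and reported rather than mapped:

```python
from collections import Counter

def map_by_attribute(legacy_pages, new_pages):
    """Match legacy URLs to new URLs by a shared attribute taken from
    two crawl exports. Attributes appearing on more than one page on
    either side are skipped, since a non-unique key would produce
    incorrect redirects."""
    legacy_counts = Counter(attr for _url, attr in legacy_pages)
    new_counts = Counter(attr for _url, attr in new_pages)
    new_index = {attr: url for url, attr in new_pages}
    mapping, ambiguous = {}, []
    for url, attr in legacy_pages:
        if legacy_counts[attr] > 1 or new_counts[attr] > 1:
            ambiguous.append(url)  # needs manual review
        elif attr in new_index:
            mapping[url] = new_index[attr]
    return mapping, ambiguous
```

The ambiguous bucket is where the experienced-SEO judgment comes back in: those URLs get mapped by hand.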
Pro tip: Make sure the URL structure of the new site is 100% finalized on staging before you start working on the redirect mapping.
https://moz.com/blog/website-migration-guide
Appendix: Useful tools
Crawlers
- Screaming Frog: The SEO Swiss army knife, ideal for crawling small- and medium-sized websites.
- Sitebulb: Very intuitive crawler application with a neat user interface, nicely organized reports, and many useful data visualizations.
- Deep Crawl: Cloud-based crawler that can crawl staging sites, compare different crawls, and copes well with large websites.
- Botify: Another powerful cloud-based crawler supported by exceptional server log file analysis capabilities that can be very insightful in terms of understanding how search engines crawl the site.
- On-Crawl: Crawler and server log analyzer for enterprise SEO audits with many handy features to identify crawl budget, content quality, and performance issues.
Handy Chrome add-ons
- Web developer: A collection of developer tools including easy ways to enable/disable JavaScript, CSS, images, etc.
- User agent switcher: Switch between different user agents including Googlebot, mobile, and other agents.
- Ayima Redirect Path: A great header and redirect checker.
- SEO Meta in 1 click: An on-page meta attributes, headers, and links inspector.
- Scraper: An easy way to scrape website data into a spreadsheet.
Site monitoring tools
- Uptime Robot: Free website uptime monitoring.
- Robotto: Free robots.txt monitoring tool.
- Pingdom tools: Monitors site uptime and page speed from real users (RUM service).
- SEO Radar: Monitors all critical SEO elements and fires alerts when these change.
- UltraDNS Tools: Monitors DNS changes.
Site performance tools
- New Relic: By far the most comprehensive site performance and measurement tool listed. However, the price is very steep; it's my favorite tool, but that doesn't mean it's required.
- PageSpeed Insights: Measures page performance for mobile and desktop devices. It checks to see if a page has applied common performance best practices and provides a score, which ranges from 0 to 100 points.
- Lighthouse: Handy Chrome extension for performance, accessibility, Progressive Web Apps audits. Can also be run from the command line, or as a Node module.
- Webpagetest.org: Very detailed page tests from various locations, connections, and devices, including detailed waterfall charts.
- DareBoost: Very helpful and accurate as well, covering everything you need to know.
Structured data testing tools
- Google’s structured data testing tool (also available as a Chrome extension)
- Bing’s markup validator
- Yandex structured data testing tool
- Google’s rich results testing tool
Mobile testing tools
Backlink data sources
I hope this helps,
Tom
-
You may want to:
1. Update your mycompany.com sitemap
2. Create an additional sitemap for your blog that sits on blog.mycompany.com
3. List both sitemaps or sitemap index files in your root robots.txt file
4. "Submit" the sitemaps to Google through Google Search Console. (I say "submit" because you really just point them to the URL. Their crawlers should find it regardless, however, this might make the discovery process swifter.)
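Step 3 can be generated rather than hand-edited. A small sketch follows (the sitemap URLs are placeholders for your own); note that a robots.txt file may reference sitemaps on other hosts, which is what lets the root file point at the blog subdomain's sitemap:

```python
def robots_with_sitemaps(sitemap_urls, rules="User-agent: *\nAllow: /"):
    """Build robots.txt content that lists every sitemap (or sitemap
    index) so crawlers can discover both the main site's and the blog
    subdomain's sitemaps from one file."""
    lines = [rules, ""]
    lines += [f"Sitemap: {url}" for url in sitemap_urls]
    return "\n".join(lines) + "\n"
```

For example, calling it with both `https://www.mycompany.com/sitemap.xml` and `https://blog.mycompany.com/sitemap.xml` yields a robots.txt covering the whole setup in steps 1 through 3.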
In Google Search Console, you'll need to make sure you have claimed ownership (& verified) at the domain level. This will include your domain and new subdomain. It's up to you if you want to also claim ownership at the URL-prefix property so that blog.mycompany.com is broken out separately and can have the new blog sitemap added there. https://support.google.com/webmasters/answer/34592
Hope this helps!