Website Redesign & Ensuring Minimal Traffic/Ranking Loss
-
Hi there,
We have undergone a website redesign (mycompany.com) and our site is ready to go live. However, the new website is built on a different platform, so our blog pages will not be copied over - this avoids a large web developer expense.
Our intention is therefore to leave all the blog pages as they are (on the old web design) but move them to the subdomain blog.mycompany.com, with 301 redirects inserted on mycompany.com for each blog post pointing to the corresponding URL on blog.mycompany.com. Is there anything else we should do to ensure minimal traffic/rankings are lost?
Thank you so much for your help.
-
Having performed maybe upwards of 80 migrations without any real traffic loss lasting more than a week, I can say it is because I follow the rules very thoroughly. When you get to the bottom of this, do please use one of the crawlers mentioned, and run a complete search and replace across the entire site when necessary, just to make sure everything's in place.
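To illustrate that final check, here is a minimal sketch in Python (the export directory and legacy URL pattern are hypothetical - substitute your own) that scans a local copy of the new site for any files still referencing the old blog URLs:

```python
import os
import re

# Hypothetical values -- point these at your own site export and legacy host.
SITE_EXPORT_DIR = "site_export"
LEGACY_PATTERN = re.compile(r"mycompany\.com/blog/")

def find_stale_references(root_dir):
    """Walk the exported site files and report lingering legacy blog URLs."""
    for dirpath, _dirnames, filenames in os.walk(root_dir):
        for name in filenames:
            if not name.endswith((".html", ".css", ".js", ".xml")):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="ignore") as fh:
                for lineno, line in enumerate(fh, start=1):
                    if LEGACY_PATTERN.search(line):
                        print(f"{path}:{lineno}: {line.strip()}")

if __name__ == "__main__":
    find_stale_references(SITE_EXPORT_DIR)
```

Any hits it prints are internal references the search and replace missed.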
I don’t know what type of website you’re running, but if it is WordPress, or if you want to get some extra traffic, I would make sure that the blog is a subfolder. If it is WordPress, you can do this on a managed host platform like Pagely, Servebolt, or Kinsta for just $50 a month.
Redirect mapping process
If you are lucky enough to work on a migration that doesn’t involve URL changes, you could skip this section. Otherwise, read on to find out why any legacy pages that won’t be available on the same URL after the migration should be redirected.
The redirect mapping file is a spreadsheet that includes the following two columns (a short example follows the list):
- Legacy site URL –> a page’s URL on the old site.
- New site URL –> a page’s URL on the new site.
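For instance, a couple of rows might look like this (the URLs are hypothetical):

```
Legacy site URL,New site URL
https://mycompany.com/blog/some-post/,https://blog.mycompany.com/some-post/
https://mycompany.com/old-services.php,https://mycompany.com/services/
```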
When mapping (redirecting) a page from the old to the new site, always try mapping it to the most relevant corresponding page. In cases where a relevant page doesn’t exist, avoid redirecting the page to the homepage. First and foremost, redirecting users to irrelevant pages results in a very poor user experience. Google has stated that redirecting pages “en masse” to irrelevant pages will be treated as soft 404s and, because of this, won’t pass any SEO value. If you can’t find an equivalent page on the new site, try mapping it to its parent category page.
Once the mapping is complete, the file will need to be sent to the development team to create the redirects, so that these can be tested before launching the new site. The implementation of redirects is another part in the site migration cycle where things can often go wrong.
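Once the redirects are in place on staging, they can be spot-checked programmatically. Below is a minimal sketch in Python (assuming the `requests` library and a `redirect_map.csv` file with the two columns described above - both names are hypothetical) that verifies each legacy URL returns a 301 pointing at the mapped destination:

```python
import csv

import requests

def check_redirects(mapping_file="redirect_map.csv"):
    """Verify that each legacy URL 301s to its mapped new URL."""
    with open(mapping_file, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            legacy, expected = row["Legacy site URL"], row["New site URL"]
            # Don't follow redirects automatically -- we want to inspect the
            # status code and Location header of the first hop ourselves.
            resp = requests.get(legacy, allow_redirects=False, timeout=10)
            location = resp.headers.get("Location")
            if resp.status_code != 301:
                print(f"NOT A 301 ({resp.status_code}): {legacy}")
            elif location != expected:
                print(f"WRONG TARGET: {legacy} -> {location}")

if __name__ == "__main__":
    check_redirects()
```

A crawler run in list mode can perform the same audit; the script is just a quick way to re-test after every deployment.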
Increasing efficiencies during the redirect mapping process
Redirect mapping requires great attention to detail and needs to be carried out by experienced SEOs. The URL mapping on small sites could in theory be done by manually mapping each URL of the legacy site to a URL on the new site. But on large sites that consist of thousands or even hundreds of thousands of pages, manually mapping every single URL is practically impossible and automation needs to be introduced. Relying on certain common attributes between the legacy and new site can be a massive time-saver. Such attributes may include the page titles, H1 headings, or other unique page identifiers such as product codes, SKUs etc. Make sure the attributes you rely on for the redirect mapping are unique and not repeated across several pages; otherwise, you will end up with incorrect mapping.
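As a rough sketch of that kind of automation, the snippet below joins two crawl exports on the page title and flags titles that are not unique (it assumes CSV exports with "Address" and "Title 1" columns, as Screaming Frog produces - adjust the column names to whatever your crawler uses):

```python
import csv
from collections import defaultdict

def load_titles(crawl_csv):
    """Map each page title to the list of URLs that carry it."""
    titles = defaultdict(list)
    with open(crawl_csv, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            titles[row["Title 1"].strip()].append(row["Address"])
    return titles

def build_mapping(legacy_csv, new_csv):
    legacy, new = load_titles(legacy_csv), load_titles(new_csv)
    for title, old_urls in legacy.items():
        new_urls = new.get(title, [])
        if len(old_urls) == 1 and len(new_urls) == 1:
            print(f"{old_urls[0]},{new_urls[0]}")  # one-to-one match
        elif not new_urls:
            print(f"{old_urls[0]},NO MATCH -- map manually")
        else:
            print(f"# AMBIGUOUS, title repeated: {title}")

build_mapping("legacy_crawl.csv", "new_crawl.csv")
```

Anything flagged as unmatched or ambiguous still needs a human decision; the automation only clears away the easy one-to-one cases.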
Pro tip: Make sure the URL structure of the new site is 100% finalized on staging before you start working on the redirect mapping.
https://moz.com/blog/website-migration-guide
Appendix: Useful tools
Crawlers
- Screaming Frog: The SEO Swiss army knife, ideal for crawling small- and medium-sized websites.
- Sitebulb: Very intuitive crawler application with a neat user interface, nicely organized reports, and many useful data visualizations.
- Deep Crawl: Cloud-based crawler with the ability to crawl staging sites and compare different crawls; copes well with large websites.
- Botify: Another powerful cloud-based crawler supported by exceptional server log file analysis capabilities that can be very insightful in terms of understanding how search engines crawl the site.
- On-Crawl: Crawler and server log analyzer for enterprise SEO audits with many handy features to identify crawl budget, content quality, and performance issues.
Handy Chrome add-ons
- Web developer: A collection of developer tools including easy ways to enable/disable JavaScript, CSS, images, etc.
- User agent switcher: Switch between different user agents including Googlebot, mobile, and other agents.
- Ayima Redirect Path: A great header and redirect checker.
- SEO Meta in 1 click: An inspector for on-page meta attributes, headers, and links.
- Scraper: An easy way to scrape website data into a spreadsheet.
Site monitoring tools
- Uptime Robot: Free website uptime monitoring.
- Robotto: Free robots.txt monitoring tool.
- Pingdom tools: Monitors site uptime and page speed from real users (RUM service).
- SEO Radar: Monitors all critical SEO elements and fires alerts when these change.
- UltraDNS Tools: Monitors for changes to your DNS records.
Site performance tools
- New Relic: By far the most comprehensive site performance and measurement tool listed here. However, the price is very steep; it's my favorite tool, but that doesn't mean it's required.
- PageSpeed Insights: Measures page performance for mobile and desktop devices. It checks to see if a page has applied common performance best practices and provides a score, which ranges from 0 to 100 points.
- Lighthouse: Handy Chrome extension for performance, accessibility, and Progressive Web App audits. Can also be run from the command line, or as a Node module.
- Webpagetest.org: Very detailed page tests from various locations, connections, and devices, including detailed waterfall charts.
- DareBoost: Very helpful and accurate as well, covering just about everything you need to know.
Structured data testing tools
- Google’s structured data testing tool & Google’s structured data testing tool Chrome extension
- Bing’s markup validator
- Yandex structured data testing tool
- Google’s rich results testing tool
I hope this helps,
Tom
-
You may want to:
1. Update your mycompany.com sitemap
2. Create an additional sitemap for your blog that sits on blog.mycompany.com
3. List both sitemaps or sitemap index files in your root robots.txt file (see the example after this list)
4. "Submit" the sitemaps to Google through Google Search Console. (I say "submit" because you really just point them to the URL. Their crawlers should find it regardless, however, this might make the discovery process swifter.)
In Google Search Console, you'll need to make sure you have claimed (and verified) ownership at the domain level. This will include your domain and new subdomain. It's up to you if you want to also claim ownership as a URL-prefix property so that blog.mycompany.com is broken out separately and can have the new blog sitemap added there. https://support.google.com/webmasters/answer/34592
Hope this helps!
Related Questions
-
How does Google rank a "Site:yourexamplesite.com" Query
Hi All, Sorry for the potentially confusing title. I am trying to find out how Google ranks the pages of your site when you search "site:yourwebsite.com". When I did this with my website, I was surprised by what pages showed up on the first page: there were sub-category pages in the top 5 results and top-level category pages that weren't on the first page. I have been unable to find information on how Google returns these results. Is it the same algorithm/factors that make pages rank highly in a regular search, or does it have something to do with how recently Google crawled these pages? Any feedback would be helpful. Additionally, if anyone has worked through a similar scenario, I would be interested to know if there were any insights you gained from finding out which of your pages Google returned first. Thanks for the help! Jason
Web Design | Jason-Reid
-
Drop in rankings after AMP implementation because of lack of Facebook comments
Hi, we are adding AMP versions of our site's pages, but one of the things we can't include in our AMP version is the Facebook comment box. Some of our articles have hundreds of comments on them, and we noticed that Google was crawling those comments and using them as a ranking signal (the more comments, the better, we discovered). Now we are wondering if these articles would drop if we launch the AMP version without the comment box, as this would greatly reduce the written content on those pages. Has anybody tested this before, or does anyone have an idea of how that would work out? Thanks for your help!
Web Design | guidetoiceland
-
Separating Different Parts Of The Website
Hi There, I have a client with two parts to his business, both for different types of customer, with different language and copy needed. At the moment they have one website, and I am trying to figure out the most search-engine-friendly way to present these different types. So, for example, if a client came in looking for service A, he would see the home page for service A, and if he came in looking for service B he would come across the home page for service B. I know I could have separate service pages for each service he provides, but I think it would be off-putting to come to the home page of a site and see completely unrelated services on one page. I hope I am explaining myself here. As far as I can see, the options are: subdomains (servicea.examplesite.com, serviceb.examplesite.com); a split page (see attachment) where you click which you are interested in (don't like this idea); separate websites; or a home page which shows all the services (too confusing). Any advice would be most grateful. Regards, Neil
Web Design | nezona
-
What are the most common reasons for a website being slow to load
I've been advised that too many requests are being sent (presumably to the server?). How can I reduce these, and where else should I look to increase speed?
Web Design | FBS
-
Image with 100% width/height - bad ranking?
Hi, we have some articles like this: http://www.schicksal.com/Orakel/Freitag-13 The main image has a width of 100% and a height of 100%. Today, I discovered that GWT Instant Preview has some trouble rendering the page. We have CSS rules to deliver the image with the right dimensions. If a bot like Google is not sending any screen height/width, we assume the screen size is 2560x1440. Does this harm the ranking of the page? (Content starts below the fold/image.) What is a "default" screen size for Google? How do they determine if something is "above the fold"? Any tips or ideas? Best wishes, Georg.
Web Design | GeorgFranz
-
Pagerank and SERP rankings going downhill after site update
Our site underwent a major update in September 2012. We put the entire site in WordPress and did away with our static pages. Then, in February 2013, we moved our shopping cart pages from a subdomain to our main domain (in WordPress). In both cases, we had to implement a massive 301 redirect through htaccess, as most of our URLs changed with the update. Our site consists of the shopping cart (WooCommerce), blog, and supporting pages. We noticed traffic starting to drop around the last week of November (2012) and it has steadily declined ever since. None of our shop pages have a pagerank, with virtually all of them showing a gray bar with a question mark. Only the shop homepage has some pagerank -- down from 4 previously to 2 now. For some of the words we used to rank very well for before, we don't even show in the first five pages anymore. At first, we thought it was a temporary situation that would self-correct over time, but it doesn't seem to get better at all. All said, we have lost over 80% of our traffic from Google organic. Upon repeated reviews, the 301 redirects seem to be done correctly and we don't see any serious mistakes that could cause such a huge drop. So the question is: are we missing something? Are we not looking at the right places? Any ideas where we might start looking? We're simply looking for ideas and a fresh perspective.
Web Design | bizmanuals
-
http://www.domain.com/services/city or http://www.domain.com/city
Hi, I have a website created on a WordPress platform. My site is in the "service" industry and we perform the same service for many different "cities." My question pertains to the navigation bar and SEO. Is it better to have a "Service" tab on the navigation bar that has sub-navigation listing all the cities in a drop-down menu? When this happens, the URL string looks like http://www.domain.com/service/city. The other choice would be to create individual tabs on the nav bar; by doing this, the URL string would look like http://www.domain.com/city. I could be wrong, but I am assuming http://www.domain.com/city is better than http://www.domain.com/services/city for SEO purposes. If I am correct, is there a way to make sub-menu URLs appear as http://www.domain.com/city? Any input, as always, would be appreciated. Best regards, Jimmy
Web Design | jimmy0225
-
New Website Redesign: Any Design Comments or SEO Suggestions?
Hi! We recently launched our new website after MONTHS of work. Now that it is live, we are looking to fine tune the design and SEO efforts. This is our new website. And for reference, this is our old website. Any and all comments on design and SEO would be greatly appreciated. Thank you for your help! Mike
Web Design | Mike.Goracke