Website Redesign and Migration to Squarespace killed my Ranking
-
My old website was dated, ugly, impossible to update, and a mess of hard-coded pages mixed with WP, but we were ranking #1 in organic search for our keywords.
I just redesigned my website using Squarespace. I kept most of the same text on the pages (for the keywords) and kept the same meta tags and title tags for each page as much as possible.
Once I was satisfied that I had done as much on-page optimization as I could, I updated the DNS records at our domain registrar to point to our new website on the Squarespace host. And our new website was live!
...Then I watched in dismay as our ranking fell into oblivion.
I think this might have something to do with not doing any 301 redirects from the old website and losing all of my link juice.
Is this the case? And, if so, how do I fix it?
Our website url is www.kanataskinclinic.ca
Thanks
-
Really sorry to hear the new site is still struggling, Nicolas. In some quick indexing tests, it certainly appears that your blog posts are being indexed, and I'm seeing them in the search results for the specific post titles. [See screenshot attached] It's possible this may have picked up in the 10 days since you posted this.
I'm assuming you submitted your site's XML sitemap to the correct www version of your property in Google Search Console? What does the sitemaps report indicate as far as the number of pages indexed compared to the number submitted?
Certainly one of the tradeoffs of a site tool like Squarespace is that you have far less control of the code to implement technical SEO, but it shouldn't be so problematic that you lose rankings completely.
If you're interested, I'd be happy to have a short Skype chat to try to narrow down the issues. You can send me a private message through my account here at Moz.
Paul
-
Why thank you! Did some of that info help you out as well?
p.
-
Hi Paul,
Thanks for your detailed response. And sorry for the delay in my reply. I am currently focused on updating my knowledge on SEO as much as possible so that I can figure out what happened to our website ranking.
I have re-directed all the major pages on my website, and continue to redirect pages as I come across them, but there are fewer and fewer and I don't think they're very important.
I did discover some other troubling problems, though: I tried using the "view as text" feature of Google's cache to see how Google sees our site... and it looks terrible!
- There is a lot of duplicate content, including page titles, which is horrifying.
- Duplicate and even triplicate content arises where I used their carousels and sliders.
- The images do not have their original filenames, just some generic Squarespace "static" name.
- There is little, if any, alt text, and I'm not always sure where it comes from, as I didn't put it there.
To make matters worse, even though our website has been live for over a month, and I have submitted it to Google for indexing several times already, my blog posts still do not show up on Google. Even the ones that are featured on our home page. Even if you type their titles right into Google's search bar.
Ugh! It took me several months to build our new website, and I was very proud of the result. It looks beautiful. But, if it's that ugly to Google, I'm going to have to look for other options.
-
Paul, you rock!
-
Most welcome - happy to see another Canuck hereabouts! My mom's just down the road from you in Arnprior.
If you were able to get the initial broken links review corrected within two weeks, that's great - not much of the authority should have dissipated.
One thing to note, however - the 404s in Search Console aren't the only ones that need to be fixed; they're just the ones Google has noticed and alerted you to so far. The problem with relying on GSC to tell you what to fix is that, by the time an error shows up in the Console, Google has already hit the 404 and been told the page is missing. It's totally reactive, instead of proactive.
I'd strongly recommend you also use the other methods I mentioned to get more of the old URLs found and redirected before Google notices they're borked. It's also ideal if you can get the owners of the most valuable sites that linked to you in the past to update their links to point directly to the new pages. It's a bit of a battle, but even just a few updated links can make a difference.
As far as how long until "link juice starts flowing" by which I assume you mean "rankings and traffic start to return" - the only real answer is "it depends", I'm afraid. You'll want to submit your new sitemap to GSC so that you can track the progress of the indexing of your new site's pages (on Squarespace, the URL to submit is www.kanataskinclinic.ca/sitemap.xml). You should also do this for Bing's Webmaster Tools.
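If you want a quick sanity check that the sitemap actually resolves and lists the pages you expect before you submit it, a rough Python sketch along these lines would do it (I'm assuming the standard Squarespace sitemap location mentioned above; adjust the URL if yours differs):

```python
# Quick sanity check of a sitemap before submitting it to Search Console.
# Assumes the standard Squarespace /sitemap.xml location; switch to https if the site forces it.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.kanataskinclinic.ca/sitemap.xml"

resp = requests.get(SITEMAP_URL, timeout=30)
resp.raise_for_status()  # fail loudly if the sitemap itself is missing

root = ET.fromstring(resp.content)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]

print(f"{len(urls)} URLs listed in the sitemap")
for u in urls[:10]:  # preview the first few entries
    print(" ", u)
```

If the count is wildly lower than the number of pages on the new site, that's worth investigating before anything else.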
Since URLs and content have changed, it's going to take some time for your site to fully re-index and for Google to understand the value of the new content. Could be from a few weeks to a month. It would also help to submit a few of the main section pages using the Fetch as Google tool in GSC and to get some new, strong incoming links to the site's pages. Good social links, and at least one or two new ones from relevant sites.
Lemme know if you have further ??s
Paul
-
Just to clarify, our new website was just launched two weeks ago. Hopefully, I haven't lost too much power from the old links?
-
Hi P.
Thank you so much for your informative reply.
I took your advice and went into my Google Search Console and checked for 404 errors. Google made it easy by listing all of the broken URLs. They even thoughtfully allowed me to download the list into a Google Doc so that I could keep track of my work as I fixed them.
I then went into my SquareSpace control panel and added 301 redirects for the broken pages. It was surprisingly easy to do, and they provided very clear, step-by-step instructions to help.
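In case it helps anyone else doing this in bulk, here's a rough Python sketch that turns a simple two-column CSV of old and new paths into the one-rule-per-line format the URL Mappings panel takes. The "/old-path -> /new-path 301" syntax is just my reading of Squarespace's help docs, so double-check it there before pasting anything in:

```python
# Rough helper: turn a CSV of "old_path,new_path" rows into Squarespace URL-mapping lines.
# The "/old-path -> /new-path 301" rule format is an assumption based on Squarespace's
# URL Mappings documentation - verify against their current help pages before using it.
import csv

def mapping_lines(csv_path):
    lines = []
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            if len(row) != 2:  # skip blank or malformed rows
                continue
            old_path, new_path = row
            lines.append(f"{old_path.strip()} -> {new_path.strip()} 301")
    return lines

if __name__ == "__main__":
    # "redirects.csv" is a hypothetical file with one old path and one new path per row
    for line in mapping_lines("redirects.csv"):
        print(line)
```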
After doing this, I checked my Google Search Console to see if anything had changed. I was pleasantly surprised to see that the redirects were working immediately. Wow, fast!
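For anyone who wants to verify the redirects outside of Search Console, a quick sketch like this confirms that each old URL now answers with a 301 and a sensible Location header (the example paths below are just placeholders):

```python
# Spot check: confirm old URLs return a 301 and show where they now point.
# The paths in OLD_PATHS are hypothetical placeholders - substitute real old URLs.
import requests

BASE = "http://www.kanataskinclinic.ca"
OLD_PATHS = ["/services/laser-hair-removal.html", "/about-us.html"]

for path in OLD_PATHS:
    resp = requests.get(BASE + path, allow_redirects=False, timeout=30)
    target = resp.headers.get("Location", "(no Location header)")
    print(f"{path} -> {resp.status_code} {target}")
```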
Now that this problem is resolved, how soon does the link juice start flowing? In other words, how soon do you think it will affect our ranking?
-
And just to add - there is a shelf life on recovering all that page and link equity. The longer those old URLs 404, the more of the power of the old pages will erode away.
Two months is a long time - don't dally; get those redirects in place immediately.
P.
-
Regardless of the possible issues with the new design, yes, if you changed to new URLs on the new site and didn't implement correct 301-redirects from all the old URLs, you have essentially thrown away all the ranking authority and inbound links contributed by all of your old pages except the home page.
Since most homepages only rank for a small portion of the total number of terms for an established website, that's the primary cause of your immediate problem.
To fix it, you have some hard work ahead of you: capture as many of the old URLs as possible and write redirects to the new URLs. These old pages can be captured in a number of ways. The easiest initial method is to look up all the 404 errors in your Google Search Console, sort them by date, then start fixing all the ones that appeared after the date of the site change.
You can also use your Analytics data - create a report of all the page URLs on your site that received traffic in the year before the change, then sort them by highest traffic to prioritise where to start creating redirects. You can also capture the current 404 errors in your Analytics data to surface high-priority pages that still need to be redirected.
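If you're comfortable with a bit of scripting, a rough sketch like this can combine those two exports - the GSC 404 list and an Analytics landing-page report, both saved as CSVs - and rank the broken URLs by the traffic they used to get, so you know which redirects to write first. The file and column names below are placeholders; match them to your actual exports:

```python
# Sketch: rank 404ing URLs from a Search Console export by their old Analytics traffic
# so the highest-value redirects get written first. File and column names are placeholders.
import csv

def load_traffic(analytics_csv):
    """Map page URL -> sessions from an Analytics landing-page export."""
    traffic = {}
    with open(analytics_csv, newline="") as f:
        for row in csv.DictReader(f):
            traffic[row["Page"]] = int(row["Sessions"].replace(",", ""))
    return traffic

def prioritized_404s(gsc_csv, analytics_csv):
    """Sort the broken URLs by how much traffic they received before the migration."""
    traffic = load_traffic(analytics_csv)
    with open(gsc_csv, newline="") as f:
        urls = [row["URL"] for row in csv.DictReader(f)]
    return sorted(urls, key=lambda u: traffic.get(u, 0), reverse=True)

if __name__ == "__main__":
    for url in prioritized_404s("gsc_404s.csv", "analytics_pages.csv"):
        print(url)
```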
For a final, more high-tech solution, you can use the Screaming Frog SEO Spider to crawl the archive.org Wayback Machine version of your site and capture as many of the old URLs as possible.
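As an alternative to crawling the archive with Screaming Frog, the Wayback Machine's CDX API can hand you a de-duplicated list of old URLs directly. Here's a minimal sketch - the endpoint and parameters are my understanding of the CDX API, so treat it as a starting point rather than gospel:

```python
# Minimal sketch: pull old URLs for the site from the Wayback Machine's CDX API.
# Endpoint and parameters reflect my understanding of the API - verify before relying on it.
import requests

CDX = "http://web.archive.org/cdx/search/cdx"
params = {
    "url": "www.kanataskinclinic.ca/*",  # prefix match: everything captured under the site
    "output": "json",
    "fl": "original",                    # return just the original URL field
    "collapse": "urlkey",                # one row per unique URL
    "filter": "statuscode:200",          # only captures that resolved at the time
}

rows = requests.get(CDX, params=params, timeout=60).json()
old_urls = [row[0] for row in rows[1:]]  # first row of the JSON output is the header

print(f"Found {len(old_urls)} archived URLs")
for url in old_urls[:20]:
    print(" ", url)
```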
Hope all that makes sense?
Paul
-
Not using 301s could be a big part of the problem. Do your old backlinks all point to pages that still exist on the new site?
-
Hello,
According to the Wayback Machine, you migrated your website after April 13:
https://web.archive.org/web/20160413071802/http://www.kanataskinclinic.ca/
We can clearly see that you changed everything! Design, photos... so you changed the UX! Text is important, but Google also takes user engagement into account, and if users aren't reassured by your new design, you will never get your positions back. In the old design the buttons are clearly identifiable, it's simpler to navigate, and some menus, like "Why choose us?", are useful. I think you'd be better off improving the old design and navigation and forgetting the new one!