Domain Authority hasn't recovered since August
-
I really need some major advice on this one. Back in September, I asked a question on here as follows:
"A client wanted to change their domain name, which we have now done. The site content itself is exactly the same. We put 301 redirect links in so that Google searchers would redirect from the old site to the new one. However Moz then said that it couldn't crawl the old domain because of the redirects and advised creating a brand new campaign for the new domain. We have done this but now Moz says that the domain authority of the new site is 2 (it was 14 on the old domain)." My original question and the answers I got are here: https://moz.com/community/q/new-domain-wipes-out-domain-authority).
Generally the responses I got were that we should give Moz time to crawl the new domain and process all the "new" pages.
It is now February, i.e. six months after the domain rename, and in Moz the site still has a DA of 2. Six months seems like plenty of time to wait. We checked all the recommended guides and believe we have done everything correctly.
I really don't know what to do now. Can anyone help or have a quick look and work out why this is so bad?
Specifics are:
old domain: https://ryemeadcleaning.co.uk
new domain: https://ryemeadgroup.co.uk
Thanks for your responses, Maureen
-
From what I know, when you alter your site to be 'faster', you sometimes have to wait a few days for that to start showing up in page-loading speeds. I'm fairly sure that if you have server-side caching enabled, and resources were previously cached uncompressed, the old resources can continue being served to visitors for days (or even weeks) after the changes are made
This is certainly true of image compression (the old JPG / PNG files continue to be served after being replaced with more highly compressed versions, because the cache hasn't refreshed yet) - I'm not sure whether the same applies to GZip-compressed files (sorry!)
From what I understand, page-speed optimisation is not a straightforward, linear process. Many changes you could make benefit 'returning' visitors whilst making the site slower for first-time visitors, and the reverse is also true. Because of these competing trade-offs, it's often tricky to get the best of both worlds. For example, one common recommendation is to take all your inline (in-source) CSS and JS and place it in '.css' or '.js' files which are linked to by your web pages
Because most pages will call in the 'separated out' CSS or JS files as a kind of external common module (library), once a user has cached the CSS or JS it doesn't have to be downloaded again. This benefits returning site users. On the flip side, because external files have to be pulled in and referenced on the first load (and because they often contain more CSS / JS than a given page needs), first-time users take a hit. As you can see, these are tricky waters to navigate, and Google still doesn't make it clear whether they prefer faster speeds for returning or first-time users. In my experience, their bias leans more towards satisfying first-time users
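To make that concrete, here's a minimal before-and-after sketch of what I mean - the file names and paths are purely illustrative, not taken from your site:
```html
<!-- Before: styles and scripts embedded in the page, re-downloaded with every HTML response -->
<head>
  <style>/* page styles inlined here */</style>
  <script>/* page scripts inlined here */</script>
</head>

<!-- After: shared external files - returning visitors load these from cache, but
     first-time visitors pay extra requests and may download more than the page needs -->
<head>
  <link rel="stylesheet" href="/assets/site.css">
  <script src="/assets/site.js" defer></script>
</head>
```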
Some changes you make, like compressing image files (and making them smaller), benefit both groups - just be wary of recommendations which boost one user group's experience at the expense of the other
For image compression, I'd recommend running all your images (download them all via FTP to preserve the folder structure) through something like Kraken. I tend to use the 'lossy' compression algorithm, which is still relatively lossless in terms of quality (I can't tell the difference, anyway). Quite often developers will tell me that a 'great' WordPress plugin has been installed to compress images effectively. In almost all cases, Kraken does a 25%-50% better job. This is because WP plugins are designed to run on a server which is also hosting the main site and serving web traffic; as such, these plugins are coded not to use too much processing power, and they fail to achieve a good level of compression. I'm afraid there's still no substitute for a purpose-built tool and some FTP file-swapping :') Remember though, even when the images are replaced, the cache will have to cycle before you'll see gains...
Hope that helps
-
Just to add that we also added GZIP compression and page caching to speed it up, but so far the gains appear to be small.
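In case it's useful to anyone reading later: on an Apache host this can be done via .htaccess with mod_deflate, roughly like the sketch below (the MIME-type list is illustrative, and a caching plugin can also handle this rather than the .htaccess file):
```apache
# Illustrative only - enable GZip compression for common text-based responses
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```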
Again, really appreciate the time you've taken. Have a great weekend.
-
I typed up a response but it seems to have disappeared. Thank you so much for your very comprehensive message about this, I really do appreciate it. I am going to digest all this information over the weekend and discuss with my colleagues to see if we can make a plan to improve the site.
We did some work today and reduced the page load time by about 60% by removing some plug-ins, and that lifted the site score from an F to a D - you must have checked after we'd done it. But sure, a D is still not great!
Perhaps LCN isn't the right environment then; I believe the site was already hosted there when we took it over last year. We'll have to look at this. Usually we host our WordPress sites on WP Engine and they seem pretty good, but this one is an exception.
Thanks again and have a great weekend!
-
It may help, but a lot of plugins burden the server more on the back-end than they do on the front end. Still, it all helps! In this case though, I wouldn't expect anything noticeable in that area (at all)
Right... I wrote WAY more below than I had anticipated writing. But I think I have demonstrated pretty conclusively that yes, site performance is your main hindrance. I may use some dramatic language below, it's just because I'm passionate about search and SEO :') so please don't be offended. That being said, yeah the situation is pretty bad
It (the site) actually is very slow and laggy, which could have a lot to do with it if site performance has decreased. Using my favoured page-speed tool, GTmetrix, you can easily see that the scores are pretty bad all around. Here's a screenshot
If you look at the waterfall chart it generates (needs a free account only, no payment details required), you can see that the request "GET ryemeadgroup.co.uk" occurs three times and seems to take ages and ages to respond. Looking at the data at a glance, I can't tell if that's just the request to fetch the whole page (in which case it would obviously be the longest) or something else. If that is what it is, I don't get why it recurs three times
You could optimise all your images. You could set up GZip compression and do lots of other things, but the fact is the server environment is just horribly, horribly weak. I ran a very small stress test, only a few minutes long, which I had to cancel almost immediately. Even crawling the site at a few URLs per second stops it rendering and causes it to time out! If I had left the test running it could have hurt the site or taken it offline, so I stopped the crawler before that happened
By the way, this is a crawler designed for low-intensity SEO crawling, not actual stress-testing or DDoS simulation. From what I can see here, any user with a reasonable home broadband connection (maybe BT Infinity or one of the Virgin cable deals) could, if they wanted to, take your site offline just like that. Someone trying to do it maliciously would use much more aggressive tools and crawling techniques
The crawl delay has probably been set in your robots.txt to compensate for this. But what that means is that, with the crawl delay on, Google can't index your site and content fast enough. With the delay removed, they still won't be able to, because even their basic, non-intensive crawling will take your site offline in seconds or minutes
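For clarity, the kind of line I mean in robots.txt looks like the sketch below - the value of 10 is just an example, not what's actually in your file:
```
User-agent: *
Crawl-delay: 10
```
Whatever the number is, it's worth reviewing (or removing) once the server can actually cope with normal crawling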
Obviously when a new site goes live, even with the same files in the same server environment, it slows down for a bit (while the cache builds back up again). Whilst that could be part of the problem, the main problem is that no matter what you do, that server environment is only fit for a hobbyist, not a fully-fledged business. Worse still, even if you did some amazing digital PR and got loads of traffic, you wouldn't benefit from it because it would knock the site offline. So even when you win, you'll lose anyway
Check out these terrible (worst I have ever seen) Google PageSpeed loading scores. I ran Google's test against your site just after I had finished and killed my extremely moderate stress test. Look at this screenshot. I know Google PSI asks for a lot, but this is just dire
Let's check through Google's mobile website speed testing tool (which simulates requests over a 3G connection, just to be safe). This time I left a margin of time after the mild stress test to see if the scores got significantly better. Nope, the results are still really poor
Let's try Pingdom Tools. Here are the results. Again, really poor grade. Wouldn't be happy if my kid got a D on a school assignment, not happy with the grade here either. Beginning to see a pattern with all this?
I guess you might be in one of those situations where decision-makers are saying: before we put more money into the site (for a better server), we want to see more success. Well, guess what? That's technically impossible. If you get more traffic, your site will go down - taking the Google Analytics tracking script with it. So all that traffic will be invisible to them, and they'll never have the data to justify a better server. It's a vicious circle unless they simply budge
What you're in danger of here is taking an old mule on its last legs and 'optimising' it. Give it reinforced leg braces, stuff it full of steroids. It all helps a bit, but... you know, never be surprised when it entirely fails to beat an actual racehorse. It's not winning the Grand National; the old girl (the hosting environment) simply doesn't have it in her
So what's the problem? Not enough processing power (brain-power) for the server? Not enough RAM (memory)? Not enough bandwidth?
It could be any or all of the above. When someone requests data (web pages) from your server, three basic things have to happen. First, the server has to 'think up' what the user wants; if the processor can't keep up, then no matter how good the bandwidth is, it's like putting "2+2=4" on a huge blackboard and expecting it to look sophisticated (it won't). Next you have local memory. Once the server works out what it has to 'assemble' for the user, those Lego pieces have to be put down in (very) temporary storage before they can be shipped to the user. If you have great processing power and bandwidth on your server, great - but if it's all funnelled through narrow local memory, it's like trying to fit the entire theory of general relativity on one corner of a Post-it note. It's not happening
Finally you have your bandwidth. If everything else is great but your bandwidth sucks, then locally you generate complex pages really fast but can't get them 'shipped' to the user in a timely fashion
I don't know what the exact problem(s) with your server are, but it sucks. You have to investigate and secure a bigger budget, or it will never, ever improve!
Quite often, page-speed changes only yield moderate gains in Google's SERPs. That's because once you reach a certain standard, most users will be satisfied and so will Google. But in your extreme situation, I'd be willing to bet that you are under nasty algorithmic SERP devaluations.
Dev changes and coding will only get you so far; at the end of the day your site needs a good home to live in. Currently, it doesn't have one. It doesn't live on a server that Google would take seriously for an online business (IMO)
Important P.S.: There is one other possible issue, other than what I have summarised. It may not be that the server is weak; it may be that the server is programmed to 'fake' weakness and 'play dead' when one source (a crawler, or a user) gets too aggressive. If so, that same fail-safe has been over-applied and is affecting Google's results. Play dead to Google? Get dead results. To establish whether my initial thoughts are right - or whether this final P.S. is correct - we'd need to talk in real time (over chat) and run a two-way stress test. I'd need to stress the site again a little, make it time out for me, then see if it's also timing out for you. If it's affecting just me, your problem is an over-applied defence mechanism. If it's affecting both of us... the server is garbage
-
Yes the content was the same, the client just wanted to change the domain name as he changed the company trading name.
Thanks for the tip about the robots.txt crawl-delay - we'll have a look at that; I don't know why it was set. The site is hosted on LCN. The page load times have increased considerably since two things happened:
1. The domain name was changed, but it was kept on the same hosting.
2. We applied an SSL certificate to change to HTTPS.
We have also switched off some plug-ins to speed up page loading, which may help.
-
There's a lot of conflicting information circulating about what constitutes 'proper' redirects. If your content is slightly different on the new domain, then 301 redirects won't carry the 'full' amount of SEO authority across. You did say that the content is exactly the same on the new domain, so I guess that wouldn't be it!
That makes me think that something could be technically wrong with the redirects, or that something is different for the new domain. Is it still on the same hosting environment, or did you move that when the new domain was applied? I am wondering if page-loading speeds have changed for the worse. For what a correct redirect set-up usually looks like, see the sketch below
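Purely as a sketch (assuming the old domain sits on an Apache host you still control, with mod_rewrite available - I obviously can't see your actual config), a domain-wide 301 in the old domain's .htaccess usually looks something like this:
```apache
# Sketch only: send every URL on the old domain to the same path on the new domain
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^(www\.)?ryemeadcleaning\.co\.uk$ [NC]
  RewriteRule ^(.*)$ https://ryemeadgroup.co.uk/$1 [R=301,L]
</IfModule>
```
The things worth checking are that each old URL redirects in a single hop to its exact equivalent (not everything to the homepage), and that the destination itself returns a 200 rather than chaining through further redirects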
Another thing I see is that in your robots.txt file you have set a crawl delay. If that wasn't on the site before it moved, it could potentially be hampering Google in terms of keeping their view of the site up to date (which in turn could hit rankings)
-
Hi effectdigital, thanks for your response. I understand your point, but on the old site their DA was 14. It dropped to 2 in August when the new, renamed site went live and has not changed since. Their rankings have also dropped by about 50% but we have consulted all the expert guides and believe the redirects have been done properly.
-
There are a few things to consider here. The first and foremost is that PA and DA (Moz's metrics) are 'shadow metrics' which are meant to mimic Google's true PageRank algorithm. Since Google has never made PageRank public knowledge (except for a very watered-down, over-simplified version which used to be accessible through some browser extensions, and which Google has now decommissioned), SEOs obviously needed to build a metric of their own. Moz accomplished this
Due to this, many backlink-index providing platforms (like Moz, Ahrefs, Majestic etc) have tried to create alternate metrics (PA, Citation Flow, Ahrefs Rating) based upon similar philosophies, so that web-marketers have 'something' to go on, in terms of evaluating web-page worth from a machine's perspective. But don't be fooled, Google evaluates the strength of web-pages via their own internal PageRank algorithm (in its 'true' form, web-marketers have never seen it!)
Because Moz's page and link index is nowhere close to the same size or scope as Google's, PA and DA are 'shadow' metrics. They are indicators only, and are to be taken with a pinch of salt. Google does not use 'Page Authority' or 'Domain Authority' from Moz in their ranking algorithms, instead they use PageRank
Because PA and DA are shadow metrics, based on a smaller index (sample of web-pages), they don't react as quickly to change as Google's 'real' page-weighting metrics. As such, unless you're seeing a colossal drop-off in terms of traffic, revenue etc... I wouldn't worry much (at all) about your DA score
**If you are also seeing** a performance drop-off, that's bad news and it hints at a botched site migration with improperly configured redirects
-
Hi Dave, since we changed the .htaccess file a couple of days ago to allow dotbot, I forced a new crawl by Moz yesterday. It has made no difference and the Domain Authority is still 2; all other factors remain the same, e.g. rankings. The only thing that's changed is that there are now only 2 404 errors, whereas there were 5 previously, 3 of which we have fixed. Could there be something more fundamentally wrong?
-
Hi again Dave
We have changed the .htaccess file to allow dotbot - so hopefully this will now rectify itself? I don't know why it was set up this way, as none of our other sites are. Can you check that you can access the site now, please?
-
Thanks Dave, I will investigate this
-
Yes
-
Hey Maureen, thanks for reaching out!
So I tried to curl your new domain (http://ryemeadgroup.co.uk) using our link index's user-agent, and it appears we are blocked from that site. I get a 403 Forbidden when trying to access it, so we would not be able to index it without being unblocked. Typically this is resolved by your hosting admin - they should be able to whitelist our user-agent, "dotbot". I hope that helps to point you in the right direction; if you need further technical assistance, please reach out to help@moz.com!
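If you want to reproduce that kind of check yourself, a rough sketch is below - the user-agent strings here are simplified stand-ins rather than our crawler's exact string:
```bash
# Baseline request with a browser-like user-agent (simplified for illustration)
curl -s -o /dev/null -w "%{http_code}\n" -A "Mozilla/5.0" https://ryemeadgroup.co.uk/

# Same request identifying as a crawler with "dotbot" in the user-agent
curl -s -o /dev/null -w "%{http_code}\n" -A "dotbot" https://ryemeadgroup.co.uk/
```
If the first returns 200 and the second returns 403, the block is almost certainly a user-agent rule at the host or firewall level, which is why whitelisting is the usual fix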
-
Hello There,
I went through a similar process recently. Did you change the domain in GSC (Google Search Console)?