Domain Authority hasn't recovered since August
-
I really need some major advice on this one. Back in September, I asked a question on here as follows:
"A client wanted to change their domain name, which we have now done. The site content itself is exactly the same. We put 301 redirect links in so that Google searchers would redirect from the old site to the new one. However Moz then said that it couldn't crawl the old domain because of the redirects and advised creating a brand new campaign for the new domain. We have done this but now Moz says that the domain authority of the new site is 2 (it was 14 on the old domain)." My original question and the answers I got are here: https://moz.com/community/q/new-domain-wipes-out-domain-authority).
Generally the responses I got were that we should give Moz time to crawl the new domain and process all the "new" pages.
It is now February, i.e. six months after the domain rename, and on Moz the site still has a DA of 2. Six months seems like more than enough time to wait. We checked all the recommended guides and believe we have done everything correctly.
I really don't know what to do now. Can anyone help or have a quick look and work out why this is so bad?
Specifics are:
old domain: https://ryemeadcleaning.co.uk
new domain: https://ryemeadgroup.co.uk
Thanks for your responses.
Maureen
-
From what I know, when you alter your site to be 'faster', you sometimes have to wait a few days for that to start reflecting in the page-loading speeds. I am pretty sure that, if you have server-side caching enabled and resources were previously cached uncompressed, the old resources can continue being served to people for days (or even weeks) after alterations are made.
This is certainly true of image compression (where the old JPG / PNG files continue to be served after being replaced with more highly compressed versions, since the cache has not refreshed yet) - I am unsure whether that applies to GZip-compressed files or not (sorry!)
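For what it's worth - and this is just a rough sketch assuming the site sits on a fairly standard Apache setup (I don't know LCN's exact stack, so treat the module names as assumptions) - GZip and browser caching are usually switched on with a handful of lines in .htaccess:

```apache
# Sketch only - assumes Apache with mod_deflate and mod_expires available
<IfModule mod_deflate.c>
  # Compress text-based responses (HTML, CSS, JS) before they're sent to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json
</IfModule>

<IfModule mod_expires.c>
  # Ask browsers to cache static assets so returning visitors don't re-download them
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Even once that's in place, any server-side page cache or CDN sitting in front of the site may keep handing out the old, uncompressed copies until it expires - which is exactly the lag I'm describing above.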
From what I understand, page-speed optimisation is not a straightforward, linear process. For example, many changes you could make benefit 'returning' visitors whilst making the site slower for first-time visitors (and the reverse is also true; there are changes which take you in both directions). Because of these competing pressures, it's often tricky to get the best of both. For example, one common recommendation is to take all your in-line (or in-source) CSS and JS and place it in '.css' or '.js' files which are linked to by your web pages.
Because most pages will call in the 'separated out' CSS or JS files as a kind of external common module (library), once a user has cached the CSS or JS it doesn't have to be loaded again. This benefits returning site-users. On the flip-side, because external files have to be pulled in and referenced on the first load (and because they often contain more CSS / JS than is needed), first-time users take a hit. As you can see, these are tricky waters to navigate, and Google still doesn't make it clear whether they prefer faster speeds for returning or first-time users. In my experience, their bias floats more towards satisfying first-time users.
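To make that concrete, here's a deliberately simplified, hypothetical before/after (not your actual mark-up - the file names are made up):

```html
<!-- Before: in-line CSS / JS baked into every page, re-downloaded on every page view -->
<style>.nav { background: #004466; }</style>
<script>function toggleNav() { document.querySelector('.nav').classList.toggle('open'); }</script>

<!-- After: shared rules pulled out into external files the browser can cache once -->
<link rel="stylesheet" href="/assets/site.css">
<script src="/assets/site.js" defer></script>
```

First-time visitors pay for the two extra requests; returning visitors pull both files straight from cache - which is exactly the trade-off I'm describing.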
Some changes that you make, like compressing image files (and making them smaller), benefit both groups; just be wary of recommendations which push one user-group's experience at the expense of the other.
For image compression, I'd recommend running all your images (download them all via FTP to preserve the folder structure) through something like Kraken. I tend to use the 'lossy' compression algorithm, which is still relatively lossless in terms of quality (I can't tell the difference, anyway). Quite often developers will tell me that a 'great' WordPress plugin has been installed to compress images effectively. In almost all cases, Kraken does a 25%-50% better job. This is because WP plugins are designed to run on a server which is also hosting the main site and serving web-traffic; as such, these plugins are coded not to use too much processing power (and they fail to achieve a good level of compression). I'm afraid there's still no substitute for a purpose-built tool and some FTP file-swapping :') Remember though, even when the images are replaced, the cache will have to cycle before you'll see gains...
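If uploading batches to Kraken is a pain, a command-line tool can do a similar lossy pass locally on the FTP-downloaded folder. This is only an illustrative alternative (jpegoptim, with 85 as an example quality ceiling), not something your setup necessarily has installed:

```bash
# Hypothetical example: recompress every JPG in the downloaded folder tree,
# capping quality at 85 (lossy, but usually hard to distinguish by eye)
find ./images -iname "*.jpg" -exec jpegoptim --max=85 {} +
```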
Hope that helps
-
Just to add that we also added GZIP compression and page caching to speed it up, but so far the gains seem to be small steps.
Again, really appreciate the time you've taken. Have a great weekend.
-
I typed up a response but it seems to have disappeared. Thank you so much for your very comprehensive message about this, I really do appreciate it. I am going to digest all this information over the weekend and discuss with my colleagues to see if we can make a plan to improve the site.
We did some work today and reduced the page load time by about 60% by removing some plug-ins, and that improved the site score from an F to a D - you must have checked after we'd done it. But sure, a D is still not great!
Perhaps LCN isn't the right environment then. I believe the site was already hosted there when we took it over last year, so we'll have to look at this. Usually we host our WordPress sites on WPEngine and they seem pretty good, but this one is an exception.
Thanks again and have a great weekend!
-
It may help, but a lot of plugins burden the server more on the back-end than they do on the front end. Still, it all helps! In this case though, I wouldn't expect anything noticeable in that area (at all)
Right... I wrote WAY more below than I had anticipated writing. But I think I have demonstrated pretty conclusively that yes, site performance is your main hindrance. I may use some dramatic language below, it's just because I'm passionate about search and SEO :') so please don't be offended. That being said, yeah the situation is pretty bad
It (the site) actually is very slow and laggy - that could have a lot to do with it, if site performance has decreased. Using my favoured page-speed tool, GTMetrix, you can easily see that the scores are pretty bad all around. Here's a screenshot
If you look at the waterfall chart it generates (needs a free account only, no payment details required) then you can see that the request "GET ryemeadgroup.co.uk" occurs three times and seems to take ages and ages to respond. Looking at the data as a once-over, I can't tell if that's just the request to get the whole page (so obviously it would be longest) or something else. If that is what it is, I don't get why it recurs thrice
You could optimise all your images. You could set GZip compression and do lots of things, but the fact is - the server environment is just horribly, horribly weak. I did a very small, few minutes long stress test which I had to cancel almost immediately. Even crawling the site at a few URLs per second stops it rendering and causes it to time-out! If I had left the test going it could have hurt the site or taken it offline, so before that happened I stopped the crawler in its tracks
By the way, this is a crawler designed for low-intensity SEO crawling, not actual stress-testing or DDoS simulation. From what I can see here, any user with a reasonable broadband connection (maybe BT Infinity or one of the Virgin cable deals) could, if they wanted to, just take your site offline - just like that. Someone trying to do it maliciously would use much more aggressive tools and crawling techniques.
The crawl delay has probably been set in your robots.txt to compensate for this. But what that means is that, with the crawl delay on, Google can't index your site and content fast enough. With the delay removed, they still won't be able to, because even their basic, non-intensive crawling will take your site offline in seconds / minutes.
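For anyone following along, the directive in question lives in robots.txt and looks something like this (the '10' is purely an illustrative value - I don't remember exactly what your file sets):

```
# robots.txt - Crawl-delay asks compliant bots to wait N seconds between requests
User-agent: *
Crawl-delay: 10
```

Deleting that line lifts the self-imposed throttle for the crawlers that honour it - but as I say above, it only papers over the real issue, which is that the server can't take normal crawling in the first place.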
Obviously when a new site goes live, even with the same files on the same server environment, it slows down for a bit (while the cache builds back up again). Whilst that could be part of the problem, the main problem is that no matter what you do, that server environment is only fit for a hobbyist, not a fully-fledged business. Worse still, even if you did some amazing Digital PR and got loads of traffic, you couldn't capitalise on it because the traffic would knock the site offline. So even when you win, you'll lose anyway.
Check out these terrible (worst I have ever seen) Google Page-Speed loading scores. I fired Google off to check your site, just after I had finished and killed my extremely moderate stress-test. Look at this screenshot. I know Google PSI asks for too much, but this is just dire
Let's check through Google's Mobile Website Speed Testing tool (which fires requests through 3g, just to be safe). This time I left a margin of time after the mild stress-test to see if scores got significantly better. Nope, the results are still really poor
Let's try Pingdom Tools. Here are the results. Again, really poor grade. Wouldn't be happy if my kid got a D on a school assignment, not happy with the grade here either. Beginning to see a pattern with all this?
I guess you might be in one of those situations where decision-makers are saying: before we put more money into the site (for a better server), we want to see more success. Well, guess what? That's technically impossible. If you get more traffic, your site will go down - taking the Google Analytics tracking script with it. So all that traffic will be invisible to them, and they'll never have the data to decide that they need better. As such, it's a vicious circle unless they just budge.
What you're in danger of here is taking an old mule on its last legs and 'optimising it'. Give it reinforced leg-braces, stuff it full of steroids. It all helps a bit, but... you know, never be surprised when it entirely fails to beat an actual race-horse. It's not winning the Grand National; the old girl (the hosting environment) simply doesn't have it in her.
So what's the problem? Not enough processing power (brain-power) for the server? Not enough RAM (memory)? Not enough bandwidth?
It could be any or all of the above. When someone requests data (web pages) from your server, three basic things have to happen. First, the server has to 'think of' what the user wants; if the processor can't keep up, then no matter how good the bandwidth is, it's like putting "2+2=4" on a huge blackboard and expecting it to look good, look sophisticated (it won't). Next you have local memory. Once the server thinks of what it has to 'assemble' for the user, those lego pieces have to be put down in (very) temporary 'storage' before they can be shipped to the user. If you have great processing power and bandwidth on your server, great - but if it's all funnelled through narrow local memory... it's like trying to fit the entire theory of general relativity on one corner of a post-it note. It's not happening.
Finally you have your bandwidth. If everything else is great and your bandwidth sucks, then locally you generate complex pages really fast - but can't get them 'shipped' to the user in a timely fashion
I don't know what the exact problem(s) with your server are, but it sucks. You have to investigate and secure better spend, or it will never, ever improve!
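If you want a rough-and-ready way to see where the time goes (just a sketch - run it from somewhere outside the hosting network), curl can split a request into 'server thinking time' versus 'shipping time':

```bash
# Rough check: time_starttransfer ≈ how long the server took to generate the page,
# (time_total - time_starttransfer) ≈ how long the HTML took to travel over the wire
curl -o /dev/null -s -w "DNS: %{time_namelookup}s  First byte: %{time_starttransfer}s  Total: %{time_total}s\n" https://ryemeadgroup.co.uk/
```

A long time-to-first-byte points at processing power / memory (the server struggling to 'think'); a quick first byte but a slow total points more at bandwidth.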
Quite often, page-speed changes only produce moderate gains in Google's SERPs. That's because once you reach a certain standard, most users will be satisfied and so will Google. But in your extreme situation, I'd be willing to bet that you are under nasty algorithmic SERP devaluations.
Dev changes and coding will only get you so far, at the end of the day your site needs a good home to live in. Currently, it doesn't have one. It doesn't live on a server that Google would take seriously for an online business (IMO)
Important P.S: There is one other alternative issue, other than what I have summarised. It may not be that the server is weak; it may be that the server is programmed to 'fake' weakness and 'play dead' when one source (a crawler, or a user) gets too aggressive. If so, that same fail-safe has been over-applied and is affecting Google's results. Play dead to Google? Get dead results. To establish whether my initial thoughts are right - or whether this final 'PS' is correct - we'd need to talk in real-time (over chat) and run a two-way stress test. I'd need to stress the site again a little, make it time out for me, then see if it's also timing out for you. If it's affecting just me, your problem is an over-applied defence mechanism. If it's affecting both of us... the server is garbage.
-
Yes the content was the same, the client just wanted to change the domain name as he changed the company trading name.
Thanks for the tip about the robots.txt crawl-delay - we'll have a look at that. I don't know why that was set. The site is hosted on LCN. The page loading speeds have increased considerably since 2 things happened:
The domain name was changed - but was kept on the same hosting.
We applied an SSL certificate to change to https.
We have switched off some plug-ins to speed up the page loading, which may perhaps help.
-
There's a lot of conflicting information circulating about what constitutes 'proper' redirects. If your content is slightly different on the new domain, then 301 redirects won't translate the 'full' amount of SEO authority across. You did say that the content is exactly the same on the new domain, so I guess that wouldn't be it!
That makes me think that something could be technically wrong with the redirects, or that something is different for the new domain. Is it still on the same hosting environment, or did you move that when the new domain was applied? I am wondering if page-loading speeds have changed negatively
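By way of comparison - and assuming the old domain is on Apache with the redirects handled in .htaccess, which I obviously can't verify from here - a clean domain-wide 301 usually looks something along these lines, mapping every old URL onto its exact equivalent on the new domain rather than dumping everything onto the new homepage:

```apache
# Sketch of a typical domain-wide 301 on the OLD domain (Apache, mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?ryemeadcleaning\.co\.uk$ [NC]
# Keep the path so each old page maps to its matching new page (query strings carry over automatically)
RewriteRule ^(.*)$ https://ryemeadgroup.co.uk/$1 [R=301,L]
```

If instead everything lands on the new homepage, or the redirects chain through several hops (http, then https, then the final page), that's the kind of thing that can leak authority.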
Another thing I see is that, in your robots.txt file you have set a crawl delay. If that wasn't on the site before it moved, it could potentially be hampering Google in terms of ... keeping their view of the site up to date (which in turn could hit rankings)
-
Hi effectdigital, thanks for your response. I understand your point, but on the old site their DA was 14. It dropped to 2 in August when the new, renamed site went live and has not changed since. Their rankings have also dropped by about 50% but we have consulted all the expert guides and believe the redirects have been done properly.
-
There are a few things to consider here. The first and foremost, is that PA and DA (Moz's metrics) are 'shadow metrics' which are meant to mimic Google's true PageRank algorithm. Since Google has never made PageRank public knowledge (except for a very watered down, over-simplified version which used to be accessible through some browser extensions, which Google have now decommissioned) - obviously SEOs needed to build a metric all of their own. Moz accomplished this
Due to this, many backlink-index providing platforms (like Moz, Ahrefs, Majestic etc) have tried to create alternate metrics (PA, Citation Flow, Ahrefs Rating) based upon similar philosophies, so that web-marketers have 'something' to go on, in terms of evaluating web-page worth from a machine's perspective. But don't be fooled, Google evaluates the strength of web-pages via their own internal PageRank algorithm (in its 'true' form, web-marketers have never seen it!)
Because Moz's page and link index is nowhere close to the same size or scope as Google's, PA and DA are 'shadow' metrics. They are indicators only, and are to be taken with a pinch of salt. Google does not use 'Page Authority' or 'Domain Authority' from Moz in their ranking algorithms, instead they use PageRank
Because PA and DA are shadow metrics, based on a smaller index (sample of web-pages), they don't react as quickly to change as Google's 'real' page-weighting metrics. As such, unless you're seeing a colossal drop-off in terms of traffic, revenue etc... I wouldn't worry much (at all) about your DA score
**If you are also** seeing a performance drop-off, that's bad news and it hints at a botched site migration with improperly configured redirects.
-
Hi Dave, since we changed the .htaccess file a couple of days ago to allow the dotbot, I forced a new crawl by Moz yesterday. It has made no difference and the Domain Authority is still 2; all other factors remain the same, e.g. rankings etc. The only thing that's changed is that there are now only 2 404 errors whereas there were 5 previously, 3 of which we have fixed. Could there be something more fundamentally wrong?
-
Hi again Dave
We have changed the .htaccess file to allow the dotbot - so hopefully now this will rectify itself? I don't know why it was set up this way, as none of our other sites are. Can you check that you can access the site now please?
-
Thanks Dave, I will investigate this
-
Yes
-
Hey Maureen, thanks for reaching out!
So I tried to curl your new domain (http://ryemeadgroup.co.uk) with our link index user-agent and it appears we are blocked from that site. I get a 403 Forbidden when trying to access it, so we would not be able to index it without being unblocked. Typically this is resolved by your hosting admin; they should be able to whitelist our user-agent "dotbot". I hope that helps to point you in the right direction - if you need further technical assistance, please reach out to help@moz.com!
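Just to illustrate the kind of thing to look for (this is a hypothetical example, not something I can see in your configuration - the block could equally be at the firewall or in a security plugin), a user-agent block in an Apache .htaccess often looks like this, and removing or loosening it is what lets dotbot back in:

```apache
# Hypothetical example of a rule that returns 403 Forbidden to a named crawler
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} dotbot [NC]
RewriteRule .* - [F]
```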
-
Hello There,
I went through a similar process recently - did you change the domain on GSC (Google Search Console)?