Domain Authority hasn't recovered since August
-
I really need some major advice on this one. Back in September, I asked a question on here as follows:
"A client wanted to change their domain name, which we have now done. The site content itself is exactly the same. We put 301 redirect links in so that Google searchers would redirect from the old site to the new one. However Moz then said that it couldn't crawl the old domain because of the redirects and advised creating a brand new campaign for the new domain. We have done this but now Moz says that the domain authority of the new site is 2 (it was 14 on the old domain)." My original question and the answers I got are here: https://moz.com/community/q/new-domain-wipes-out-domain-authority).
Generally the responses I got were that we should give Moz time to crawl the new domain and process all the "new" pages.
It is now February, i.e. six months after the domain rename, and on Moz the site still has a DA of 2. Six months seems like enough time to have waited. We checked all the recommended guides and believe we have done everything correctly.
I really don't know what to do now. Can anyone help or have a quick look and work out why this is so bad?
Specifics are:
old domain: https://ryemeadcleaning.co.uk
new domain: https://ryemeadgroup.co.uk
Thanks for your responses, Maureen
-
From what I know, when you alter your site to be 'faster', you sometimes have to wait a few days for that to start showing up in the page-loading speeds. I am pretty sure that if you have server-side caching enabled, and resources were previously cached in their non-compressed form, the old resources can continue being served to people for days (or even weeks) after the alterations are made
This is certainly true of image compression (where the old JPG / PNG files continue to be served after being replaced with more highly compressed versions, because the cache hasn't refreshed yet) - I'm unsure whether the same applies to GZip-compressed files or not (sorry!)
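If you want to check what the server is actually sending out right now (rather than what the config says it should send), a quick curl spot-check along these lines does the job - just a sketch, using your new homepage as the example URL:

```
# Ask for a gzip-compressed response and dump the response headers
curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" https://ryemeadgroup.co.uk/ | grep -i "content-encoding"
# If nothing (or no "gzip") comes back, compressed responses aren't actually being
# served yet, whatever the config or the cache says
```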
From what I understand, page-speed optimisation is not a straightforward, linear process. For example, many changes you could make benefit 'returning' visitors whilst making the site slower for first-time visitors (and the reverse is also true - there are changes which push you in either direction). Because of these competing trade-offs, it's often tricky to get the best of both. For example, one common recommendation is to take all your inline (in-source) CSS and JS and place it in '.css' or '.js' files which are linked to by your web pages
Because most pages call in the 'separated out' CSS or JS files as a kind of external common module (library), once a user has cached the CSS or JS it doesn't have to be loaded again. This benefits returning site-users. On the flip-side, because those external files have to be fetched and parsed on the first load (and because they often contain more CSS / JS than any one page needs), first-time users take a hit. As you can see, these are tricky waters to navigate, and Google still doesn't make it clear whether they prefer faster speeds for returning or first-time users. In my experience, their bias leans more towards satisfying first-time users
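As a bare-bones sketch of what that 'separating out' looks like in the page source (the file names and function here are made up for illustration):

```
<!-- Before: styles and scripts baked into every page -->
<style> .hero { color: #333; } </style>
<script> initHero(); </script>

<!-- After: shared files the browser caches once and reuses on every page -->
<link rel="stylesheet" href="/assets/site.css">
<script src="/assets/site.js" defer></script>
```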
Some changes, like compressing image files (and making them smaller), benefit both groups; just be wary of recommendations which improve one user-group's experience at the expense of the other
For image compression, I'd recommend running all your images (download them via FTP to preserve the folder structure) through something like Kraken. I tend to use the 'lossy' compression setting, which is still near-lossless in terms of visible quality (I can't tell the difference, anyway). Quite often developers will tell me that a 'great' WordPress plugin has been installed to compress images effectively. In almost all cases, Kraken does a 25%-50% better job. This is because WP plugins are designed to run on a server which is also hosting the main site and serving web traffic, so they are coded not to use too much processing power (and they fail to achieve a good level of compression). I'm afraid there's still no substitute for a purpose-built tool and some FTP file-swapping :') remember though, even once the images are replaced, the cache will have to cycle before you see the gains...
Hope that helps
-
Just to add that we also enabled GZIP compression and page caching to speed the site up, but so far the gains seem to be small steps.
Again, really appreciate the time you've taken. Have a great weekend.
-
I typed up a response but it seems to have disappeared. Thank you so much for your very comprehensive message about this, I really do appreciate it. I am going to digest all this information over the weekend and discuss with my colleagues to see if we can make a plan to improve the site.
We did some work today and reduced the page load time by about 60% by removing some plug-ins, and that took the site score from an F up to a D - you must have checked after we'd done it. But sure, a D is still not great!
Perhaps LCN isn't the right environment then. I believe the site was already hosted there when we took it over last year, so we'll have to look at this. We usually host our WordPress sites on WP Engine and they seem pretty good, but this one is an exception.
Thanks again and have a great weekend!
-
It may help, but a lot of plugins burden the server more on the back end than on the front end. Still, it all helps! In this case, though, I wouldn't expect anything noticeable in that area (at all)
Right... I wrote WAY more below than I had anticipated writing. But I think I have demonstrated pretty conclusively that yes, site performance is your main hindrance. I may use some dramatic language below - it's just because I'm passionate about search and SEO :') so please don't be offended. That being said, yeah, the situation is pretty bad
The site actually is very slow and laggy, and that could have a lot to do with it if site performance has decreased. Using my favoured page-speed tool, GTmetrix, you can easily see that the scores are pretty bad all round. Here's a screenshot
If you look at the waterfall chart it generates (it needs a free account only - no payment details required), you can see that the request "GET ryemeadgroup.co.uk" occurs three times and seems to take ages to respond. Looking at the data as a once-over, I can't tell if that's just the request for the whole page (which would obviously be the longest) or something else. If it is just the page request, I don't get why it recurs three times
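One cheap way to tell whether that long entry is server lag or just the page download is to time the request directly and compare time-to-first-byte with the total - a rough sketch only, worth running a few times to average out the noise:

```
# Time-to-first-byte vs total transfer time for the homepage
curl -s -o /dev/null -w "TTFB: %{time_starttransfer}s  total: %{time_total}s\n" https://ryemeadgroup.co.uk/
# A big TTFB with only a small gap to the total points at a slow server
# rather than a heavy page
```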
You could optimise all your images. You could set up GZip compression and do lots of other things, but the fact is - the server environment is just horribly, horribly weak. I ran a very small stress test, a few minutes long, which I had to cancel almost immediately. Even crawling the site at a few URLs per second stops it rendering and causes it to time out! If I had left the test running it could have hurt the site or taken it offline, so before that happened I stopped the crawler in its tracks
By the way, this is a crawler designed for low-intensity SEO crawling, not actual stress-testing or DDoS simulation. From what I can see here, any user with a reasonable home broadband connection (BT Infinity, or one of the Virgin cable deals) could, if they wanted to, just take your site offline - just like that. Someone trying to do it maliciously would use far more aggressive tools and crawling techniques
The crawl delay has probably been set in your robots.txt to compensate for this. But what that means is, with the crawl delay on, Google can't index your site and content fast enough. And with the delay removed, they still won't be able to, because even their basic, non-intensive crawling will take your site offline in seconds or minutes
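For reference, the directive in question is a single line in robots.txt and usually looks like the sketch below (the number of seconds is just an example) - it's honoured by some crawlers such as Bing, and it's the line you'd want to review or remove:

```
User-agent: *
Crawl-delay: 10
```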
Obviously when a new site goes live, even with the same files on the same server environment, it slows down for a bit (while the cache builds back up again). Whilst that could be part of the problem, the main problem is that no matter what you do, that server environment is only fit for a hobbyist, not a fully-fledged business. Worse still, even if you did some amazing digital PR and got loads of traffic, you wouldn't benefit from it, because it would knock the site offline. So even when you win, you'll lose anyway
Check out these terrible (worst I have ever seen) Google PageSpeed Insights loading scores. I ran Google's test against your site just after I had finished and killed my extremely moderate stress test. Look at this screenshot. I know Google PSI asks for too much, but this is just dire
Let's check through Google's Mobile Website Speed Testing tool (which fires requests over 3G, just to be safe). This time I left a gap after the mild stress test to see if the scores got significantly better. Nope - the results are still really poor
Let's try Pingdom Tools. Here are the results. Again, a really poor grade. I wouldn't be happy if my kid got a D on a school assignment, and I'm not happy with the grade here either. Beginning to see a pattern with all this?
I guess you might be in one of those situations where the decision-makers are saying: before we put more money into the site (for a better server), we want to see more success. Well, guess what? That's technically impossible. If you get more traffic, your site will go down - taking the Google Analytics tracking script with it. So all that traffic will be invisible to them, and they'll never have the data to decide that they need something better. It's a vicious circle here unless they just budge
What you're in danger of here is taking an old mule on its last legs and 'optimising' it. Give it reinforced leg-braces, stuff it full of steroids - it all helps a bit, but... you know, never be surprised when it entirely fails to beat an actual racehorse. It's not winning the Grand National; the old girl (the hosting environment) simply doesn't have it in her
So what's the problem? Not enough processing power (brain-power) for the server? Not enough RAM (memory)? Not enough bandwidth?
It could be any or all of the above. When someone requests data (web pages) from your server, three basic things have to happen. First, the server has to 'think of' what the user wants; if the processor can't keep up, then no matter how good the bandwidth is, it's like putting "2+2=4" on a huge blackboard and expecting it to look sophisticated (it won't). Next you have local memory. Once the server works out what it has to 'assemble' for the user, those Lego pieces have to be put down in (very) temporary storage before they can be shipped to the user. If you have great processing power and bandwidth on your server, great - but if it's all funnelled through narrow local memory... it's like trying to fit the entire theory of general relativity on one corner of a Post-it note. It's not happening
Finally you have your bandwidth. If everything else is great and your bandwidth sucks, then locally you generate complex pages really fast - but can't get them 'shipped' to the user in a timely fashion
I don't know what the exact problem(s) with your server are, but it sucks. You have to investigate and secure better spend, or it will never, ever improve!
Quite often, page-speed changes only yield moderate gains in Google's SERPs. That's because once you reach a certain standard, most users will be satisfied and so will Google. But in your extreme situation, I'd be willing to bet you are under some nasty algorithmic SERP devaluations.
Dev changes and coding will only get you so far; at the end of the day your site needs a good home to live in. Currently, it doesn't have one. It doesn't live on a server that Google would take seriously for an online business (IMO)
Important P.S.: there is one other possible issue, besides what I have summarised. It may not be that the server is weak; it may be that the server is programmed to 'fake' weakness and 'play dead' when one source (a crawler, or a user) gets too aggressive. If so, that fail-safe has been over-applied and is affecting Google's results. Play dead to Google? Get dead results. To establish whether my initial thoughts are right - or whether this final 'P.S.' is correct - we'd need to talk in real time (over chat) and run a two-way stress test. I'd stress the site again a little, make it time out for me, then see if it's also timing out for you. If it only affects me, your problem is an over-applied defence mechanism. If it affects both of us... the server is garbage
-
Yes the content was the same, the client just wanted to change the domain name as he changed the company trading name.
Thanks for the tip about the robots.txt crawl-delay - we'll have a look at that; I don't know why it was set. The site is hosted on LCN. The page load times have increased considerably since two things happened:
The domain name was changed - but was kept on the same hosting.
We applied an SSL certificate to switch to https.
We have also switched off some plug-ins to speed up page loading, which may help.
-
There's a lot of conflicting information circulating about what constitutes 'proper' redirects. If the content were slightly different on the new domain, then 301 redirects wouldn't carry the 'full' amount of SEO authority across. But you did say the content is exactly the same on the new domain - so I guess that isn't it!
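For reference, a 'whole domain' 301 on an Apache host is usually a rule pair along these lines in the old domain's .htaccess - only a sketch of the common pattern, not necessarily what's on your server:

```
RewriteEngine On
# Send every path on the old domain to the same path on the new one, permanently (301)
RewriteCond %{HTTP_HOST} ^(www\.)?ryemeadcleaning\.co\.uk$ [NC]
RewriteRule ^(.*)$ https://ryemeadgroup.co.uk/$1 [R=301,L]
```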
That makes me think that something could be technically wrong with the redirects, or that something is different for the new domain. Is it still on the same hosting environment, or did you move that when the new domain was applied? I am wondering if page-loading speeds have changed negatively
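A quick way to sanity-check the redirects from the outside is to request an old-domain URL and watch the response chain - a sketch; any URL on the old domain will do:

```
# Follow the redirect chain from the old domain, showing only status lines and Location headers
curl -sIL https://ryemeadcleaning.co.uk/ | grep -iE "^(HTTP|location)"
# Ideally you want a single 301 hop straight to the equivalent page on
# ryemeadgroup.co.uk, not a chain of 302s or multiple hops
```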
Another thing I see is that in your robots.txt file you have set a crawl delay. If that wasn't on the site before it moved, it could potentially be hampering Google in terms of keeping their view of the site up to date (which in turn could hit rankings)
-
Hi effectdigital, thanks for your response. I understand your point, but on the old site their DA was 14. It dropped to 2 in August when the new, renamed site went live and has not changed since. Their rankings have also dropped by about 50% but we have consulted all the expert guides and believe the redirects have been done properly.
-
There are a few things to consider here. First and foremost, PA and DA (Moz's metrics) are 'shadow metrics' which are meant to mimic Google's true PageRank algorithm. Since Google has never made PageRank public knowledge (except for a very watered-down, over-simplified version which used to be accessible through some browser extensions, and which Google has now decommissioned), SEOs obviously needed to build a metric of their own. Moz accomplished this
Due to this, many backlink-index providers (like Moz, Ahrefs, Majestic etc.) have tried to create alternative metrics (PA, Citation Flow, Ahrefs Rating) based upon similar philosophies, so that web-marketers have 'something' to go on in terms of evaluating web-page worth from a machine's perspective. But don't be fooled: Google evaluates the strength of web pages via their own internal PageRank algorithm (in its 'true' form, web-marketers have never seen it!)
Because Moz's page and link index is nowhere close to the same size or scope as Google's, PA and DA are 'shadow' metrics. They are indicators only, and are to be taken with a pinch of salt. Google does not use 'Page Authority' or 'Domain Authority' from Moz in their ranking algorithms, instead they use PageRank
Because PA and DA are shadow metrics, based on a smaller index (sample of web-pages), they don't react as quickly to change as Google's 'real' page-weighting metrics. As such, unless you're seeing a colossal drop-off in terms of traffic, revenue etc... I wouldn't worry much (at all) about your DA score
**If you are also seeing a performance drop-off**, that's bad news and it hints at a botched site migration with improperly configured redirects
-
Hi Dave, since we changed the htaccess file a couple of days ago to allow dotbot, I forced a new crawl by Moz yesterday. It has made no difference and the Domain Authority is still 2; all other factors remain the same (e.g. rankings). The only thing that's changed is that there are now only 2 404 errors whereas there were 5 previously, 3 of which we have fixed. Could something more fundamental be wrong?
-
Hi again Dave
We have changed the htaccess file to allow the dotbot - so hopefully now this will rectify itself? I don’t know why it was set up this way as none of our other sites are. Can you check that you can access the site now please?
-
Thanks Dave, I will investigate this
-
Yes
-
Hey Maureen, thanks for reaching out!
So I tried to curl your new domain (http://ryemeadgroup.co.uk) with our link index user-agent, and it appears we are blocked from that site. I get a 403 Forbidden when trying to access it, so we would not be able to index it without being unblocked. Typically this is resolved by your hosting admin - they should be able to whitelist our user-agent, "dotbot". I hope that helps to point you in the right direction; if you need further technical assistance, please reach out to help@moz.com!
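If it helps to verify the fix from your end, a rough way to approximate what our crawler sees is to send a request with a dotbot-style user-agent and compare it with a plain request - the UA string below is only illustrative, not guaranteed to match ours exactly:

```
# Status code with a dotbot-style user-agent (illustrative string)
curl -s -o /dev/null -w "%{http_code}\n" -A "Mozilla/5.0 (compatible; DotBot/1.1)" http://ryemeadgroup.co.uk/
# Status code with curl's default user-agent
curl -s -o /dev/null -w "%{http_code}\n" http://ryemeadgroup.co.uk/
# A 403 on the first request but a 200 on the second points at a user-agent block still in place
```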
-
Hello there,
I went through a similar process recently - did you make the change of address in Google Search Console (GSC)?