Ahrefs - What Causes a Drastic Loss in Referring Pages?
-
While doing research on UK flower companies, I noticed that one particular domain had great rankings (top 3) but had slid quite a bit, down to page two. After investigating further, I noticed they had a drastic loss of referring pages but an increase in total referring domains. See this screenshot from Ahrefs.
I took a look at their historical rankings (got them from the original SEO provider's portfolio) and compared them to the Wayback Machine. There did not seem to be any drastic changes in the site structure. My question is: what would cause such a dramatic loss in total referring pages while showing a dramatic increase in referring domains? It appears the SEO company was trying to rebound from the loss of links.
Any thoughts on why this might happen?
-
The Google Disavow tool doesn't work like that. It won't actually remove links from any pages. It is basically just a signal to Google that you want those links to be nofollowed. Ahrefs would have no clue if something has been disavowed or not.
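For reference, a disavow file is just a plain-text list uploaded through Webmaster Tools: one full URL or one domain: directive per line, with # marking comments. The domains below are made-up examples:

```text
# Paid links from a low-quality directory
domain:spammy-directory.example

# One specific spammy page
http://random-blog.example/spun-article.html
```

Nothing in that file touches the linking sites themselves, which is why a third-party crawler like Ahrefs would keep reporting the links as live.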
-
Hi,
One thing that comes to mind is using the Disavow tool. The UK Flower company might have used the Disavow tool to remove all the referring pages that they thought were spam.
-
First thing that comes to mind is that maybe the site had a lot of site-wide links before. If it had 5,000 or 10,000 links coming from a single domain and that website went down, that would be a huge loss of referring pages in a short amount of time. Maybe they were listed in web directories and asked to be removed, while simultaneously attempting to build some high-quality backlinks from a wider set of referring domains?
It's all speculation of course, but plausible.
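The site-wide-link theory is easy to check against a raw backlink export. As a sketch, assuming you have pulled the referring-page URLs out of an Ahrefs export (the domains below are made up), counting referring pages per domain will quickly surface any single domain that was carrying thousands of pages:

```python
from collections import Counter
from urllib.parse import urlparse

def referring_pages_per_domain(backlink_urls):
    """Count how many referring pages each referring domain contributes."""
    counts = Counter(urlparse(url).netloc for url in backlink_urls)
    return counts.most_common()

# Toy data: one domain carrying a site-wide link dwarfs everything else.
links = (
    ["http://directory-a.example/page%d" % i for i in range(5000)]
    + ["http://blog-b.example/post-1", "http://news-c.example/story"]
)
top = referring_pages_per_domain(links)
# If directory-a.example drops its site-wide link, ~5,000 referring pages
# vanish while the referring-domain count barely moves.
```

That asymmetry is exactly the pattern in the screenshot: referring pages crater while referring domains can even rise if new links are being built at the same time.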
Related Questions
-
Page Rank on Moz compared to Ahrefs
So there seems to be a huge philosophical difference in how Moz and Ahrefs calculate page-level authority (PA).

On Moz, PA is heavily dependent on a site's DA. For instance, any new page, or page with no backlinks, on a 90 DA site will have around 40 PA. However, on a site with around 40 DA, any new page or page with no backlinks will have around 15 PA. Now, if one were to get tons of backlinks to this 40 DA / 15 PA page, that would raise the PA of the page slightly, but it will likely never go beyond 40 PA... which hints that one would rather acquire a backlink from a page on a high-DA site, even if that page has no links pointing to it, than from a page on a low-DA site with many, many backlinks to it.

This is very different from how Ahrefs calculates its page metric. On Ahrefs, any new page, or page with no backlinks, will score around 8-10, no matter what the DA of the site is. When a page on a 40 DA site begins acquiring a few links, it will quickly overtake a page on a 90 DA site with no links to it.

The big difference: on Ahrefs, a page's score depends far more on how many inbound links that page has, while on Moz, a page's PA depends far more on the DA of the site the page sits on. If we trust Moz's calculations, SEOs should emphasize getting links from high-DA sites; if we trust Ahrefs' calculations, SEOs should focus less on that and more on building links to whatever page they want to rank (even if that page is on a low-DA site).

So what do you guys think? Do you agree more with Moz's or Ahrefs' valuation of PA? Is the PA of a page more dependent on the DA, or more dependent on its total inbound links?
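The link-counting side of this debate can be illustrated with a toy PageRank-style power iteration. This is an illustrative simplification, not either vendor's actual formula: in a pure link-graph model, a page with no inbound links sits at the baseline score no matter how strong its site is, and inbound links are the only thing that moves it up:

```python
def toy_pagerank(out_links, n, damping=0.85, iters=50):
    """Very small power-iteration PageRank over pages 0..n-1.

    out_links maps a source page index to the list of pages it links to.
    """
    rank = [1.0 / n] * n
    for _ in range(iters):
        # Every page starts each round with the teleport baseline...
        new = [(1.0 - damping) / n] * n
        # ...then receives an equal share of each linking page's rank.
        for src, targets in out_links.items():
            for t in targets:
                new[t] += damping * rank[src] / len(targets)
        rank = new
    return rank

# Pages 0 and 1 each link to page 2; page 3 has no inbound links at all.
ranks = toy_pagerank({0: [2], 1: [2]}, n=4)
# ranks[2] ends up well above ranks[3]: in this model, inbound links to the
# page itself (not some site-wide authority) are what move it.
```

A domain-weighted model like the one Moz is described as using would, in effect, seed every page with a share of site authority before this iteration even starts; which behavior you trust is exactly the question being asked.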
-
Do you think this page has been algorithmically penalised or is it just old?
Here is the page: http://www.designquotes.com.au/business-blog/top-10-australian-business-directories-in-2012/

It's fairly old, but when it was first written it hit #1 for "business directories". After a while it dropped, but it was still receiving lots of traffic for long-tail variations of "business directories Australia". As of the 4th of October (Penguin 2.1), it lost traffic and rankings entirely.

I checked its link profile and there isn't anything fishy. From Google Webmaster Tools: https://docs.google.com/spreadsheet/ccc?key=0AtwbT3wshHRsdEc1OWl4SFN0SDdiTkwzSmdGTFpZOFE&usp=sharing

In fact, two links are entirely natural:
http://blog.businesszoom.com.au/2013/09/use-customer-reviews-to-improve-your-website-ranking/
http://dianajones.com.au/google-plus-local-equals-more-business-blog/

Yet when I search for a close match to the title in Google AU, the article doesn't appear within even the first four pages: https://www.google.com.au/#q=top+10+Australian+Business+Directories&start=10

Is this simply because it's an old article? Should I re-write it, update the analysis, and use a rel=canonical on the old article pointing to the new one?
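For reference, the rel=canonical option mentioned above is a single tag in the head of the old article pointing at its replacement. The target URL below is hypothetical, since the rewritten article doesn't exist yet:

```html
<!-- In the <head> of the old 2012 article; the target URL is a made-up example -->
<link rel="canonical" href="http://www.designquotes.com.au/business-blog/top-australian-business-directories/" />
```

If the old page is being fully replaced and has no independent value, a 301 redirect is generally the stronger signal than a canonical hint.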
-
Using a stop word when optimizing pages
I have a page (for a spa) I am trying to fully optimize, and, using AdWords, I have run every conceivable configuration (using Exact Match) to ascertain the optimal phrase to target. Unfortunately, the term that has come up as the 'best' phrase is "spas in XXX" [XXX represents a location]. Reviewing the data, phrases such as "spas XXX" or "spa XXX" don't show enough search volume to warrant optimizing for them. So, with that said, do I optimize the page without the word "in" and 'hope' we get the search volume from searches using the word "in", or do I optimize using the stop word? Any thoughts? Thank you!
-
Quickest way to deindex a large number of pages
Our site was recently hacked by spammers posting fake content and bringing down our servers, etc. After a few months, we finally figured out what was going on and fixed the issue. However, it turns out that Google has indexed 26K+ spammy pages and we've lost page rank and search engine rankings as a result. What is the best and fastest way to get these pages out of Google's index?
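One common approach (a sketch only; the /spam-dir/ path pattern is hypothetical and depends on where the injected pages lived) is to serve 410 Gone for the hacked URLs, which Google typically drops faster than a 404, with an X-Robots-Tag noindex header as a backstop. In Apache .htaccess:

```apache
RewriteEngine On
# Serve 410 Gone for the injected spam URLs (path pattern is hypothetical)
RewriteRule ^spam-dir/ - [G,L]

# Backstop: if anything under that path still renders, keep it out of the index
SetEnvIf Request_URI "^/spam-dir/" spam_page
Header set X-Robots-Tag "noindex" env=spam_page
```

Pairing this with the URL Removal tool in Webmaster Tools hides the URLs from results while Google processes the 410s.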
-
Google Dropped 3,000+ Pages due to 301 Moved !! Freaking Out !!
We may be the only people stupid enough to accidentally prevent the Googlebot from indexing our site. In our .htaccess file, someone recently wrote the following:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^mysite.com$ [NC]
    RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]

It's almost funny, because it was a rewrite that rewrites back to itself... We found in Webmaster Tools that the site could not be indexed by the Googlebot because it could not detect the robots.txt file. We didn't have one before, as we didn't really have much that needed to be excluded; we have added one now, for kicks really. The robots.txt file, though, was never the problem with regard to the bot accessing the site. Rather, it was the rewrite statement above that was blocking it. Not knowing what the deal was, we went into Webmaster Tools, then Health, then selected "Fetch as Google" to manually request that the site be re-indexed and see what was happening. After doing so, we clicked on Status, and it returned the following:

    HTTP/1.1 301 Moved Permanently
    Content-Length: 250
    Content-Type: text/html
    Location: http://www.mysite.com/
    Server: Microsoft-IIS/7.5
    MicrosoftOfficeWebServer: 5.0_Pub
    MS-Author-Via: MS-FP/4.0
    X-Powered-By: ASP.NET
    Date: Wed, 22 Aug 2012 02:27:49 GMT
    Connection: close

    <title>301 Moved Permanently</title> Moved Permanently. The document has moved here.

We fixed the screwed-up rewrite that found its way into the .htaccess file, but now our issue is that all of our pages have been severely penalized relative to where they ranked just before the incident. We are essentially freaking out, because we don't know the real consequences of this, or if and how long it will take for the affected pages to regain their prior ranks. Typical pages went down anywhere between 9 and 40 positions on high-volume search terms. So, to say the least, our company is already discussing the possibility of fairly large layoffs based on the drop in traffic we anticipate. This sucks because these are people's lives, but then again a business must make money, and if you sell less you have to cut overhead, and the easiest cut is payroll. I'm on a team with three other people who work to keep the SEO side up to snuff as much as we can, and we sell high-ticket items, so the potential effects, if Google doesn't restore matters, could be significant.

My question is: what would you guys do? Is there any way to contact Google about such a matter? If there is, I've never seen it. I'm sure the pages that are missing from the index now might make their way back in, but what will their rank look like next time? And with that type of rewrite, has it permanently affected every page site-wide, including those still in the index but severely hit? Would love to see things bounce back quickly, but I don't know what to expect, and neither do my counterparts. Thanks for any speculation, suggestions, or insights of any kind!
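For anyone hitting the same thing: the usual defensive form of the non-www to www redirect matches any host that isn't already the www variant, so the rule can never re-fire on its own output (mysite.com is the placeholder from the question):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]
```
-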
Is my page footer the reason keyword rankings have dropped?
Hi all, One of my sites http://henstuff.com/ has seen some ranking drops for major keywords over the past few weeks and I was wondering if it was something to do with Penguin not taking a positive view of link-filled footers. It is something we are looking at phasing out but wanted to get the opinions of the SEOMOZ community. Thanks! Rob
-
Are FB Likes and G+'s on a page transferred by a 301?
Hi, as we all know, social signals like FB Likes and G+'s can give a boost to search engine rankings. I have several pages that have garnered hundreds of FB Likes and tens of G+'s. Now I want to reorganize my site so that the blog content will live at different URLs, so I have planned 301 redirects from the old content to the new. But do the FB Likes and G+'s also carry over with the 301 redirect? If not, what is the best way to handle this? Warm Rgds, Avinash MB
-
Google +1 link on Domain or Page?
Since its release, I've seen the Google +1 button used across an entire domain while only referencing the root href in the code snippet. At the same time, other sites use +1 more naturally, with the button being specific to the page you're on. What's your take on this? To clarify: do you point the button's href at the root domain, or at the URL of each individual page?