What is the difference between "Referring Pages" and "Total Backlinks" [on Ahrefs]?
-
I always thought they were essentially the same thing myself, but it appears there may be a difference?
Anyone care to help me out?
Cheers!
-
Referring pages is the number of pages (URLs) pointing to your site.
Total backlinks is the total number of links pointing to your site.
Example: 10 pages each have 2 links to your site.
Referring Pages: 10
Total Backlinks: 20
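To put the same arithmetic in code, here is a minimal Python sketch (hypothetical data, not Ahrefs' actual API) of how both metrics come from the same list of links:

```python
# Each backlink is recorded as (source page URL, target URL on your site).
backlinks = [
    ("https://example-blog.com/post-1", "https://yoursite.com/"),
    ("https://example-blog.com/post-1", "https://yoursite.com/about"),
    ("https://another-site.com/review", "https://yoursite.com/"),
    ("https://another-site.com/review", "https://yoursite.com/pricing"),
]

# Total backlinks: every individual link counts.
total_backlinks = len(backlinks)

# Referring pages: each source page counts once, however many links it contains.
referring_pages = len({source for source, _target in backlinks})

print(f"Total Backlinks: {total_backlinks}")  # 4
print(f"Referring Pages: {referring_pages}")  # 2
```
-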
Hi Paul,
There are many things this could mean to Ahrefs, so I can only really suggest contacting their support to get the real meaning of it. I could make educated guesses, but they could still be wrong, and I would hate to misinform you.
-Andy
Related Questions
-
How handle pages with "read more" text query strings?
My site has hundreds of keyword content landing pages that contain one or two sections of "read more" text, which work by calling the page again with a ChangeReadMore variable. This causes the page to currently get indexed 5 times (see the examples below, plus two more with the anchor tag set to #sectionReadMore2). Google includes the first version of the page, which is the canonical version, and excludes the other 4 versions. Google Search Console says my site has 4.93K valid pages and 13.8K excluded pages. My questions are:
1. Does having a lot of excluded pages which are all copies of included pages hurt my domain authority or otherwise hurt my SEO efforts?
2. Should I add a rel="nofollow" attribute to the "read more" link? If I do this, will Google reduce the number of excluded pages?
3. Should I instead add logic so the canonical tag displays the exact URL each time the page re-displays in another readmore mode? I assume this would increase my "included pages" and decrease my "excluded pages". Would this somehow help my SEO efforts?
EXAMPLE LINKS
https://www.tpxonline.com/Marketplace/Used-AB-Dick-Presses-For-Sale.asp
https://www.tpxonline.com/Marketplace/Used-AB-Dick-Presses-For-Sale.asp?ChangeReadMore=More#sectionReadMore1
https://www.tpxonline.com/Marketplace/Used-AB-Dick-Presses-For-Sale.asp?ChangeReadMore=Less#sectionReadMore1
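For reference, here is a minimal Python sketch (it assumes nothing about the site's actual ASP code) contrasting the two canonical strategies mentioned in question 3: consolidating every ChangeReadMore variant onto the base URL versus letting each variant reference itself:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def consolidated_canonical(url: str) -> str:
    """Strip the ChangeReadMore parameter and the fragment so every
    'read more' variant declares the base page as its canonical."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != "ChangeReadMore"]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(query), ""))

url = ("https://www.tpxonline.com/Marketplace/Used-AB-Dick-Presses-For-Sale.asp"
       "?ChangeReadMore=More#sectionReadMore1")

print(consolidated_canonical(url))
# https://www.tpxonline.com/Marketplace/Used-AB-Dick-Presses-For-Sale.asp

# A self-referencing canonical (the alternative in question 3) would instead
# echo the requested URL back, minus the fragment, for each variant.
```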
Technical SEO | DougHartline0 -
Google Search Console "Text too small to read" Errors
What are the guidelines / best practices for clearing these errors? Google has some pretty vague documentation on how to handle this sort of error. User behavior metrics in GA are pretty much in line with desktop usage and don't show anything concerning. Any input is appreciated! Thanks!
Technical SEO | Digital_Reach2 -
Ranking penalty for "accordion" content -- hidden prior to user interaction
Will content inside an "accordion" module be ranked as non-hidden content? Is there an official guideline from Google and other search engines addressing this?
Example of an accordion element: https://v4-alpha.getbootstrap.com/components/collapse/#accordion-example
Will all elements in the example above be seen and treated equally by search engines?
Technical SEO | houlihanlokey1 -
My beta site (beta.website.com) has been inadvertently indexed. Its cached pages are taking traffic away from our real website (website.com). Should I just "NO INDEX" the entire beta site and if so, what's the best way to do this? Please advise.
My beta site (beta.website.com) has been inadvertently indexed. Its cached pages are taking traffic away from our real website (website.com). Should I just "NO INDEX" the entire beta site and if so, what's the best way to do this? Are there any other precautions I should be taking? Please advise.
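A site-wide "noindex" is usually delivered either as a robots meta tag on every page or as an X-Robots-Tag response header for the whole host. The beta site's stack isn't stated, so the snippet below is only a sketch of the header approach, using Flask as a stand-in for whatever actually serves beta.website.com:

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def add_noindex_header(response):
    # Attach the directive to every response served from the beta host,
    # telling crawlers to drop these pages from the index.
    response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

@app.route("/")
def home():
    return "Beta site"
```

One caveat worth noting: if robots.txt blocks crawling of the beta host entirely, crawlers never see the noindex directive, so already-indexed pages can linger in the index.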
Technical SEO | BVREID0 -
Duplicate page errors from pages that don't even exist
Hi, I am having this issue within SEOmoz's Crawl Diagnosis report. There are a lot of crawl errors happening with pages that don't even exist. My website has around 40-50 pages, but the SEO report shows that 375 pages have been crawled. My guess is that the errors have something to do with my recent htaccess configuration. I recently configured my htaccess to add a trailing slash at the end of URLs. There is no internal linking issue, such as an infinite loop when navigating the website, but the looping is reported in the SEOmoz report. Here is an example of a reported link:
http://www.mywebsite.com/Door/Doors/GlassNow-Services/GlassNow-Services/Glass-Compliance-Audit/GlassNow-Services/GlassNow-Services/Glass-Compliance-Audit/
By the way, there is no such crawl error in my Google Webmaster Tools. Any help appreciated.
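For what it's worth, looping URLs like the one above often arise when relative internal links get re-resolved against the new trailing-slash URLs, so each crawl step nests the path one level deeper. A quick Python illustration of that mechanism (the relative link is hypothetical, not taken from the actual site):

```python
from urllib.parse import urljoin

page = "http://www.mywebsite.com/Door/Doors/GlassNow-Services/"
# Suppose the page contains a relative link such as:
relative_link = "GlassNow-Services/Glass-Compliance-Audit/"

next_url = urljoin(page, relative_link)
print(next_url)
# http://www.mywebsite.com/Door/Doors/GlassNow-Services/GlassNow-Services/Glass-Compliance-Audit/
# A crawler that follows this URL, finds the same relative link again, and
# repeats the join will keep generating deeper paths that never really exist.
```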
Technical SEO | mmoezzi0 -
If my home page never shows up in SERPs but other pages do, does that mean Google is penalizing me?
So the website I do local SEO for, xyz.com, is finally getting better on some keywords (thanks, SEOmoz!), but only pages like xyz.com/better_widgets_ or xyz.com/mousetrap_removals. Is Google possibly penalizing me for some duplicate content websites I have out there (working on it, I know, I know, it is bad)...
Technical SEO | greenhornet770 -
Authorship Markup worth it for "invisible" authors
Greetings everyone!
Background: I help run multiple continuing education sites for Allied Health professionals. Our editors do a great job of getting some of the best authors in their respective fields to come onto the site and present webinars, and we publish articles around those presentations. I would love to be able to use the rel=author tag on these sites, as the authors we use help improve our credibility when a user is on the site, and I would like to take advantage of this in the SERPs. The issue is that while most of these authors are leaders in their respective fields and have published in many academic publications, they are not on Facebook or Twitter, let alone Google+. Also, they are probably not interested in setting up a G+ profile. They are "famous" and well published within their fields, yet they are somewhat "invisible" on the web. We are looking to implement author bios on our site and could then use the rel=author tag internally, so that seems like a good first step. The question is then around linking out with rel=me to any profiles (FB, Twitter, G+). The issue is that, as I mentioned above, those online profiles are pretty scarce.
Question / Discussion: Is it worth it to set up all the authorship markup to internal bios on a site when many of the authors are "invisible" on G+, Twitter, FB, etc., so I will be limited in how I can link rel=me to those profiles? If a Google+ profile is not available for an author, what do you prefer to link to? Would you say FB over Twitter, since FB has more users, or if a user has both profiles but uses Twitter more often, would you link to the Twitter profile instead? Many of these authors work at a university and have a bio page on the university website; would it be worth linking to that profile? How do you judge the "best" place to link to if there is no Google+ profile? Thanks!
Technical SEO | CleverPhD0 -
Too many on-page links for WP blog page
Hello, I have set my WP blog to a page, so new posts go to that page, making it the blog. On an SEOmoz campaign crawl, it says there are too many links on one page. Does this mean that, as I post my blog posts to this page, the search engines are seeing the page as one page with links instead of as individual blog posts? I worry that if I continue to add more posts (which obviously I want to), the links will increase more and more, meaning that they will be discounted due to too many links. What can I do to rectify this? Many thanks in advance.
Technical SEO | mozUser14692366292850