Do referring domains matter a lot for backlinks? What's Google's stance?
-
Hi,
It's a well-known principle that the quality of backlinks matters more than their quantity. Still, referring domains are quite different from individual links: many referring domains count for far more than many links from the same domain. On average, how much does the number of referring domains boost a website's authority? I'm not talking about low-quality domains, just the sheer number of referring domains, including ones that are irrelevant to the topic or industry.
Thanks
-
Hello vtmoz,
I think your question is very difficult to answer precisely. Domains are not all weighted the same; even high-quality domains are not weighted the same. It matters whether the links are coming from a DA 100 site or a DA 25 site. DA also gets harder to increase the higher a domain already is: if you have a DA 20 site and want to take it to DA 30, it will need a good mix of high-DA links. For every ten points after that, the quality of the links needs to be the same or higher, and you need exponentially more links from those higher-DA sites.
DA 100 sites like Facebook have millions of links.
If you want to see the link profiles of your competitors, use Open Site Explorer to research the higher-DA sites and see who links to them. Then try to emulate the links that are relevant to your niche.
Thanks,
Don
Related Questions
-
Is anyone else's ranking jumping?
Rankings have been jumping across 3 of our websites since about 24 October. Is anyone seeing similar? For example, ... jumps from position 5 to 20 on one day, then back to 5 for 3 days, and then back to 20 for a day. I'm trying to figure out whether it's algorithm-based or whether my rank checker has gone mad. I can't replicate the same results if I search incognito or in a new browser; everything always looks stable in the SERPs when I do the search myself.
Algorithm Updates | Marketing_Today
-
Bad Grammar's Effect on Rankings
Mozzers, I have a client whose brand style guide dictates that they write in all lowercase letters. Do you think this will hurt rankings? Nails
Algorithm Updates | matt.nails
-
Did a Google Update Happen?
I'm curious to know what we Moz folks have to say about an update on Google. Article here: http://searchengineland.com/big-google-search-update-happening-chatter-thinks-258142 Any ideas?
Algorithm Updates | Chenzo
-
Links to category pages unnatural?
If people are linking to your site, it would seem natural that the vast majority of those links would point to the homepage, product pages, or article/content pages. Let's say you have 100 links pointing to your site, and 40 of them point to category pages. Would this seem unnatural? Do Google or other search engines have a way of weighing this as a factor when determining whether the links are natural or not? Is there a rule of thumb when it comes to the pages that are linked to on your site?
Algorithm Updates | inhouseseo
-
Sitemap Question - Should I exclude or make a separate sitemap for old URLs?
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that are outdated by 10-15+ years, so I decided not to drop redirects on some of the irrelevant pages. People still hit the pages, but bounce... I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they do interfere with my new, relevant content. If I dropped these pages into a sitemap and set the priority to zero, would that possibly help? No redirects, the content is still valid for people looking for it, but maybe these old pages wouldn't show up above my new content? Currently the old stuff is excluded from all sitemaps... I don't want to make one and have it make the problem worse. Any advice is appreciated. Thx 😄
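For reference, a rough sketch of what a separate low-priority sitemap for those legacy pages could look like (the URL is just a placeholder, and the priority value is only a hint to search engines, not a directive):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- hypothetical legacy URL; 0.0 is the lowest allowed priority hint -->
    <loc>https://www.example.com/old-page-from-1995.html</loc>
    <priority>0.0</priority>
  </url>
</urlset>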
Algorithm Updates | Southbay_Carnivorous_Plants
-
How effective are nofollow links today (2013)?
Hi, We had a question about the effectiveness of nofollow today. Nofollowing some links on pages was meant to make sure PageRank flows to the content which is most relevant and useful to visitors on the site. Looking at the 2009 article, http://www.seomoz.org/blog/google-says-yes-you-can-still-sculpt-pagerank-no-you-cant-do-it-with-nofollow, it seems that adding the nofollow meta tag would no longer help us in ensuring this goal. We had a couple of questions:
1. Do you think Google today only passes PageRank to dofollow links?
2. Are sites today using iframes/JavaScript to make sure Googlebot passes PageRank only to relevant pages?
3. Any other best practices you would suggest? Thanks
Algorithm Updates | SEMEnthusiast
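For clarity, the two nofollow mechanisms discussed above look roughly like this (the URL and anchor text are placeholders):
<!-- link-level nofollow: applies only to this one link -->
<a href="https://www.example.com/page" rel="nofollow">anchor text</a>
<!-- page-level nofollow via the robots meta tag: applies to every link on the page -->
<meta name="robots" content="nofollow">
-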
Sub-Links of Organic SERP
I would like to know if you can modify (or suggest) the sub-links under an organic listing. For example: Main Link/Title = COMPANY NAME - What We Do.... Sub-links (popular pages within the site) currently include links like: Locations / Catalog Request / Bestsellers. Is it possible to suggest other pages as sub-links, or do the search engines determine these? Please advise, and thanks in advance....
Algorithm Updates | WhiteCap
-
Removing secure subdomain from the Google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as:
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
We have noticed that Google is crawling these secure pages and then duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content), e.g. https://secure.domain.com/etc. Our webmaster recently implemented a disallow-all robots.txt file specifically for the secure subdomain; however, these duplicated secure pages remain in the index:
User-agent: *
Disallow: /
My question is: should I request that Google remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL Removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages being indexed, and all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you
Algorithm Updates | marketing_zoovy.com
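For reference, robots.txt is fetched separately for each hostname, so the secure subdomain needs its own file. A rough sketch of the two files described above (paths copied from the question; the www hostname is assumed):
# https://www.domain.com/robots.txt (main site)
User-agent: *
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
# https://secure.domain.com/robots.txt (secure subdomain, disallow everything)
User-agent: *
Disallow: /
Note that robots.txt only controls crawling, not indexing, which is why already-indexed URLs can linger in the index after the disallow is added.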