When to stop link building because page authority is low - Open Site Explorer
-
Hi,
I'm link building with Open Site Explorer, and I'm really picky about getting links from only high-quality sites.
When do you stop going down the list of possible backlink providers because the page authority is too low?
I usually stop at 40, but what do you do, why, and what does it depend on?
-
Lots of people come to these forums looking for numbers and formulas.
If the link is going to be on what looks like a quality site, then it doesn't matter what the page authority is...
...and if the site has a high page authority but looks like a crappy, manipulative site, then why would you want a link on it?
-
Disregard page strength, acquire links.
...as long as the page is relevant. The fact that cheaper forms of SEO like article marketing are still effective proves that low page strength shouldn't sap your efforts so quickly. I use page strength more as an indicator of where acquiring a link should sit on my priority list, if it's possible at all, and again assuming the page is relevant.
I understand not wanting to be associated with bad neighborhoods, but page strength is not a strong indicator of a bad neighborhood. One common SEO tactic is to run contests, correct? One recent contest I developed required participants to review a product, and naturally that meant linking to the client's website. Most of these reviews came from personal blogs and general websites that did not have a lot of domain authority, yet it was still a boost to rankings and domain authority.
Use your own judgement: look at a website and decide whether you want to be associated with it. Don't pick and choose based on website popularity alone.
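If you wanted to script that "priority list" triage, here's a minimal sketch in Python. It assumes a CSV of prospects with url, page_authority, and a hand-judged relevant column; those column names are my own placeholders, not the actual Open Site Explorer export format.

```python
import csv

def prioritize_prospects(csv_path):
    """Order link prospects: relevance first, page authority second.

    Assumes a CSV with 'url', 'page_authority', and a hand-judged
    'relevant' (yes/no) column -- placeholder names, not the real
    Open Site Explorer export format.
    """
    with open(csv_path, newline="") as f:
        prospects = list(csv.DictReader(f))

    # Relevant pages always outrank irrelevant ones; page authority
    # only decides the order *within* each group.
    return sorted(
        prospects,
        key=lambda p: (p["relevant"].lower() != "yes",
                       -float(p["page_authority"])),
    )

for p in prioritize_prospects("prospects.csv"):
    print(p["url"], p["page_authority"], p["relevant"])
```

The point of the sort key is exactly the post's argument: page strength never disqualifies a relevant prospect, it just decides how soon you get to it.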
-
Someone else may be able to give you an 'absolute stopping point', but I really play it by ear: if something looks relevant and good, then I'll try to get a link.
I obviously wouldn't put as much effort into getting a link with a lower PA/DA as into a higher one. If I were to set a lower limit, I'd always use domain authority rather than page authority, as to me it's a much better indicator, especially if you're looking for a link on the site in general, not just specifically on the same page as your competitor's link.
If I were to use a lower limit, I'd probably stick around the 40 mark for domain authority before a link is worth the time and effort, but I would still consider a lower DA if the link was relevant and fairly easy to get.
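That rule of thumb is simple enough to write down. A hedged sketch, where the DA floor and the relevant/easy flags are all hand judgements rather than tool output:

```python
def worth_pursuing(domain_authority, relevant, easy_to_get, da_floor=40):
    """Triage rule from the post above: pursue anything at or above
    the DA floor; below it, only links that are both relevant and
    cheap to acquire. All inputs are hand judgements, not tool output."""
    if domain_authority >= da_floor:
        return True
    return relevant and easy_to_get

print(worth_pursuing(55, relevant=False, easy_to_get=False))  # True
print(worth_pursuing(25, relevant=True, easy_to_get=True))    # True
print(worth_pursuing(25, relevant=False, easy_to_get=True))   # False
```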
-
Good response. That helps. Yes, I will use gut instinct. There's one thing, though - I don't want any new sites. One of my requirements is age. Does that change your response? Also, where do you absolutely stop? PA 25? PA 20? I'd like to have a stopping place to rely on.
-
A link can still be good regardless of page authority, simply because many other ranking factors determine the quality of a site, and they simply aren't available to OSE. A spammy directory could easily sit much higher in the OSE listings than a good-quality, content-rich site further down that's merely short on links.
I would use a combination of page authority and gut instinct. 40 is probably a good level to use as a marker; however, there may be some newish sites or good content sites further down that you can make a judgement call on. It's just that they may be fewer and farther between.
-
Related Questions
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites: I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication. I am in the process of re-setting up my campaigns to get better data than I have been getting -- I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set www or non-www for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what the options are for getting good data in light of the 50,000-page crawl limit.

Here is an example of what I mean. To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence. www.epa.gov has 560,000 pages -- if I put www.epa.gov into a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get?

www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely, and I am getting a complete picture of this air-focused sub-folder. But (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true: I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air quality sites, and maybe those competitors count towards the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?)

Any opinions on what I should do in general in this kind of situation? The small sub-folder vs. the full humongous site, or is there some other way to go here that I'm not thinking of?
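(One cheap way to size a site or sub-folder against the 50,000-page limit before building a campaign is to count the URLs in its XML sitemap. A rough Python sketch; it assumes the site actually publishes a standard sitemap at the guessed URL, which is an assumption on my part, not something Moz provides.)

```python
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(sitemap_url):
    """Count <url> entries in an XML sitemap, recursing into sitemap
    indexes. Rough sizing only; assumes the site exposes a standard
    sitemap at the given URL."""
    with urllib.request.urlopen(sitemap_url) as resp:
        root = ET.fromstring(resp.read())

    if root.tag.endswith("sitemapindex"):
        return sum(count_sitemap_urls(loc.text)
                   for loc in root.findall("sm:sitemap/sm:loc", NS))
    return len(root.findall("sm:url", NS))

# Compare a whole site (or a sub-folder's sitemap, if one exists)
# against the crawl limit. The sitemap URL here is a guess.
total = count_sitemap_urls("https://www.epa.gov/sitemap.xml")
print(total, "pages --", "over" if total > 50_000 else "under", "the 50k limit")
```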
Moz Pro | scienceisrad
-
Backlink profile from Open Site Explorer does not seem to update
I have been monitoring our backlink profile in Open Site Explorer for over a year now, and despite gaining a number of new domains linking to us, they are not reflected in the tool. Our URL is www.BlueLinkERP.com. Any thoughts on why this might be the case? The number of linking domains also seems very low compared with other tools we use (e.g. HubSpot).
Moz Pro | BlueLinkERP
-
How do I find the page with the link that returns a 404 error indicated in my crawl diagnostics?
Hi, newbie here. I am trying to understand what to do, step by step, after getting my initial reports back from SEOmoz. The first question is about the 404 errors shown as high priority to fix in crawl diagnostics.

I reviewed the support info on the crawl diagnostics page referring to 404 errors, but still did not understand exactly what I am supposed to do; same with the Q&A section when I searched how to fix 404 errors. I just could not understand exactly what anyone was talking about in relation to my 404 issues.

It seems I would want to find the page that has the bad link sending a visitor to a page not found, and then correct the problem by removing the link or by fixing and re-uploading the page being linked to. I saw some suggestions indicating that SEOmoz itself will not let me find the page where the bad link is, and that I would need some external program to do this. I would think that if SEOmoz found the bad page, it would also tell me what page the link(s) to the bad page exist on.

A number of suggestions were to use a 301 redirect somehow as the solution, but it was not clear when to do this versus just removing the bad link or repairing the page the link was pointing to. I think therefore my question is: how do I find the links that lead to 404 page-not-founds, and how do I fix the problem?

Thanks, Galen
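(The general approach being asked about: crawl your own site and, for every link that returns a 404, record the page it was found on. A bare-bones Python sketch, assuming requests and beautifulsoup4 are installed; the start URL is a placeholder, and a dedicated crawler such as Screaming Frog does the same job with less code.)

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"   # placeholder for your own site
HOST = urlparse(START).netloc

seen, broken = {START}, []
queue = deque([START])

while queue:
    page = queue.popleft()
    try:
        resp = requests.get(page, timeout=10)
    except requests.RequestException:
        continue
    if "html" not in resp.headers.get("Content-Type", ""):
        continue
    for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
        target = urljoin(page, a["href"]).split("#")[0]
        if urlparse(target).netloc != HOST or target in seen:
            continue  # stay on-site, visit each URL once
        seen.add(target)
        try:
            # Some servers reject HEAD; a GET fallback would be more robust.
            status = requests.head(target, timeout=10,
                                   allow_redirects=True).status_code
        except requests.RequestException:
            continue
        if status == 404:
            broken.append((page, target))  # 'page' is where the bad link lives
        else:
            queue.append(target)

for source, dead in broken:
    print(f"fix the link on {source} -> {dead}")
```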
Moz Pro | Tetruss
-
Open Site Explorer Social Media?
Open Site Explorer always reports only a few Facebook likes, but if I go to the business's Facebook page, it has 100+ likes. This seems to happen a lot. Is this an OSE issue, or is there an issue with the site not being clearly connected to its Facebook account as far as Google can see? If OSE doesn't see the likes, will Google? How can I fix it?
Moz Pro | JML1179
-
Link Building: Do I have to beat Linking Root Domains or Total Links?
Hello, when scoping out how to beat competitors, do I need to aim to beat the linking root domains number or the total links? Or do I need to think about both? Thanks!
Moz Pro | BobGW
-
Conflicting Data with Open Site Explorer & SEO Moz Toolbar
Hi there, why is my SEOmoz toolbar showing higher linking root domains than Open Site Explorer? For example, for this site: http://thisisouryear.com/, the linking root domains in SEOmoz show 1,015, while Open Site Explorer says it has 35. I have noticed this for other sites as well. Which one is right? And why are they different?
Moz Pro | qlkasdjfw
-
How fast can page authority be grown?
I understand that it is easier to rank for a particular keyword given a higher DA score. How fast can page authority be established and grown for a given keyword if DA is 10/20/30/50? What are the relative measures that dictate the establishment and growth of this authority? Can it be enumerated as a percentage of domain links, or as a percentage of domain links given an assumed C-block ratio?

For example: you have a website with a DA of 40, and you want to target a new keyword. The average PA of the top-ranked pages is 30, the average number of domain links is 1,000, and the average number of linking domains is 250. If you aim to build 1,000 links per month from 500 linking domains, how fast can you approximate the establishment of page authority for the keyword?
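(There is no public formula mapping link counts to PA growth, but the parity arithmetic implied by the question's own numbers is easy to make explicit. A back-of-the-envelope sketch, nothing more:)

```python
# Back-of-the-envelope parity math using the numbers in the question.
# This estimates when raw link counts match the competitor averages;
# PA growth itself has no published formula, so treat this as a
# floor on the timeline, not a prediction.
avg_competitor_links = 1_000     # average domain links of top-ranked pages
avg_competitor_domains = 250     # average linking domains
links_per_month = 1_000          # planned link velocity
domains_per_month = 500          # planned new linking domains

print(f"link-count parity in ~{avg_competitor_links / links_per_month:.1f} months")
print(f"linking-domain parity in ~{avg_competitor_domains / domains_per_month:.1f} months")
```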
Moz Pro | NickEubanks
-
Competitive .edu Research via Open Site Explorer
I was using Open Site Explorer and trying to figure out how my competitors are getting so many .edu links. Now, I won't mention any names here, but I am trying to figure out why almost all their links point to downloads of documents. Here are a few examples of the sites I keep finding...
www.cs.uoregon.edu/research/paracomp/tau/monitor/sheehan_examples/tim_context_xmpl?wiki=TailleMaximaleDesFichiersSurUnePartition/orphan
www.atmos.albany.edu/facstaff/mathias/video/090926_wima_klimawandel.ivr?url_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.jtitle=American Journal Clinical Nutrition&rft.atitle=Problems with red meat in the WCRF2&rft.volume=89&rft.spage=1274&rft.epage=9&rft.date=2009&rft.aulast=Truswell&rft.aufirst=AS&rfr_id=info%3Asid%2Fwiley.com%3AOnlineLibrary
faculty.unlv.edu/jensen/CEE_468/GISTutorialWorkbook/tutorial11/LandUsePgh.lpk?arrow=nws&read=24922
Moz Pro | MichealGooden