Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Percentage of good links vs. bad
-
Hi
Does anyone know the best way of distinguishing good links from bad links using the SEOmoz tools?
I bought some directory links to two or three pages on my site a few years back. They were all very obviously spammy because of the anchor text, and I didn't have a high enough ratio of good links to counteract them.
I read somewhere that if more than 10% of the links to a page have the same (or similar) anchor text, it's obvious that you're on the bad list.
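One way to sanity-check that ratio against your own link profile is to export your backlinks and tally how much of the total each distinct anchor text accounts for. A minimal sketch, assuming you already have the anchor-text column as a list of strings (the toy data and the 10% threshold are illustrative, not an official Moz metric):

```python
from collections import Counter

def anchor_text_shares(anchors):
    """Share of total links held by each distinct anchor text, largest first."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return [(anchor, n / total) for anchor, n in counts.most_common()]

# Toy data standing in for the anchor-text column of a backlink export:
anchors = ["cheap widgets", "cheap widgets", "Cheap Widgets",
           "example.com", "brand name", "cheap widgets"]
for anchor, share in anchor_text_shares(anchors):
    print(f"{anchor}: {share:.0%}")
```

Here "cheap widgets" comes out at 67% of links, far past the 10% rule of thumb; whether that threshold itself means anything is, as the answer notes, speculation.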
-
What is considered low domain and page authority?
-
There is really no set percentage of anchor text or links that will get you penalized. Everything is speculation and best practice.
Everything is up to the webmaster, and usually they know which links are bad. It is best to be honest.
As for checking bad links, they are usually the ones with low Domain Authority and Page Authority. You should extract your backlink report from OSE and weed out the low-quality links (low DA/PA).
Good luck!
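To triage a large OSE backlink export mechanically, you could filter rows below chosen DA/PA cutoffs. A rough sketch using Python's csv module; the column names ("URL", "Domain Authority", "Page Authority") and the cutoffs are assumptions, so adjust them to match your actual export:

```python
import csv
import io

def low_quality_links(csv_text, min_da=20, min_pa=20):
    """Return the URLs of rows whose DA or PA falls below the (arbitrary) cutoffs."""
    reader = csv.DictReader(io.StringIO(csv_text))
    flagged = []
    for row in reader:
        da = float(row["Domain Authority"])
        pa = float(row["Page Authority"])
        if da < min_da or pa < min_pa:
            flagged.append(row["URL"])
    return flagged

# Toy export with hypothetical column names:
sample = """URL,Domain Authority,Page Authority
http://spammy-directory.example/page,12,9
http://news.example/article,55,48
"""
print(low_quality_links(sample))
```

This only produces a shortlist to review by hand; as the answer says, the final call on which links are bad rests with the webmaster.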
Related Questions
-
Unsolved: Is Moz Able to Track Internal Links Per Page?
I am trying to track internal links and identify orphan pages. What is the best way to do this?
Moz Pro | WebMarkets
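Whatever tool produces the crawl data, the orphan-page check itself is a set difference: the pages you know exist minus the pages that receive at least one internal link. A minimal sketch, assuming you already have the crawl as a mapping of each page to the pages it links to (in practice this would come from a crawler or a Moz CSV export):

```python
def find_orphans(link_graph, all_pages):
    """Pages in the site inventory that no other page links to internally."""
    linked_to = {target
                 for targets in link_graph.values()
                 for target in targets}
    return sorted(set(all_pages) - linked_to)

# Hypothetical mini-site:
link_graph = {
    "/": ["/about", "/blog"],
    "/blog": ["/blog/post-1"],
}
all_pages = ["/", "/about", "/blog", "/blog/post-1", "/old-landing-page"]
print(find_orphans(link_graph, all_pages))
```

Note the homepage shows up too, since nothing links to it internally; you would typically exclude it (and other known entry points) from the result.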
GWMT / Search Analytics VS OpenSiteExplorer
Just had the experience of using OSE data to show what we call "linkrot" to a client -- only to find that GWMT / Search Analytics shows no such thing. Fortunately the client is an old friend and no face was lost, but it was dicey there for a bit, as I have come to rely on and reference OSE again and again. OSE showed Domain Authority dropping by about 1/3 in the last 12 months, presumably due to old links getting broken, linking sites changing their architecture, etc. And of course, ranking is tanking, as you would expect. But Google shows many more (and much more spammy-looking!) backlinks. Has anyone had any experience benchmarking the 2 data sets of backlinks against each other? Dr Pete? Does one update more frequently than another? Do you trust one more than another? If so, why? Thanks!
Moz Pro | seo_plus
Moz vs SEMrush - Keyword Difficulty and Keywords With Low Competition
Hi, my question is very focused on these 2 tools. Is it correct to understand that Moz tells you keyword difficulty but not which keywords are easy to compete for, while SEMrush tells you which keywords are easy to compete for but not which are difficult? I mean, each one misses the part the other covers. Hope someone will enlighten me on this point. Best
Moz Pro | Sequelmed
What are Linking C-Blocks?
Currently I am using the Moz Pro tool. Under Moz Analytics >> Moz Competitive Link Metrics >> History, there is a graph called "Linking C-Blocks". Please help me understand Linking C-Blocks: what they are, how to build them, how to define them...
Moz Pro | shankar333
Potential spam websites with high DA linking back to us
Hey everybody, I'm going through all my sites and disavowing crap links. However, I'm having trouble distinguishing which high-DA sites to disavow. What would you do? For example:
https://moz.com/researchtools/ose/spam-analysis?site=busca.starmedia.com&target=domain&source=subdomain&page=1&sort=spam_score and https://moz.com/researchtools/ose/spam-analysis?site=cc879fe.activerain.com&target=domain&source=subdomain&page=1&sort=spam_score They both have tons of backlinks - both good and crap. The first has a DA of 72 and a Moz spam score of 4/17, and the second has a DA of 86 and a Moz spam score of 9/17.
Moz Pro | MEllsworth
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?
I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication. I am in the process of re-setting up my campaigns to get better data than I have been getting -- I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better, such as all on the same day, making sure I've set it to include www or not for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what all the options are to get good data in light of the 50,000-page crawl limit.
Here is an example of what I mean. To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence. www.epa.gov has 560,000 pages -- if I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get? www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely, and I'll get a complete picture of this air-focused sub-folder ... but (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true - I'd also be tracking other sites as competitors - e.g. non-profits that advocate in air quality, industry air quality sites - and maybe those competitors count towards the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?)
Any opinions on which I should do in general in this kind of situation? The small sub-folder vs. the full humongous site vs. is there some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad
Noindex/nofollow on blog comments: is it good or bad?
Hi, I changed the design of one of my WordPress websites at the beginning of the month. I also added a "Facebook SEO comments" plugin to rewrite Facebook comments as normal comments. As most of the website's comments are Facebook comments, I went from 250 noindex/nofollow comments to 950; the URLs are ?replytocom=4822, etc. My Moz campaign noticed it, and I'm asking myself: is it good to have comments in noindex/nofollow? Should I do something about this? Erwan.
Moz Pro | johnny122
How can I reduce the number of links on a page and keep the site easy to navigate?
The SEOmoz Site Crawl indicates that we have too many on-page links on over 9,970 pages. This is an ecommerce site with a large number of categories. I have a couple of questions regarding this issue: How important is the "too many on-page links" factor to SEO? What are some methods of reducing the number of links when there are a large number of categories? We currently have main categories with dropdown menus, and we have found that they are used to browse and shop the store.
Moz Pro | afmaury
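To see which templates are pushing pages over the flagged link count before restructuring any navigation, one rough approach is to count the href-carrying anchor tags on a page directly. A sketch using Python's built-in html.parser (the page markup here is made up for illustration):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that actually carry an href attribute."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    counter = LinkCounter()
    counter.feed(html)
    return counter.count

html = '<nav><a href="/a">A</a><a href="/b">B</a></nav><a name="x">anchor</a>'
print(count_links(html))  # prints 2 -- the name-only anchor is not counted
```

Running this over a category page and a product page separately shows whether the excess comes from the dropdown navigation or from the page body, which is usually the first thing to establish before trimming anything.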