Quality links are beneficial, but are neutral links detrimental?
-
So obviously a link profile featuring quality / authoritative / relevant inbound links is preferable, but here's my question:
If I'm starting work on a brand new domain, should I build links that one would consider neutral (i.e. from a non-spammy, but unrelated site) or should I not bother and only focus on quality links?
Thanks
-
Try to do your best to stick to quality websites for links, but most importantly make sure the link is tied to relevant content. Beyond that, I always look at three factors when choosing places to reach out to:
- Do they have a domain MozTrust above 2?
- Are they currently indexed in Google?
- Is the content focused, relevant, and to the point?
Also, if you ever want a sanity check on your link profile, http://www.linkdetox.com/ does an awesome job of automating the process of flagging the good, suspicious, and toxic links in your profile. Plus it is only $40 for the tool, which is totally worth it!
Good luck!
- Kyle
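The three-factor checklist above could be sketched as a simple filter over outreach prospects. Everything below is hypothetical: the field names, the prospect data, and the idea that you already have MozTrust and index-status values on hand (in practice they would come from your own research or tooling):

```python
# Hypothetical sketch of the three-factor outreach filter described above.
# Field names and prospect data are made up for illustration.

def qualifies(site):
    """Return True if a prospect passes all three checks."""
    return (
        site["moztrust"] > 2           # domain MozTrust above 2
        and site["indexed_in_google"]  # currently indexed in Google
        and site["content_relevant"]   # focused, relevant content
    )

prospects = [
    {"domain": "example-blog.com", "moztrust": 4.1,
     "indexed_in_google": True, "content_relevant": True},
    {"domain": "spammy-directory.net", "moztrust": 1.2,
     "indexed_in_google": True, "content_relevant": False},
    {"domain": "deindexed-site.org", "moztrust": 3.0,
     "indexed_in_google": False, "content_relevant": True},
]

qualified = [p["domain"] for p in prospects if qualifies(p)]
print(qualified)  # ['example-blog.com']
```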
Related Questions
-
Internal link structure for my loan website
Hi folks. I own a Norwegian consumer loan/financing website, which has been monetized with links. I've created various silos for my content, according to what I believe is most relevant to the user.
Technical SEO | | llevy
However, as a result each article now has a sidebar list, which in turn links to all other articles within the same category (silo). As you can see here, it has about 30 links in the sidebar: forbrukslån.no/beste-lån. With 30 articles in a silo, that corresponds to over 900 internal links in just one silo alone. I wonder if this could be hurting me SEO-wise? I know G cares a lot about relevance and user experience, so I have a feeling it could be interpreted as spammy. The reason I did this in the first place is that the header links are also being repeated on all pages, without any issue. -
Does reciprocal linking carry any value?
No matter how much I research this one, there's no definite answer and there are a lot of contradictions. Basically, we're looking to launch an article on 24 expert interior design tips for 2015. Each tip is submitted by a different interior designer we have chosen, each with a reputable, trusted website. The main goal of this article is to generate various inbound links to our site from the designers, and it will help create engagement on social media. But if we're giving out links to these designers for their contributions, will the inbound links we receive in return be of little or no value, since this is reciprocal linking? Some say this is okay as it's completely natural within blog posts; others say to avoid it as it can be seen as an obsolete practice to deceive Google. Does anyone have any more information on this and how it should be carried out? Would a better process be to link to their social media accounts rather than reciprocal linking? Thanks
Technical SEO | | Jseddon920 -
Disavow links and domain of SPAM links
Hi, I have a big problem. For the past month, my company website has been scraped by hackers. This is how they do it:
1. Hack unmonitored sites and/or sites that are still using old versions of WordPress or other out-of-the-box CMSes.
2. Create spam pages with links to my pages, plus plant trojan horses and scripts to automatically grab resources from my server. Some sites were directly uploaded with pages from my site.
3. Create pages with titles, keywords, and descriptions consisting of my company brand name.
4. Use the HTTP referrer to redirect Google search results to competitor sites.
What I have done so far:
1. Blocked the identified sites' IPs in my WAF. This prevented those hacked sites from grabbing resources from my site via scripts.
2. Reached out to webmasters and hosting companies to remove the affected sites. Currently this is not very effective, as many of the sites have no webmaster. Only a few hosting companies respond promptly; some don't even reply after a week.
The problem now is: when I realized this was happening, there were already hundreds if not thousands of sites being used by the hacker. Literally tens of thousands of pages have been crawled by Google, and the hacked or scripted pages with my company brand title, keywords, and description have already been indexed. Routinely, every day, I am removing and disavowing, but there are just so many of them now indexed by Google. Questions:
1. What is the best way forward for me to resolve this?
2. Disavow links and domains: does disavowing a domain mean all the links from that domain are disavowed?
3. Can anyone recommend an SEO company which has dealt with such an issue before and successfully rectified it?
Note: SEAGM is a company-branded keyword.
Technical SEO | | ahming7770 -
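On question 2 above: per Google's disavow file format, a `domain:` line disavows all links from that domain, so you don't need to list its URLs individually. The file is plain text, one entry per line, with `#` marking comments. The domain names below are placeholders, not real examples from this case:

```text
# Disavow file uploaded via Google Search Console.
# Lines starting with "#" are comments.

# Disavow a single URL:
http://spam.example.com/hacked-page.html

# Disavow an entire domain (covers all links from it):
domain:shadyseo.example.com
```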
Internal Links
OSE is reporting that I don't have any internal links to my homepage. In the header on every page is my logo, in the top left-hand corner, which links back to my homepage. Shouldn't that mean every page links to the home page? Similarly, internal pages linked from my main nav aren't showing up as having any internal links in OSE. Any ideas?
Technical SEO | | Santaur0 -
Number of links you should have on a taxonomy term??
According to SEOmoz, my taxonomy terms contain more than 100 links (links to articles, in my case), and it tells me that I should reduce that number. I have seen a video by Matt Cutts, the Google software engineer, in which he said that Google's engine has improved dramatically and 100 is not the limit anymore. What do you guys think is the best practice here? To clarify the subject even more: I want to understand this from a link-juice perspective. Does it affect how link juice is distributed? Let's say I have 5 taxonomy terms, all of them have 200 articles, and these 5 terms are listed on the home page of a PR7 website. In this case, some of the PR will be passed to these 5 taxonomy terms. However, if I increase the taxonomy terms to 10, then I will reduce the links to 100 per term, but the PR will be distributed even more. This means each taxonomy term will have even less PR value. Am I wrong? Any ideas?
Technical SEO | | mertsevinc0 -
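The trade-off in the question can be checked with back-of-the-envelope arithmetic. The sketch below uses a deliberately naive model in which a page's equity is split evenly among its outbound links; real PageRank adds damping, iteration, and many other signals, and the equity values are hypothetical units:

```python
# Naive link-equity model: a page's equity is split evenly among its
# outbound links. This ignores damping and every other real-world
# signal -- it only illustrates the arithmetic in the question.

def equity_per_link(page_equity, num_links):
    """Equity passed through each of num_links outbound links."""
    return page_equity / num_links

homepage_equity = 1.0  # hypothetical units

# Scenario A: 5 taxonomy terms on the homepage, 200 articles each
term_equity_a = equity_per_link(homepage_equity, 5)
article_equity_a = equity_per_link(term_equity_a, 200)

# Scenario B: 10 taxonomy terms, 100 articles each
term_equity_b = equity_per_link(homepage_equity, 10)
article_equity_b = equity_per_link(term_equity_b, 100)

# Each term page gets less in Scenario B, but each article page
# ends up with the same share in this simple model.
print(article_equity_a, article_equity_b)
```

In this simplified model the per-article equity is identical either way (1/1000 of the homepage's), which suggests the split is more about crawlability and user experience than raw equity distribution.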
Adding my web link on Wikiquote: is it OK?
If I insert my link on Wikiquote on an appropriate page, is it positive or negative for SEO?
Technical SEO | | rimon56930 -
Link Profile, is the keyword ratio too high?
Hi, our website (www.NutritionMission.co.uk) has dropped from 6 to 30 in the rankings for our main keyword (Nutritional Therapy). Pulling a spreadsheet of all the inbound links shows that 41% of the anchor text is related to Nutritional Therapy. Is this ratio too high for the new Google update? There are also a lot of directory submissions from the SEO people we were paying before. Can anyone point me in the right direction to work things out? e.g. adding more links so the ratio of that anchor text is lowered, etc. Kind Regards, Ian.
Technical SEO | | ianwr0 -
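The 41% figure in the question is a simple proportion, and recomputing it after adding links is straightforward. The sketch below uses made-up anchor texts; in practice the list would come from a backlink export (e.g. the spreadsheet mentioned above):

```python
# Sketch: share of inbound links whose anchor text contains the
# target keyword. The anchor data here is hypothetical.

def keyword_anchor_ratio(anchors, keyword):
    """Fraction of anchors containing the keyword (case-insensitive)."""
    if not anchors:
        return 0.0
    matches = sum(1 for a in anchors if keyword.lower() in a.lower())
    return matches / len(anchors)

anchors = [
    "Nutritional Therapy",
    "nutritional therapy experts",
    "click here",
    "NutritionMission",
    "health advice",
]
ratio = keyword_anchor_ratio(anchors, "nutritional therapy")
print(f"{ratio:.0%}")  # 40% with this sample data
```

Running the same calculation after each batch of new branded or generic-anchor links shows how quickly the exact-match ratio dilutes.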
Internal linking with Old Content
Hello, I have a sports website where users write their opinions about the sporting events that take place every day throughout the year. Each of these sporting events generates a new page, with a URL indicating the match and date. For example: www.domain.com/baseball/boston-v-yankees-04-24-2012-1234.html. The teams face each other several times a year, and each match creates a different URL or page. I would like to link old pages to new pages and vice versa. How would you recommend these pages be linked? Linking them all to each other, linking old pages to the new pages as they are generated, or otherwise? I would appreciate your guidance and help with this case. Thank you.
Technical SEO | | NorbertoMM1