Can spammy links affect indexing?
-
Meaning: if a site has a lot of bad-quality links (directories, blog comments) that are producing great rankings for some terms on its homepage, could the low quality of those links negatively affect the crawl frequency of interior pages, or perhaps even give interior pages a ranking penalty?
-
This is a topic I feel needs extensive testing. As you have said, Google has not commented on the subject in any detail beyond best practice (with respect to relevant linking), but there is methodology out there suggesting that the comparative quality of inbound links does have an effect - Eric Ward and AdGooroo are avid promoters of the "domain profile" of inbound links.
The penalty side of things has only come into play recently with the Panda update - many sites that rely solely on syndicated content to exist have been penalised, and as a result, perfectly legitimate sites that have syndicated their content out with links in it have suffered.
Of course the efficiency factor comes into play too - the time and effort needed to gain a high-quality link, versus the "loads of links in 10 minutes" approach through directories/blog comments/syndication. We all know that 10 high-quality, relevant .edu links will earn far more brownie points than 100 "easy" links from non-relevant sources.
I suspect that if you took a large site with a high number of backlinks and reduced the number of spammy links, you might, over time, see an improvement in ranking. This is an assumption though; it would be interesting to hear if anyone has experimented.
-
There is some discussion on this same topic you can take a look at: http://www.seomoz.org/q/few-high-quality-links-or-a-plethora-of-mediocre-links
Most SEO experts will share the same advice: "you want to build your site over time with high-quality links". I completely agree, but many sites would like a boost to get started. Others have good content but, due to heavy competition or other factors, want to perform better in the SERPs.
It is my understanding that having a lot of low-quality links can help a site and cannot harm it. For those who feel otherwise, I would appreciate the opportunity to discuss the topic further. I would love to see any information from Google or Matt Cutts on it.
The term "penalty", often used to describe these links or the site that receives them, usually refers to the loss of link juice from the bad link. I am not aware of any negative effects beyond the discounting of the bad link itself.
The Panda effect, or any drop in ranking, is due to the loss of juice from the offending links, not a penalty from Google. The site still benefited: for a time it had higher rankings and more exposure to the public. During that period the site could have earned additional sales or picked up readers who otherwise might never have seen it. Those additional customers and readers can directly lead to the site ranking higher than it would have if it had never received the "bad" link.
-
Just because you are getting great rankings for some terms - or, better yet, THINK you are getting rankings for great terms due to bad-quality links - does not mean that at some point you won't be harmed by them. And you could be harmed by them right now; or not harmed so much as hindered.
While Google does a questionable job of weeding out sites with lots of bad links, they're not completely helpless. Sites have been known to fall due to an over-abundance of bad links, though it's more that Google will discount those links as a ranking factor before actually penalizing the site.
Related Questions
-
Google indexing is slowing down?
I have up to 20 million unique pages, and so far I've only submitted about 30k of them in my sitemap. We had a few load-related errors during Google's initial visits, and it thought some pages were duplicates, but we fixed all that. We haven't had a crawl-related error for 2 weeks now. Google appears to be indexing fewer and fewer URLs every time it visits. Any ideas why? I am not sure how to get all our pages indexed if it's going to operate like this... would love some help, thanks!
Technical SEO | RyanTheMoz
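A sitemap index is the usual way to get a URL set that large in front of Google: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, so 20 million pages means roughly 400 sitemap files tied together by one index file. A minimal Python sketch of the idea (the example.com URLs and file names are made up for illustration):

```python
MAX_URLS_PER_SITEMAP = 50_000  # sitemaps.org limit per sitemap file

def chunk(urls, size=MAX_URLS_PER_SITEMAP):
    """Split a flat URL list into sitemap-sized batches."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def sitemap_xml(urls):
    """Render one <urlset> sitemap file for a batch of URLs."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

def sitemap_index_xml(sitemap_urls):
    """Render the <sitemapindex> file that references every sitemap file."""
    entries = "\n".join(f"  <sitemap><loc>{u}</loc></sitemap>" for u in sitemap_urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</sitemapindex>"
    )

# Small demo set; 20M pages would yield 400 files of 50k URLs each.
urls = [f"https://example.com/page/{i}" for i in range(120_000)]
files = list(chunk(urls))
index = sitemap_index_xml(
    f"https://example.com/sitemap-{n}.xml" for n in range(len(files))
)
print(len(files))  # 3 sitemap files for 120,000 URLs
```

Submitting the index file alone in Webmaster Tools then registers every referenced sitemap, which also lets you see per-file indexing counts and spot where Google is stalling.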
Is my page being indexed?
To put you all in context, here is the situation: I have pages that are only accessible via an internal search tool that shows the best results for a request. Let's say I want to see the results on page 2; page 2 will have a query string in the URL like this: ?p=2&s=12&lang=1&seed=3688. The situation is that we've disallowed every URL that contains a "?" in the robots.txt file, which means that Google doesn't crawl pages 2, 3, 4 and so on. If a page is only accessible via page 2, do you think Google will be able to access it? The URL of the page is included in the sitemap. Thank you in advance for the help!
Technical SEO | alexrbrg
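An aside for anyone testing rules like this: Python's built-in urllib.robotparser does plain prefix matching and does not understand the `*` wildcard that Google supports, so here is a minimal, hypothetical matcher (not a full robots.txt parser) showing why a `Disallow: /*?` rule catches these paginated URLs:

```python
import re

def google_rule_matches(rule: str, path: str) -> bool:
    """Translate a Google-style robots.txt path rule (supporting the
    '*' wildcard and the '$' end anchor) into a regex and test a path."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"  # '$' at the end anchors the match
    return re.match(pattern, path) is not None

# The rule discussed above: block every URL containing "?"
rule = "/*?"
print(google_rule_matches(rule, "/search?p=2&s=12&lang=1&seed=3688"))  # True
print(google_rule_matches(rule, "/search/page-2"))                     # False
```

Under that rule, a page reachable only through a blocked `?p=2` listing won't be discovered by crawling; the sitemap entry is what gives Google the URL, though sitemap-only URLs with no crawlable internal path tend to be indexed less reliably.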
Can I set a canonical tag to an anchor link?
I have a client who is moving to a one-page website design, so content from the inner pages is being condensed into sections on the 'home' page. There will be a navigation that anchor-links to each relevant section. I am wondering if I should leave the old pages in place and use rel=canonical to point them to their relevant sections on the new 'home' page, rather than 301 them. Thoughts?
Technical SEO | Vizergy
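One thing to keep in mind: crawlers strip the #fragment before fetching, so a canonical pointing at an anchor resolves to the bare page URL anyway. A quick sketch with Python's standard library (hypothetical URLs) illustrates what the crawler actually sees:

```python
from urllib.parse import urldefrag

# A canonical target with an anchor...
canonical_target = "https://example.com/#services"

# ...is seen by a crawler as the URL with the fragment removed.
url, fragment = urldefrag(canonical_target)
print(url)       # https://example.com/
print(fragment)  # services
```

So every old inner page canonicalised to an anchor section would effectively canonicalise to the same home-page URL, which is also what a 301 to the home page achieves; the choice between them is about keeping the old pages live versus retiring them.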
Pages to be indexed in Google
Hi, We have 70K posts on our site, but Google has crawled 500K pages, and the extra pages are category pages or user profile pages. Each category has a page and each user has a page, so with 90K users, Google has indexed 90K user pages alone. My question is: should we leave them as they are, or should we block them from being indexed? We get unwanted landings on these pages and a huge bounce rate. If we need to remove them, what needs to be done - a robots.txt block or noindex/nofollow? Regards
Technical SEO | mtthompsons
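On the robots-block vs. noindex question: a robots.txt Disallow only stops crawling, and URLs can still appear in the index (URL-only) via internal links. To actually keep thin category/user pages out of the index, the pages have to remain crawlable and carry a noindex directive, along these lines (a sketch, not your actual markup):

```html
<!-- In the <head> of each user-profile / category page -->
<meta name="robots" content="noindex, follow">
```

`follow` keeps the internal links on those pages passing value; the equivalent `X-Robots-Tag: noindex` HTTP header does the same job where editing templates is awkward.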
Single URL not indexed
Hi everyone! Some days ago, I noticed that one of our URLs (http://www.access.de/karriereplanung/webinare) is no longer in the Google index. We never had any form of penalty, link warning, etc. Our traffic from Google is growing constantly every month. This single page does not have an external link pointing to it - only internal links. The page had been indexed all along. The HTTP status code is 200, and there is no noindex or anything like that in the code. I submitted the URL in GWMT to have Google send it to the index. It was crawled successfully by Google and sent to the index 5 days ago - nothing happened, still not indexed. Do you have any suggestions as to why this page is no longer indexed? It is well linked internally and one click away from the home page. It still shows a PR of 5; I always thought that pages with PR are indexed...
Technical SEO | accessKellyOCG
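For a page like this it is worth double-checking both the HTTP response headers and the markup for a stray noindex before blaming Google. A minimal, hypothetical checker (regex-based for brevity, not a full HTML parser):

```python
import re

def has_noindex(html: str, headers: dict) -> bool:
    """Return True if a robots meta tag or X-Robots-Tag header says noindex."""
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    # Look for <meta name="robots" content="...noindex...">
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE,
    )
    return bool(meta and "noindex" in meta.group(1).lower())

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page, {}))                                      # True
print(has_noindex("<html></html>", {}))                           # False
print(has_noindex("<html></html>", {"X-Robots-Tag": "noindex"}))  # True
```

The header case matters because an `X-Robots-Tag: noindex` sent by the server never shows up when you view the page source, which makes it an easy culprit to miss.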
No results with Link Analysis
So I have been working with a domain since November last year that still shows no improvement in its link analysis. I am baffled, because we have gotten it onto the first page of Google for a few of the keywords we are optimizing. Any help with this is greatly appreciated; I am a noob, so I'm definitely open to learning. Thanks in advance to all of you. Domain in question: www.modernportablerefrigeration.com. The domain is currently on a shared server, if that makes any difference. Cordially, Todd Richard admin@richfinn.org
Technical SEO | RichFinnSEO
Subdomain mozTrust - can other parked domains affect it?
Hi, my domain www.mydomain.com has domain authority 26, domain mozRank around 3, domain mozTrust 1.63, page authority 31, Google PR 2.0, etc. So I am not at the very bottom of the scores, but my SUBDOMAIN MOZTRUST is only 0.961, and other websites I made some time ago have it at around 4.0, so it is quite bad. I have some domains parked within my hosting package; they have different names, like www.mydomain2.co.uk, www.mydomain3.com, etc. I can also access those domains by typing mydomain2.mydomain.com or mydomain3.mydomain.com, and I have some testing subdomains there as well (in case I need to test something like Drupal, WordPress, a shopping cart, etc.). Can that fact affect my subdomain rank? Because I have those domains parked there, and I've made some subdomains that are not in use, that nobody links to, and that are visible in Google?
Technical SEO | sever3d
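If the concern is those parked names resolving as subdomains of the main site (duplicate hosts in Google's eyes), one common fix is a sitewide 301 from every unwanted hostname to the canonical one. A hypothetical Apache mod_rewrite sketch (the host names stand in for your real ones):

```apache
# .htaccess on the shared host: send any non-canonical hostname,
# including parked-domain subdomains, to the real www host
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]
```

That consolidates whatever trust the stray hosts have collected onto the main subdomain instead of splitting it across hostnames nobody links to.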
I just found something weird I can't explain, so maybe you guys can help me out.
In Google (http://www.google.nl/#hl=nl&q=internet), the number 3 result is a big telecom provider in the Netherlands called Ziggo. The ranking URL is https://www.ziggo.nl/producten/internet/. However, if you click on it you'll be directed to https://www.ziggo.nl/#producten/internet/. HttpFox in FF is not showing any redirects, just a 200 status code. The URL https://www.ziggo.nl/#producten/internet/ contains a hash, so the canonical URL should be https://www.ziggo.nl/. I can understand that. But why is Google showing the title and description of https://www.ziggo.nl/producten/internet/ when the canonical URL clearly is https://www.ziggo.nl/? Can anyone confirm my guess that Google is using the bulk SEO value (link juice/authority) of the homepage at https://www.ziggo.nl/ because of the hash, but the relevant content of https://www.ziggo.nl/producten/internet/, resulting in a top position for the keyword "internet"?
Technical SEO | NEWCRAFT