Too many nofollowed blog comments with exact anchor text
-
Back in my dumb days, I decided to use Fiverr to get 25 backlinks from .edu sites. Well, they were all nofollowed, and they share space with hundreds of other spam links on the same pages. To top it off, all of the spam links pointing to my site use exact-match anchor text: embroidered patches.
If you look at my link profile in Open Site Explorer, it looks polluted with these. I'm just looking for post-Penguin opinions on whether this has the potential to hurt.
Since Penguin, I have moved into the #1 position for the keyword embroidered patches, but I'm still worried that future algorithm tweaks will target this blog comment spam.
What do you think?
-
Since Penguin, I have moved to the #1 position for the KW embroidered patches
Nice work! WooHoo!
-
That's a good point about sitewide links. I have 12,900 links from one site, so that must be a sitewide link. The good news is that the rest of my link profile is healthy, with a lot of anchor text variation. Maybe that's why I haven't been hurt by this one bad spot.
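A quick way to spot sitewide links like that one is to group a link export by linking domain. This is just a sketch: the `URL` column name is an assumption, so adjust it to whatever header your OSE CSV export actually uses.

```python
import csv
from collections import Counter
from urllib.parse import urlparse

def domains_by_link_count(csv_path, url_column="URL"):
    """Count backlinks per linking host from a link-export CSV.

    url_column is assumed to hold the linking page's URL; rename it
    to match your actual export.
    """
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            host = urlparse(row[url_column]).netloc.lower()
            counts[host] += 1
    return counts
```

Any single domain with thousands of rows (like the 12,900-link site above) is almost certainly a sitewide footer or sidebar link rather than 12,900 independent endorsements.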
-
"it would be a good time to build some natural links to counterweight the existing ones."
Yes, exactly, and use completely different anchor text; stay away from any variation of "embroidered patches."
Look at some of the sites linking to you. If there are easy, quick wins, like a sitewide link on a bad site with a large index that lists webmaster contact info, ask to get it taken down. Sometimes offering $20 via PayPal helps get a webmaster's attention.
-
Black and white zoo animals, funny. Those are really my only bad links; it's just that they're on high-authority sites, so they dominate the first two pages of my link profile and make it look worse than it is. The rest of the links are fine, so I guess I won't worry too much.
-
This could potentially cause your website to suffer, but it's all about ratio and balance. If the majority of your website's link profile is made up of these types of links, it would be a good time to build some natural links to counterweight the existing ones.
As long as the positive metrics outweigh the negatives, you should generally be fine. Building some links with varying anchor text to the same page should help ensure that your website doesn't get hit further down the line by one of Google's black-and-white zoo animals (Panda and Penguin).
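If you want to put a number on that ratio, you can compute each anchor text's share of the overall profile from the same link export. Again a sketch, not a definitive tool: the `Anchor Text` column name is an assumption about the export format.

```python
import csv
from collections import Counter

def anchor_text_ratio(csv_path, anchor_column="Anchor Text"):
    """Return each anchor text's share (0.0-1.0) of the total link count.

    anchor_column is an assumed header name; match it to your export.
    """
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Normalize case/whitespace so "Embroidered Patches" and
            # "embroidered patches" count as the same anchor.
            counts[row[anchor_column].strip().lower()] += 1
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.items()}
```

There's no published safe threshold, but if one exact-match commercial anchor accounts for most of the profile while branded and generic anchors barely register, that's the kind of skew the advice above is warning about.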