What are your thoughts on using Dripable, VitaRank, or a similar service to build URL links to dilute a link profile?
-
One of my sites has a very spammy link profile; the top 20 anchors are all money keywords. What are your thoughts on using Dripable, VitaRank, or a similar service to help dilute the link profile by building links with anchors like naked URLs, "Click Here," "More Info," etc.?
I have been building URL links already, but because of the site's age (over 12 years), the number of exact-match anchor text links is very large and would take forever to dilute.
-
Unfortunately, you may have a very hard time removing a lot of those links. They were probably part of a network used to sell links, and the webmasters do little or no upkeep. I also know that some webmasters or companies will ask you for money to remove the links.
Another problem is that you have to do a lot of research to determine, to the best of your ability, which links are actually hurting you. Otherwise, if you take a carpet-bombing approach, you might remove links that are helping you, which will cause more harm than good.
Given your difficult situation, this is what I would do:
-
Article directories: remove all duplicate copies of articles and leave ONE copy on the best site (e.g., EzineArticles). If an article is genuinely high quality, you may even want to remove all copies and publish it on your own site.
-
Try to remove as many spammy forum comments and profile links as you can.
-
Other links (e.g., blog posts) - for each affected keyword/page, find the backlinks to that page. Use both OSE and Webmaster Tools. WEBMASTER TOOLS IS YOUR FRIEND HERE: Google is showing you which links it sees pointing to your site. If you know of a spammy link but Google is not showing it among the links to your site, you are better off focusing your energy elsewhere.
Make a list of these sites and examine each one for: spammy, gibberish content; excessive outbound links; its own backlink profile (hundreds of backlinks with spammy anchors like "viagra" or "payday loans" are a red flag that the site is likely hurting you); and links to pills, gambling, or porn sites.
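If your backlink export is large, a quick script can surface the obvious offenders before you eyeball each site. A minimal sketch, assuming a CSV export (both OSE and Webmaster Tools let you download one) with hypothetical source_url and anchor_text columns - rename them to match whatever your tool actually exports:

```python
import csv

# Hypothetical red-flag terms; extend this list with whatever your audit turns up.
RED_FLAGS = ["viagra", "payday loan", "casino", "gambling", "porn", "pills"]

def flag_spammy_links(export_path):
    """Print backlinks whose source URL or anchor text contains a red-flag term."""
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            haystack = (row["source_url"] + " " + row["anchor_text"]).lower()
            if any(term in haystack for term in RED_FLAGS):
                print(f'{row["source_url"]} -> "{row["anchor_text"]}"')

flag_spammy_links("backlinks.csv")  # assumed filename
```

Anything it flags still needs a human look; a script can't tell a helping link from a hurting one.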
Approach the sites via their contact form, email address, and WHOIS contact. Document ALL correspondence and cleanup attempts.
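For that documentation, even a simple CSV log is enough, as long as every attempt is in it. A sketch (the file name and fields are just suggestions):

```python
import csv
from datetime import date

def log_attempt(log_path, linking_url, contact_method, outcome):
    """Append one row per removal attempt -- this log becomes your evidence
    of good-faith effort when you file the reconsideration request."""
    with open(log_path, "a", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow(
            [date.today().isoformat(), linking_url, contact_method, outcome])

log_attempt("removal_log.csv", "http://example-spam-site.com/links.html",
            "whois email", "asked for money to remove")
```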
Once you feel you have done all you can, send Google another reconsideration request with as much documentation as you can provide (be thorough). Specifically, tell them which links you have made good-faith attempts to remove without success.
Hopefully, they will lift the penalty or let it expire.
-
-
What you seem to be talking about is a reconsideration request. If he has not gotten a message from Google warning about unnatural links, it is probably a bad idea to file one.
Read this: http://searchengineland.com/penguin-update-recovery-tips-advice-119650
-
Thanks for the response; that is what I thought too, but I don't know what else to do. The top 20 anchors are exact-match keywords; the first generic or URL anchor doesn't appear until about position 21-25, depending on which tool I use to check the link profile. I need a large number of URL links in different variations (mysite.com, www.mysite.com, http://www.mysite.com), plus generic anchors like "click here," etc. I have already removed every link I could: I contacted hundreds of webmasters, most successfully; some never answered. I documented all of this and sent in a reconsideration request, and Google replied that they STILL saw unnatural links. What would you do in a situation like this? I'm running out of ideas and would appreciate some good advice.
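For reference, here is roughly how I've been checking the distribution from those CSV exports - a rough Python sketch (the anchor column name is an assumption; each tool labels it differently):

```python
import csv
from collections import Counter

def anchor_distribution(export_path, anchor_column="anchor_text"):
    """Tally anchors from a backlink export and show each one's share,
    so the top-heaviness of the profile is obvious at a glance."""
    anchors = Counter()
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchors[row[anchor_column].strip().lower()] += 1
    total = sum(anchors.values())
    for anchor, count in anchors.most_common(25):
        print(f"{anchor!r}: {count} ({100 * count / total:.1f}%)")

anchor_distribution("backlinks.csv")  # assumed export filename
```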
-
Thanks for the response. I guess I was hoping someone had used a service like the ones I mentioned and had some tips or experiences to share. I realize that I should not be spamming, but I really don't think I can build enough links to dilute the link profile without a service like this, and the site is completely out of Google's top 50 for all the money keywords and supporting keywords. Traffic is down from 1,500 visitors a day to about 200, so I need to get some traffic back ASAP. I don't have much time left to wait for signs of recovery.
-
"What are your thoughts on using Dripable, VitaRank, or a similar service to help dilute the link profile by building links with URLs, Click Here, More Info, etc.?"
I hope that all of my competitors are using this stuff. I think it is just like dipping a crappy link profile in really cheap chocolate and telling Google it's a Snickers bar. It will not take them long to find out.
-
Joel,
Thanks for the response. We used to rank very well in a highly competitive vertical, and all the websites that used to compete with us are in the same position; the industry was so competitive that everyone who ranked was running AGGRESSIVE SEO campaigns - not necessarily black hat, but very aggressive white/grey hat. We were probably also guilty of some on-page grey-hat techniques, which have since been fixed. I believe we are being penalized for excessive anchor text optimization, and I'm hoping some generic or URL links might dilute our anchor text distribution enough to get below the exact-match percentage limit and out from under the penalty. Does this make any sense to you?
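To put numbers on the dilution idea: if E of my T links use exact-match anchors, adding N generic/URL links drops the exact-match share to E / (T + N), so getting under a target share p takes N >= E/p - T new links. A quick sketch with made-up figures (nobody outside Google knows the real threshold, if there even is a fixed one):

```python
import math

def generic_links_needed(exact_match, total, target_share):
    """Smallest N with exact_match / (total + N) <= target_share."""
    return max(0, math.ceil(exact_match / target_share - total))

# Illustrative figures only -- the 15% target is an assumption, not a known limit.
print(generic_links_needed(exact_match=600, total=1000, target_share=0.15))
# -> 3000 new generic links, which shows how impractical pure dilution can get.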
-
What I would do first is determine whether the spammy links have actually had an impact on your SERPs. You can have a crazy number of low-PR links, but if you have some golden links in there, you should be fine; given your domain age and any high-quality links you hold, Google is most likely already discounting the spammy ones. If you confirm that those links are causing you harm (perform a site audit), you can use Google's Webmaster Tools to send Google a message saying you have attempted to have the links removed. Obviously, you should actually attempt to remove them and keep a log. Hope this helps. Good luck.
-
That might solve part of your problem, but unless you know that excessive use of targeted anchor text is your ONLY problem, it might not be a good idea. If you have a "spammy profile," it is likely you also have links from spammy sites, and going out and spamming some more doesn't sound like a good long-term solution.
Even if anchor text were your only problem, you might create new problems by aggressively chasing "easy links."
Find the keywords that were hit hardest and actually delivered traffic. If you can identify the worst links for each affected page/keyword (i.e., spammy sites linking to porn, pills, or gambling), remove them to the best of your ability. Then add some useful content to the page itself, and promote that content to related sites. It will take time, but it is the right answer.
Good luck!
Related Questions
-
Mass URL changes and redirecting old URLs to the new ones: what is the SEO risk, and what are the best practices?
Hello good people of the Moz community, I am looking to do a mass edit of the URLs on content pages within our sites. The way these were initially set up, a few years ago, was to make each URL unique by including the date, which can now make evergreen content seem dated. The new URLs would follow a better folder-path naming convention and would be much better overall. Some examples of the old URLs:
https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Kids-Inline-Skates/buying-guide-11-13-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Inline-Hockey-Skates/buying-guide-9-3-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Aggressive-Skates/buying-guide-7-19-2012,default,pg.html
The new URLs would look like this, which would be a great improvement:
https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Kids-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Hockey-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Aggressive-Skates,default,pg.html
My worry is that we rank fairly well organically for some of this content, and I don't want to anger the Google machine. The process would be to edit the URLs to the new layout, set up the redirects, and push live. Is there a great SEO risk in doing this?
Is there a way to do a mass "Fetch as Googlebot" to reindex these, say 50 a day? I only see the ability to do one URL at a time in the Webmaster Tools backend.
Is there anything else I am missing? I believe this change would be good in the long run, but I do not want to take a huge hit initially by doing something incorrectly. This would be done on anywhere from five to a couple hundred URLs across the various sites I manage. Thanks in advance,
Chris Gorski
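For what it's worth, the old-to-new mapping here is deterministic (the first path segment carries the topic and the dated segment is dropped), so the 301 rules can be generated rather than hand-written. A rough Python sketch; it emits Apache mod_alias rules, which is an assumption - adjust to whatever redirect format your platform actually uses:

```python
from urllib.parse import urlparse

def new_url(old_url):
    """Map an old dated URL to its /Learn/ equivalent, per the pattern above:
    keep the first path segment (the topic), drop the dated second segment."""
    topic = urlparse(old_url).path.strip("/").split("/")[0]
    return f"https://www.inlineskates.com/Learn/{topic},default,pg.html"

old_urls = [
    "https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html",
    "https://www.inlineskates.com/Buying-Guide-for-Kids-Inline-Skates/buying-guide-11-13-2012,default,pg.html",
]
for old in old_urls:
    # One permanent (301) redirect per page: old path -> new URL.
    print(f"Redirect permanent {urlparse(old).path} {new_url(old)}")
```
-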
Using a canonical URL to point to an external page
I was wondering if I can use a canonical URL that points to a page residing on an external site. So a page like www.site1.com/whatever.html would have a canonical link in its header pointing to www.site2.com/whatever.html. Thanks.
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi guys, we have developed a plugin that allows us to display used-vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings pages: where the user applies various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details pages: where the user actually views the details of a given vehicle. These are served up via Ajax, in a dialog box on the Vehicle Listings page. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and ranking. These pages have additional content besides the vehicle listings themselves, the results are randomized or sliced/diced in different and unique ways, and they're updated twice per day.
We do not want #2, the Vehicle Details pages, indexed, as these pages appear and disappear all the time based on dealer inventory and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content; entering a snippet of dealer-provided content from one specific listing that Google indexed yielded 8,200+ results (example Google query). We did not originally think Google would even be able to index these pages, since they are served via Ajax, but it seems we were wrong: Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant to be navigated to directly! A user landing on one of these URLs from the SERPs sees a page that isn't styled right.
Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages: super easy to implement; conserves crawl budget for large sites; ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem sensible to make Googlebot crawl all of them.
Robots.txt disadvantages: doesn't prevent the pages from being indexed, as we've seen, probably because of internal links to them. We could nofollow those internal links to minimize indexation, but that would mean 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).
Noindex advantages: does prevent the vehicle details pages from being indexed; allows ALL pages to be crawled (an advantage?).
Noindex disadvantages: difficult to implement. The vehicle details pages are served via Ajax, so they have no <head> tag; the solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex header based on querystring variables, similar to a Stack Overflow solution I found. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it). It also forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages; I say "forces" because of the crawl budget required, and the crawler could get stuck/lost in so many pages and may not like crawling a site where 99.9% of 1,000,000,000 pages are noindexed. Finally, it cannot be used in conjunction with robots.txt, since the crawler never reads the noindex tag on a page blocked by robots.txt.
Hash (#) URL advantages: by using hash (#) hrefs, coupled with JavaScript, for the links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), the crawler can't follow/crawl those links. Best of both worlds: the crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links that were getting robots.txt-disallowed pages indexed are gone. It accomplishes the same thing as nofollowing those links but without looking like PageRank sculpting (?), and it doesn't require complex Apache configuration.
Hash (#) URL disadvantages: is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt - the "sledgehammer solution." We figured we'd have a happier crawler this way, since it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be as if these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably via the internal links pointing to them. We could nofollow those links, but we don't want it to look like PageRank sculpting. If we implement noindex on these pages (itself a difficult task), we can be certain they aren't indexed, but to do so we would have to remove the robots.txt disallow so that the crawler can read the noindex tag. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages that are all noindexed; it could easily get stuck or lost, it seems like a waste of resources, and in some shadowy way it feels bad for SEO.
My developers are pushing for the third solution, the hash URLs: it works on all hosts and keeps all functionality self-contained in the plugin (unlike noindex), and it conserves crawl budget while keeping the vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links structured this way.
Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles on this for a couple of days now. I can also provide a test site URL if you'd like to see the functionality in action.
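For what it's worth, the querystring-based X-Robots-Tag idea is only a few lines in most frameworks. Here is the logic sketched in Python/Flask - the plugin itself is presumably not Python, and the "vehicleId" parameter name is made up; this just shows the shape of the approach:

```python
from flask import Flask, request

app = Flask(__name__)

@app.after_request
def noindex_vehicle_details(response):
    # If the request looks like a Vehicle Details view (flagged here by a
    # hypothetical "vehicleId" querystring parameter), send the noindex
    # header. This works even though the Ajax response has no <head> in
    # which to place a robots meta tag.
    if "vehicleId" in request.args:
        response.headers["X-Robots-Tag"] = "noindex"
    return response
```

Note this only helps if the pages are not disallowed in robots.txt; a blocked crawler never sees the header, which is exactly the trade-off described above.
-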
Dealing with Penguin: Changing URL instead of removing links
I have some links pointing to categories from article directories, web directories, and a few blogs. We are talking about 20-30 links in total, and they are less than 5% of the links to my site (counting unique domains). I either haven't been able to make contact with the webmasters, or they are asking for money to remove the links. If I simply rename the URL (for example, changing mysite.com/t-shirt.html to mysite.com/tshirts.html), will that resolve any Penguin issues? The old URL will forward to the homepage, since that page no longer exists. I really want to avoid using the disavow tool if possible. I appreciate the feedback. If you have actually done this, please share your experience.
-
How fast is too fast in building quality links?
So I'm working on a brand-new project and want to go after my competition rather aggressively. I have a long list of industry resources, including blogs, articles, and directories, and several news/media/press-release-type sites that could link to me in a very short amount of time. How fast is too fast? Is there a penalty for getting links too fast if they are all legit?
-
Can obfuscated JavaScript be used for too many links on a page?
Hi Mozzers, I'm just looking for opinions on whether it is ever appropriate to use obfuscated JavaScript for links when a page has many links that need to be there for usability. It seems grey/black hat to me, as it shows users something different from what Google sees (alarm bells are sounding already!), BUT if the page has that many links, it's leaking juice that could be saved... Any thoughts appreciated, thanks.
-
Footer Link
Hello, some of my hosted clients don't mind if I put a footer link at the bottom of their websites. I would like a footer link that looks like SEOmoz's - http://imgur.com/GrC8y - basically: Powered by "my company name". The world's #1 "keyword" provider (logo goes here). Here are my questions:
1. Would this hurt or help my rankings?
2. Should the logo be hosted by my clients, so that a different IP hosts my logo (and my image name gets picked up)? Or is it best to host it myself?
3. If both my company name and the keyword are linked, is that one link too many?
4. Is it a good idea to use a different keyword on each site so that other keywords get picked up by the SERPs, or should I stick with one keyword?
Thank you so much! Shawn
-
Would the use of the title attribute on links help?
Hi, I am wondering what your thoughts are, from an SEO perspective, on the following situation. I have a "travel" website, and obviously as part of that I have a whole list of destinations. I have a drop-down in my page navigation which lists all my destinations. At the moment I see two main options for displaying the list:
1/. Perfect anchors, but not good for usability - i.e., repeating the word "Holidays" in a list of 100 destinations looks spammy for one, and when the headline already says "Holiday Destinations," it's pretty pointless from a user perspective and takes away from the navigation rather than improving it:
New York Holidays
Las Vegas Holidays
2/. Non-perfect anchors - but better for usability:
New York
Las Vegas
So I am thinking: would the use of the title attribute provide a perfect solution? Or am I wasting my time and is it pointless to even consider it as an option? E.g., what I had in mind was:
3/. Ideal solution for both SEO and usability?? The short visible text, but with a descriptive title attribute on each link (e.g., title="New York Holidays"):
New York
Las Vegas
Thanks for your help in advance.