Removing links - Best practice
-
Hi
I have noticed in Google Webmaster Tools that I have a lot of links to my sites from link-building directories (e.g. linkspurt.com). Either I did this many years ago or they have somehow linked to me on their own.
Would links from link-building directories harm my site?
I have quite a few of them and I'm just wondering what to do. I also have some customer sites that are massive; one site sends 38,000 links to my site because I put a credit saying I built the site, with a link back to mine. That site has a low score in Google; would this also harm my site?
Any advice would be appreciated.
-
Google has said that the disavow tool should be the last step, after you've tried all other methods to get the link removed, rather than the first step. I'd do as Irving suggested for links that you do want to remove.
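If you do end up disavowing, the file itself is just a plain text file you upload through the disavow tool: one URL or `domain:` entry per line, with `#` for comments. A sketch with placeholder domains:

```text
# Sketch of a disavow file (placeholder domains).
# Lines beginning with # are ignored by Google.

# Disavow all links from an entire domain:
domain:linkspurt.com
domain:spammy-directory.example

# Or disavow links from a single page only:
http://directory.example/listings/my-site.html
```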
-
If I use the Google disavow tool on some of these links, would that be enough?
-
Definitely remove the sitewide links as soon as possible. That "footer designed by" credit looks like a sponsored theme link to Google.
I would reach out to the directory sites via their contact form and ask them to remove your site, tell them you were penalized. I have found directory sites to not be too bad in terms of responsiveness.
Related Questions
-
Worried About Broken Links
In WordPress, I'm using a plugin called Broken Link Checker to check for broken links. Should I be worried about (or spend time fixing) outbound links that result in 403 Forbidden, Server Not Found, Timeout, 500 Internal Server Error, etc.? Thanks for your help! Mike
Technical SEO | naturalsociety
-
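As a rough way to think about those statuses, transient server-side errors (timeouts, 500s) are often worth rechecking before spending time on them, while hard errors are more likely permanent. A hypothetical sketch of that triage logic (the status labels are illustrative, not the plugin's actual export format):

```python
# Sketch: triage outbound-link check results. The status labels are
# hypothetical strings, not Broken Link Checker's actual export format.

def triage(status):
    """Return 'fix' for errors likely to be permanent, 'recheck' for
    transient or ambiguous ones, and 'ok' otherwise."""
    permanent = {"404 Not Found", "410 Gone", "Server Not Found"}
    transient = {"500 Internal Server Error", "503 Service Unavailable",
                 "Timeout", "403 Forbidden"}  # 403s are often just bot-blocking
    if status in permanent:
        return "fix"
    if status in transient:
        return "recheck"
    return "ok"

results = ["404 Not Found", "Timeout", "200 OK"]
print([triage(s) for s in results])  # ['fix', 'recheck', 'ok']
```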
Internal no follow links
I have just discovered that the WordPress theme I have been using for some time nofollows internal links on the blog. Simply put, each post has an image link and a text link, plus a "Read more" link. The "Read more" is nofollowed, including on my homepage. The developer is saying duplicate followed links are worse than an internal nofollow. What is your opinion on this? Should I spend time removing the nofollow?
Technical SEO | Libra_Photographic
-
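To audit which internal links a theme is actually nofollowing, you can scan the rendered HTML. A minimal sketch using only the Python standard library (the sample markup is hypothetical):

```python
# Sketch: list nofollowed links in a page's HTML using only the standard
# library, so you can see exactly which internal links a theme nofollows.
from html.parser import HTMLParser

class NofollowFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower()
        if "nofollow" in rel:
            self.nofollowed.append(attrs.get("href"))

html = '<a href="/post-1" rel="nofollow">Read more</a> <a href="/post-1">Post 1</a>'
finder = NofollowFinder()
finder.feed(html)
print(finder.nofollowed)  # ['/post-1']
```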
Too Many Links?
Search term is "Indianapolis wedding photographers". Site is http://www.tallandsmallphotography.com/ Their metrics are through the roof compared to everyone else's, yet they've dropped from 27 in May to 40 now, with an "A" grade on on-site optimization. Either there are too many links, or there are some bad links involved... I don't know which it is...
Technical SEO | WilliamBay
-
Is it possible to export Inbound Links in a CSV file categorized by Linking Root Domains ?
Hi, I am performing an analysis of the total inbound links to my homepage, and I would like to have them categorized by linking root domain. For example, Open Site Explorer shows you the linking root domains for your page; when you click on a linking root domain, it also shows you the top linking pages (i.e. all the pages on that root domain that link to your page). I would like to export this data to a CSV file, but Open Site Explorer only exports the total number of linking root domains. Does anyone have a solution to this problem? Thank you very much for the help in advance!
Technical SEO | Feweb
-
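If you can export a flat list of linking URLs (rather than just the root-domain totals), grouping them by root domain yourself is straightforward. A sketch using only the Python standard library; the naive "last two labels" rule for the root domain is an assumption and ignores suffixes like .co.uk (a real tool would use the Public Suffix List):

```python
# Sketch: group a flat list of linking URLs by root domain and write a CSV.
# Assumes you already have the full link list exported; the URLs below
# are placeholders.
import csv
from collections import defaultdict
from urllib.parse import urlparse

def root_domain(url):
    """Naive root-domain extraction: last two dot-separated labels."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

links = [
    "http://blog.example.com/post-about-us",
    "http://www.example.com/resources",
    "http://another-site.org/links.html",
]

by_domain = defaultdict(list)
for url in links:
    by_domain[root_domain(url)].append(url)

with open("links_by_domain.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["root_domain", "linking_page"])
    for domain in sorted(by_domain):
        for url in by_domain[domain]:
            writer.writerow([domain, url])

print(dict(by_domain))
```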
Auto-loading content via AJAX - best practices
We have an ecommerce website and I'm looking at replacing the pagination on our category pages with functionality that auto-loads products as the user scrolls. A number of big websites do this; MyFonts and Kickstarter are two that spring to mind. Obviously, if we load the content in via AJAX, search engine spiders won't be able to crawl our categories the way they can now. I'm wondering what the best way around this is. Some ideas that spring to mind: detect the user agent and, if the visitor is a spider, show them the old-style pagination instead of the AJAX version; or make sure we submit an updated Google sitemap every day (I'm not sure if this is a reasonable substitute for Google being able to properly crawl our site). Are there any best practices surrounding this approach to pagination? Surely the bigger sites that do this must have dealt with these issues? Any advice would be much appreciated!
Technical SEO | paul.younghusband
-
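On the sitemap idea from the question above: a sitemap can list every paginated category URL so a crawler can still discover products, though it is generally not treated as a full substitute for crawlable on-page links. A sketch using only the Python standard library (the URL pattern and page counts are hypothetical):

```python
# Sketch: generate an XML sitemap covering paginated category URLs, so
# crawlers can still reach every product list page even if the on-page
# pagination is AJAX-only. Domain and categories are placeholders.
from xml.etree.ElementTree import Element, SubElement, tostring

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
categories = {"shirts": 3, "hats": 2}  # category -> number of pages

urlset = Element("urlset", xmlns=NS)
for category, pages in sorted(categories.items()):
    for page in range(1, pages + 1):
        url = SubElement(urlset, "url")
        loc = SubElement(url, "loc")
        loc.text = f"https://www.example.com/{category}?page={page}"

xml = tostring(urlset, encoding="unicode")
print(xml[:80])
```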
Remove Links or 301
Howdy guys, Our main site has been hit pretty hard by Penguin and we are wondering what steps to take next. For the past two months we have been working through our backlink profile removing spammy/unnatural links, and we have documented everything in a spreadsheet. We recently submitted a reconsideration request to Google, and they have now responded saying we still have bad links. I'm wondering: would it be easier just to 301 redirect our site to another TLD we own? Or do we keep working through our links one by one and removing them? Has anyone had any success with 301ing? Thanks, Scott
Technical SEO | ScottBaxterWW
-
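On the 301 mechanics: be aware that a 301 is widely understood to pass penalties along with the links, so redirecting to a fresh TLD is not a guaranteed escape from Penguin. If you do redirect, a sitewide 301 in an Apache .htaccess might look like this (domain names are placeholders):

```apache
# Sketch: sitewide 301 from the old domain to the new TLD (Apache .htaccess).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.net/$1 [R=301,L]
```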
Cloaking? Best Practices Crawling Content Behind Login Box
Hi, I'm helping out a client who publishes sale information (fashion sales etc.). In order to view the sale details (date, percentage off, etc.), visitors need to register for the site. If I allow Googlebot to crawl the content (identified by user agent) but serve a registration lightbox to anyone who isn't Google, would this be considered cloaking? Does anyone know what the best practice for this is? Any help would be greatly appreciated. Thank you, Nopadon
Technical SEO | nopadon
-
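For what it's worth, the user-agent check the question describes is trivial to sketch, but UA strings are easily spoofed; Google's own guidance is to verify Googlebot via reverse DNS, and serving different content purely by user agent is exactly the pattern that risks being treated as cloaking. A minimal illustration in Python (the token list is an illustrative assumption, not a complete bot registry):

```python
# Sketch: naive user-agent check for deciding whether to skip the
# registration lightbox. UA strings are spoofable; this is NOT a
# substitute for reverse-DNS verification of Googlebot.
BOT_TOKENS = ("googlebot", "bingbot", "slurp")

def is_known_bot(user_agent):
    """True if the UA string contains a known crawler token."""
    ua = (user_agent or "").lower()
    return any(token in ua for token in BOT_TOKENS)

print(is_known_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # True
print(is_known_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # False
```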
External Linking & Your sites Link juice
Hey guys, quick question: does a page lose link juice when it gives link juice? If I link to an outside site, do I lose that same amount of link juice, or is it just applied to their site without being removed from mine? I understand that linking to a competitor can help them and hurt me (if they are then seen as more relevant than me to Google), but does it directly reduce my page's link juice? Hope this all makes sense. Thanks
Technical SEO | SheffieldMarketing