Link Research Tools
-
Is anyone else here a user of Link Research Tools?
I recently completed a Link Detox for my sites. However, it is saying that links from high-quality press release sites are 'deadly' and should be removed. It is also saying the same about the links from the Yellow Pages.
Obviously I know these tools are automated, but does anyone know why they are showing these links as 'deadly' and should be removed?
I have tried contacting LRT about this issue but have yet to receive a reply.
-
I think it should be used just as a tool to gather important data, but you must review each link one by one. I recently received a penalty from Google and used Link Detox. It was useful for getting the data and other information, like sitewide links, same-IP C-blocks and so on, but you need to check each link and make a decision about it yourself.
Use the "Ultimate Guide to Google Penalty Removal" that you can find here on Moz to decide which links are good for you and which are bad: http://moz.com/blog/ultimate-guide-to-google-penalty-removal
It was useful for me.
I am still working on removing my penalty. The first reconsideration request was submitted based mostly on Link Detox information, and it didn't work; we are now taking a more manual, human approach. Hope you get rid of that penalty!
-
It's a personal choice. Yes, Google doesn't like directories, but they can still work, especially if they are relevant. Changing to a nofollow would fix any issue, but knowing the cost behind it, I'd want some bang for my buck and keep it followed if it's relevant and working for you. As I said, though, everyone is different; the bigger directories like Yell, Yahoo, etc. are normally fine in Google's eyes. I'll let you make your own mind up.
Best of luck!
-
Thanks for your response.
For example, the link we have on Yell is flagged as a 'Deadly Risk' because it looks 'highly unnatural'. From reviewing all our links, I think that LRT classifies all links from directories as 'Deadly'. However, we receive a lot of enquiries from Yell, so it would be silly to remove it. Perhaps it should be changed to a nofollow?
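For reference, switching a link to nofollow is a one-attribute change in the anchor tag. A minimal sketch (the URL and anchor text below are placeholders, not your actual Yell listing):

```html
<!-- Followed link: passes link equity, and is what link tools evaluate -->
<a href="https://example.com/your-listing">Your Business Name</a>

<!-- Nofollow link: still sends referral traffic and enquiries,
     but asks search engines not to count it for ranking purposes -->
<a href="https://example.com/your-listing" rel="nofollow">Your Business Name</a>
```

Note that the markup lives on the linking site (Yell, in this case), so you would need to ask them to make the change; if they won't, disavowing the link on your end is the alternative.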
-
Link Detox has a column (I think it's called Rules) that tells you why a link has been flagged (e.g. the algorithm picked it up, it's a holding page, etc.). A link can be flagged for more than one reason, but as you mentioned, the tool is automated and shouldn't be taken at face value.
Link Detox can be a great help, but you will still want to go through and manually review the links, as they won't all be "toxic"; the tool even mentions on export that the links may not all be harmful, so don't worry too much.