How do I know if my SEO person is creating solid links vs spammy links?
-
Please see question
-
Some good suggestions above: try some backlink checking tools, check Domain Authority, etc. However, in my opinion, the best way to ensure your SEO person is building good links is to learn the basic difference between a good and a bad link and actually check them yourself. The bigger your site and the more links you build, the less feasible this is, but the principle still applies: you should be able to look at the links being built and understand which are good and which are bad. Obviously if you are building massive numbers of links this is difficult (although there are tools that can help), but if your SEO employee (I assume it is singular) is building good links, they shouldn't be building massive numbers of them anyway, unless they are coming organically, through content or a product so popular that high-quality links appear without traditional link building. Also, how are you measuring success? Ranking growth? Number of links? Quality of links? Ask your SEO person to report on the links being built, including measures like Domain Authority and Page Authority, and then audit the links periodically yourself. You'll start to learn enough about SEO to measure their performance on your own. (Seriously, try Googling "audit my backlinks"; there are some great tools out there, as well as reasonably simple explanations of the major things to look out for.)
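To make the "check them yourself" advice concrete, here is a minimal sketch of that kind of manual audit. Everything in it is a hypothetical illustration: the link records, the `domain_authority` field, the TLD watch-list, and the thresholds are made up for the example, not real data or cutoffs from any backlink tool.

```python
# Illustrative red-flag heuristics for a backlink audit. In practice you'd
# export the link list from a backlink tool rather than hard-coding it.
SPAMMY_TLDS = {".xyz", ".info", ".biz"}  # example watch-list, adjust to taste

def flag_link(link):
    """Return a list of reasons a link looks suspect (empty list = looks fine)."""
    reasons = []
    if link["domain_authority"] < 20:  # illustrative threshold, not a standard
        reasons.append("low domain authority")
    if any(link["url"].endswith(tld) or tld + "/" in link["url"]
           for tld in SPAMMY_TLDS):
        reasons.append("suspicious TLD")
    if link["exact_match_anchor"]:
        reasons.append("exact-match anchor text")
    return reasons

links = [
    {"url": "https://example-blog.com/review", "domain_authority": 45,
     "exact_match_anchor": False},
    {"url": "https://cheap-links.xyz/page", "domain_authority": 5,
     "exact_match_anchor": True},
]

for link in links:
    reasons = flag_link(link)
    verdict = "REVIEW: " + ", ".join(reasons) if reasons else "looks fine"
    print(link["url"], "->", verdict)
```

Even a rough checklist like this, applied by hand to a sample of the monthly link report, will teach you quickly what your SEO person is actually building.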
I also agree with those mentioning that outsourcing SEO is a dangerous (if somewhat necessary) strategy. In my opinion, learning SEO basics is one of the most valuable things a small business owner can do, since it will both improve your ability to market online and protect you against hiring a bad employee.
-
SEO is too important for the small business owner to outsource it to anyone. Learn to do SEO yourself and you won't have to worry about all these shady practitioners.
-
I've never used LinkDetox, which trung.ngo mentions below, but if they have a free version where you can just see whether your backlink profile looks spammy to them, at least you'd have one opinion on the matter. How many links are you looking to have reviewed?
-
You can hire someone, but you need to trust that they'll do a good job reviewing.
Have you asked your current SEO for a list of links that have been built?
-
You can check out http://www.linkdetox.com/. It's a link auditing tool that will, at least at a high level, provide some information about whether there are spammy links pointing to your site. I'd recommend manually reviewing the "toxic" links it reports, though, to determine whether they're actually spammy or not.
-
Is there a third party that can review the links for me?
-
This all depends on your purpose for SEO. Are you trying to rank well, or are you trying to draw referral traffic through these links? Personally, I would shoot for the latter. Once you have your purpose down, you should be able to work with your SEO and have them be totally transparent with you about the links they are building for you. If they aren't transparent, or they give you excuses as to why they can't show you the links they have built, that should be a red flag for you.
As for determining whether a link is quality or not, that really depends on whose eye is on it. I like to take a look at the websites my links are on and determine, first, whether the site is real, then ask myself whether it's the type of site the people I care about are on. That's not to say I don't have a few links on random sites that aren't necessarily spammy but aren't really that quality either. What really matters is that you have a variety of links to your site.
It's OK to have a bunch of semi-quality links to your site; just make sure you have more quality links that actually generate traffic and eyeballs. These are the links that are going to get you visitors and get you bumped up in the rankings. Just keep a healthy diet of various links across the web. I hope this helps.
-
The first question I'd ask is: where are you getting links from? If the sites are not relevant to your business, or the article/page in which the link exists is not relevant to your business, I would say it's time to reevaluate your relationship with said consultant. I would also ask the SEO whether they're requesting specific anchor text. I'd opt for no specific anchor text requests to keep the links more editorial in nature; having too much specific anchor text can get you in trouble with algorithm filters like Penguin.
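To see the anchor-text point in practice, here's a quick sketch of eyeballing the anchor distribution across a link profile. The anchor list is a made-up example; in practice you'd export the anchors from a backlink tool.

```python
from collections import Counter

# Hypothetical anchor texts for one site's backlinks.
anchors = [
    "Acme Plumbing",         # brand
    "acmeplumbing.com",      # naked URL
    "click here",            # generic
    "best plumber chicago",  # exact-match money keyword
    "Acme Plumbing",
]

counts = Counter(a.lower() for a in anchors)
total = len(anchors)
for anchor, n in counts.most_common():
    print(f"{anchor!r}: {n}/{total} ({100 * n / total:.0f}%)")
# A profile dominated by one exact-match money phrase is exactly the kind of
# pattern that filters like Penguin are built to catch; a natural profile
# leans toward brand, URL, and generic anchors.
```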
Hope that helps you get started in evaluating your links!
-Trung
Related Questions
-
Links to external site (hotels link)
Hello, I am currently designing the pages of my website and I am wondering if I should link externally or if it is going to hurt me. I am in the travel industry and, for example, in France in the Loire Valley, I want to list hotels that people can stay at pre- and post-trip. Is it OK to link to maybe 10 of those hotels' websites, or can it hurt me? Thank you,
Intermediate & Advanced SEO | seoanalytics0
-
Link Resolvers, Academic Publishing, and SEO Visibility
I have recently started working at an academic publisher on their digital products. In this industry it's standard practice to use link resolvers - such as SFX from ExLibris - when updating a product as an easy way to manage URL migration. However, these link resolvers appear to use 302 redirects which makes me concerned about the potential for rankings to drop. Does anybody out there know about the use of link resolvers and their effects on search engine visibility? The main sources of information I've been able to find so far have been a Google Webmaster Central forum post and a piece on DOI news from 2005. Any information that's more up to date would be very useful, thanks!
Intermediate & Advanced SEO | BenjaminMorel0
-
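One practical way to ground this question: you can check what status code a resolver actually returns without any SEO tooling. This is a standard-library sketch; the URL is a placeholder, not a real SFX endpoint, and the "permanent"/"temporary" interpretation reflects the common SEO reading of 301 vs 302, not a guarantee of how any engine treats a given link.

```python
import http.client
from urllib.parse import urlparse

def redirect_status(url):
    """Request `url` without following redirects; return (status, Location header)."""
    parts = urlparse(url)
    Conn = (http.client.HTTPSConnection if parts.scheme == "https"
            else http.client.HTTPConnection)
    conn = Conn(parts.netloc, timeout=10)
    path = (parts.path or "/") + (("?" + parts.query) if parts.query else "")
    conn.request("HEAD", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

def classify(status):
    """Rough SEO interpretation of a redirect status code."""
    if status in (301, 308):
        return "permanent"  # generally consolidates signals at the target
    if status in (302, 307):
        return "temporary"  # target may not inherit ranking signals
    return "none"

# Example usage (requires network; replace with one of your resolver links):
# status, target = redirect_status("http://your-resolver.example/sfx-link")
# print(status, target, classify(status))
```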
Website architecture - levels vs filters and authority loss - Enterprise SEO
Hi Everyone, I am participating in the development of a marketplace website where the main channel will be traffic via SEO. We have encountered the directories (levels) vs. filters situation. 1. Does everyone still agree that if we have too many levels, authority is lost as you go down through the levels? Does everyone agree that there should be a max of 3 levels and never 4? Example 1: www.domain.com/level1/level2/level3 vs. www.domain.com/level1. In theory, the content on "level3" will have a lower DA than the content on "level1". 2. Does everyone agree that for enterprise SEO (huge marketplace websites) filters are a better idea than levels? Example 2: www.domain.com/level1/level2/level3 vs. www.domain.com/filter-option1. In theory, the content on "level3" will have a lower DA than the content on "filter-option1". Thanks so much in advance
Intermediate & Advanced SEO | Carla_Dawson0
-
If linking to contextual sites is beneficial for SE rankings, what impact does the rel="nofollow" attribute have when applied to these outbound contextual links?
Communities, opinion-formers, even Google representatives, seem to offer a consensus that linking to quality, relevant sites is good practice and therefore beneficial for SEO. Does this still apply when the outbound links are "nofollow"? Is there any good research on this out there?
Intermediate & Advanced SEO | danielpressley0
-
How to lay off your SEO company?
I have decided to replace my SEO company. The point is, this company has also been partly my developer, so they have set up a demo server of my website. 1- Should I be worried about duplicate content when I end my cooperation with this company (the demo server)? 2- Should I be worried that, if they do not like it, they will go and delete all the submitted materials and destroy my pages' rankings? Thanks all
Intermediate & Advanced SEO | AlirezaHamidian0
-
What To Do With Too Many Links?
We have four pages that have over 100 links (danger, danger from what I gather), but they're not spammy footer links. They are FAQ videos for our four main areas of practice. Does that make a difference? If not, should I just take half the questions on each page and make four additional pages? That strikes me as a worse UX, but I don't want to get penalized either. Thanks, Ruben
Intermediate & Advanced SEO | KempRugeLawGroup0
-
Linking to urls with Query Parameters good for SEO?
Hey guys, I am currently buying link ad spots on sites (hardcoded, not using ad networks). I track each link I buy and the sales they generate with query parameters such as: http://www.mydomain.com/?r=top_menu_nav_on_seomoz My question is: do these links still pass link juice? I have my canonical already set to http://www.mydomain.com Also, in Webmaster Tools I have it set to ignore anything after /?r= The way I see it, a link is a link. Naturally I would prefer to send directly to my root domain; however, these links cost a lot of money and I like to track my results. Does anyone have experience with SEO and working with query parameters?
Intermediate & Advanced SEO | CrakJason0
-
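The setup this question describes — tagged URLs that should all consolidate to one canonical — can be sketched in a few lines. The parameter name `r` and the domain come from the question itself; the helper below is a hypothetical illustration of what the canonical tag plus parameter-ignoring is asking search engines to do.

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def strip_tracking(url, params=("r",)):
    """Return `url` with the listed tracking parameters removed,
    i.e. the canonical form the tagged URLs should consolidate to."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in params]
    return urlunparse(parts._replace(query=urlencode(kept)))

tagged = "http://www.mydomain.com/?r=top_menu_nav_on_seomoz"
print(strip_tracking(tagged))  # -> http://www.mydomain.com/
```

Non-tracking parameters survive: `strip_tracking("http://www.mydomain.com/page?r=x&id=7")` keeps `id=7` while dropping `r`.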
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us: http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo and http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions. Specifically, I'm concerned that a) we're blocking the flow of link juice, and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
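Whatever the right policy turns out to be, it's worth verifying exactly what a set of robots.txt rules blocks before leaving them in place for months. Here's a standard-library sketch; the rules and URLs are hypothetical stand-ins modeled on the pagination/sort restrictions described above (note the stdlib parser does plain prefix matching, not Google-style wildcards).

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules blocking deep pagination and sort variants of search results.
rules = """\
User-agent: *
Disallow: /search/page/
Disallow: /search/sort/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

for url in [
    "https://example.com/search",             # first page of results: crawlable
    "https://example.com/search/page/2",      # deeper pages: blocked
    "https://example.com/search/sort/price",  # sort variant: blocked
]:
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")
```

Running a script like this against your real robots.txt and a sample of real URLs makes the trade-off concrete: every "blocked" line is a page that passes no link juice onward and contributes no depth to how the site looks to the crawler.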