What Backlink Services Do You Use or Recommend?
-
Hi,
I need some help with link building and would like to outsource some of it. Can you recommend any reputable company that's affordable?
Thanks
-
Ha! I've seen those far too often. I'd love to know the process that led a company to win a contract by building links like that. Frankly, I'd be interested to see their proposal.
I've been given the opportunity to look at competitor proposals on several occasions. Most bank on information overload to show off top-secret SEO tools and knowledge of the industry. I once saw a 40-page PDF spitting out data from what seemed like a normal SEOmoz account. I think they get the deal-maker to focus on the data and "wow" them instead of putting the focus on case studies and guarantees.
I just can't imagine why a proposal would ever run over five pages. Alan Weiss even argues that all proposals should be under three. That's harder to do on large projects, but it's still a good rule to follow.
Cheers to tasteful back linking.
-
I think this type of hands-off approach is useful for smaller companies that aren't accountable to shareholders. I can well imagine the annoyance a team of link builders might feel documenting their activities in forensic detail.
There are plenty of examples of blue chips getting burned by "don't ask, don't tell" link building.
Just today I saw a very high-profile UK supermarket chain with a backlink from a picture on "Japansesearthquakepictures.com".
I don't know how much more tasteless link building can get than that.
-
Well, I'm fine giving broad answers to the question (guest blogging, agreements with webmasters, social/forum link building, etc.), but anything more detailed than that is unnecessary.
I recently had a client who wanted to know exactly how many hours I worked, a spreadsheet of every link I built, and so on. This was after we had signed a contract stating that I price based on value and that this kind of reporting would not be part of the engagement. Instead, I offered a guarantee on rankings and a broad description of the links I build (white hat only).
When you outsource SEO, you aren't just buying links; you're also buying the case studies the consultant or company has built up. They know what works, and it doesn't make sense for them to divulge every detail. Alan Weiss says that, at the very least, reporting like this should cost extra. A lot extra. (I agree.)
As long as you check references, portfolio, and case studies, you'll be fine outsourcing to a reputable company or consultant. It just might not be affordable in the way the OP meant. (By "affordable" I mean not initially affordable: my general stance is that every SEO project should return the client's investment, fees included, within one year.)
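To make the one-year rule concrete, here's a minimal back-of-the-envelope sketch. All the numbers are invented for illustration; plug in your own project fee and estimated monthly return.

```python
def payback_months(monthly_return, total_fees):
    """Months until cumulative return covers the SEO fees."""
    months = 0
    recovered = 0.0
    while recovered < total_fees:
        months += 1
        recovered += monthly_return
    return months

# e.g. a $12,000 project returning an estimated $1,500/month in added revenue
print(payback_months(1500, 12000))  # 8 -> pays back within the one-year rule
```

If the estimated payback runs past 12 months, that's a signal to renegotiate scope or price before signing.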
-
This is something I'm considering myself. When I ask SEO companies where they source their links, I've had answers ranging from:
"Our back linking methodology has been approved by the board" or
"It would be infringing on our intellectual property" or
"If we told you, you'd know enough about SEO to go out and do it yourself."
My thought is: if you have a number of responsibilities like I do, set out from the start what you need from an SEO company and keep them on a short leash for the first quarter. You'll get a good feel for whether the links they're building are quality and, equally important, whether they're getting cached and indexed.
As with most things, it comes down to budgets and resources.
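One part of that short-leash check can be automated: periodically confirm that each reported backlink page still actually links to your domain, and whether the link is nofollow. Below is a minimal stdlib-only sketch of the parsing step; the sample HTML and `example.com` domain are made-up, and in practice you'd feed it the fetched HTML of each URL the agency reports.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class BacklinkChecker(HTMLParser):
    """Collects every <a href> that points at the target domain."""
    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self.found = []  # list of (href, is_nofollow) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        # Match the link host against the domain we're verifying
        if urlparse(href).netloc.endswith(self.target_domain):
            rel = attrs.get("rel") or ""
            self.found.append((href, "nofollow" in rel))

def check_page(html_text, target_domain):
    """Return the links to target_domain found in a page's HTML."""
    parser = BacklinkChecker(target_domain)
    parser.feed(html_text)
    return parser.found

# Example: a snippet of a page that supposedly links to us
sample = '<p><a href="http://www.example.com/page" rel="nofollow">us</a></p>'
print(check_page(sample, "example.com"))
# [('http://www.example.com/page', True)]
```

A link that has vanished, or that turns out to be nofollow when you were sold followed links, is exactly the kind of thing worth raising in that first-quarter review. (Checking whether the linking pages are indexed still needs a manual spot check or a search API; that part isn't sketched here.)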
-
Can you recommend any reputable company that's affordable?
You really have to pick if you want reputable or affordable.
I would recommend doing the process yourself, because in most cases it's much cheaper than outsourcing. Every time I've outsourced SEO I ended up getting burned in some shape or form, so I don't have any companies to recommend. Sorry.
If you share more details about your project, the community may be able to advise you on doing the work yourself.