New link building: If networks are useless and I need high volume from a one-man team, what's the best option?
-
I work for an online retailer with thousands of product pages, and our vertical is brutal for content -- half of the relevant sites are owned by our competitors.
Are there any new link building strategies that a one-man team can execute? I'm not talking about bots or traditional link networks.
Our current strategy revolves around the following:
1. Link prospecting through BuzzStream tools and individual outreach contacts.
2. Finding bloggers/vloggers, sending them product, and having them link back to our homepage in their reviews (slow turnaround, low juice -- see the sketch after this list for one way to track which links have actually gone live).
3. Syndicating our videos through multiple avenues.
4. Being active on social.
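Since the follow-up on item 2 is the slow part, a small script can at least automate checking which reviews have been published with a working link. Below is a minimal sketch, assuming Python with the requests and beautifulsoup4 packages installed; the domain and review URLs are hypothetical placeholders, not anything from the thread. It fetches each review page and reports whether it links back to your site and whether that link is nofollowed.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

OUR_DOMAIN = "example-retailer.com"  # hypothetical retailer domain

review_urls = [
    "https://example-blog.com/our-product-review",   # hypothetical review pages
    "https://another-vlogger.com/unboxing-post",
]

def find_backlinks(page_url, target_domain):
    """Return (href, rel) pairs of anchors on page_url that point at target_domain."""
    response = requests.get(page_url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    links = []
    for anchor in soup.find_all("a", href=True):
        host = urlparse(anchor["href"]).netloc.lower()
        if target_domain in host:
            links.append((anchor["href"], anchor.get("rel", [])))
    return links

for url in review_urls:
    try:
        backlinks = find_backlinks(url, OUR_DOMAIN)
    except requests.RequestException as err:
        print(f"{url}: could not fetch ({err})")
        continue
    if not backlinks:
        print(f"{url}: no link back to {OUR_DOMAIN} yet")
    for href, rel in backlinks:
        follow = "nofollow" if "nofollow" in rel else "followed"
        print(f"{url}: links to {href} ({follow})")
```

Run weekly against your outreach list and you have a cheap way to see which bloggers still need a nudge, without paying for another tool.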
We need to gain more authority beyond simple content building. Are there any alternatives to link networks that a one-man team can scale?
Many thanks!
-
Thanks Brent!
I figured as much. Those articles are definitely helpful.
-
I wish I had an easy answer. There are a bunch of great posts on SEOmoz about link building:
http://www.seomoz.org/pages/search_results?q=linkbuilding
A few I have bookmarked:
http://www.seomoz.org/blog/10-extraordinary-examples-of-effective-link-bait
http://www.seomoz.org/blog/the-power-of-using-lists-for-link-building
http://www.seomoz.org/ugc/outsource-link-building-like-a-small-seo-company