Leveraging "Powered by" and link spam
-
Hi all,
For reference: The SaaS guide to leveraging the "Powered By" tactic.
My product is an embeddable widget that customers place on their websites (see the example referenced in the link above). A lot of my customers have great domain authority (big brands, .govs, etc.).
I would like to use a "Powered By" link on my widgets to create high quality backlinks.
My question is: if I have identical link text on potentially hundreds of widgets, will this look like link spam to Google?
If so, would setting the link text randomly on each widget to one of a few different phrases (to create some variation) avoid this?
Hope this makes sense, thanks in advance.
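(For illustration, the attribution link baked into an embedded widget usually looks something like the snippet below; the product name, domain, and anchor text here are hypothetical, but this is the element that would repeat verbatim on every customer site.)
<!-- Hypothetical embed code a customer pastes into their page -->
<div id="acme-widget"></div>
<script src="https://widgets.acme-example.com/embed.js" async></script>
<!-- Attribution link rendered by the widget: identical anchor text on every install -->
<a href="https://www.acme-example.com/">Powered by Acme Widgets</a>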
-
I'd definitely recommend not using keyword-rich anchor text. Just use your brand name and diversify your link profile.
-
Dan,
Thanks for taking the time to respond to my question.
Your advice is sound. Matt certainly advises a nofollow; however, at the beginning he cautions against making widget links the primary source of link building in a strategy, and at the end he says that links from widgets don't "carry the same weight" as links freely given.
As such, I wouldn't necessarily expect a blanket penalty for widget links. Rather than abandon widget links entirely, I will instead apply a nofollow to all the links except a hand-selected few on the very best domains (.govs and major brand/media sites).
Hopefully this approach will not raise any red flags (or black hats as the case may be).
Thanks again.
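(For illustration, the default treatment described above, branded anchor text with a nofollow that is dropped only on the hand-selected domains, might look like this; the markup is hypothetical.)
<!-- Hypothetical default attribution link: branded anchor text, nofollowed -->
<a href="https://www.acme-example.com/" rel="nofollow">Powered by Acme Widgets</a>
<!-- On the few hand-selected domains, the same link would simply omit rel="nofollow" -->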
-
I would be very careful about making embeddable widgets an important facet of your link-building campaign. This tactic used to work very well, but it has been on Google's radar for some time now. In August of last year, Matt Cutts said the following: "I would recommend putting a nofollow, especially on widgets." The attached video of him discussing this may be helpful to you as you consider this tactic.
With regard to the anchor text, I would be VERY careful with it if you decide to proceed. I would personally recommend abandoning this tactic (unless there is value outside of link building) and investing in high-quality content instead, but if you do decide to proceed, I would build solely branded anchor text. That is more defensible if a Google engineer ever flags the site: it won't look like you were trying to game the rankings on a keyword, but it may still have a positive impact on the rankings. Even so, I would proceed with caution.
Instead of putting the effort into a widget, I would put it into something that lives on your site (evergreen content) and provides a ton of value to end users. That will attract links and real users.
Hope this helps!
Related Questions
-
Google Webmaster Tools is saying "Sitemap contains urls which are blocked by robots.txt" after HTTPS move...
Hi Everyone, I really don't see anything wrong with our robots.txt file after our https move that just happened, but Google says all URLs are blocked. The only change I know we need to make is changing the sitemap url to https. Anything you all see wrong with this robots.txt file?
Technical SEO | | vetofunk
robots.txt
# This file is to prevent the crawling and indexing of certain parts of your site by web crawlers and spiders run by sites like Yahoo! and Google. By telling these "robots" where not to go on your site, you save bandwidth and server resources.
# This file will be ignored unless it is at the root of your host:
#   Used:    http://example.com/robots.txt
#   Ignored: http://example.com/site/robots.txt
# For more information about the robots.txt standard, see:
#   http://www.robotstxt.org/wc/robots.html
# For syntax checking, see:
#   http://www.sxw.org.uk/computing/robots/check.html

# Website Sitemap
Sitemap: http://www.bestpricenutrition.com/sitemap.xml

# Crawlers Setup
User-agent: *

# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /magento/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /stats/
Disallow: /var/

# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
Disallow: /aitmanufacturers/index/view/
Disallow: /blog/tag/
Disallow: /advancedreviews/abuse/reportajax/
Disallow: /advancedreviews/ajaxproduct/
Disallow: /advancedreviews/proscons/checkbyproscons/
Disallow: /catalog/product/gallery/
Disallow: /productquestions/index/ajaxform/

# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt

# Paths (no clean URLs)
Disallow: /.php$
Disallow: /?SID=
disallow: /?cat=
disallow: /?price=
disallow: /?flavor=
disallow: /?dir=
disallow: /?mode=
disallow: /?list=
disallow: /?limit=5
disallow: /?limit=10
disallow: /?limit=15
disallow: /?limit=20
disallow: /*?limit=250
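(For illustration, the one change the poster mentions, pointing the Sitemap directive at the HTTPS URL, would presumably be a single-line edit:)
Sitemap: https://www.bestpricenutrition.com/sitemap.xml
-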
Can I redirect a link even if the link is still on the site
Hi Folks, I've got a client with a duplicate content problem: they deliberately create duplicate content and store the same piece of content in 2 different places. When they generate this duplicate content, it creates a 2nd link on the site going to the duplicate content. Now they want the 2nd link to always redirect to the first link, but for architecture reasons, they can't remove the 2nd link from the site navigation. We can't use rel-canonical because they don't want visitors going to that 2nd page. Here is my question: Are there any adverse SEO implications to maintaining a link on a site that always redirects to a different page? I've already gone down the road of "don't deliberately create duplicate content" with the client. They've heard me, but won't change. So, what are your thoughts? Thanks!
Technical SEO | | Rock330 -
Redundant categorization - "boys" and "girls" categories. Any suggestions other than implementing filtering?
One of our clients (a children's clothing company) has split their categories (outerwear, tops, shoes) between boys and girls - there's one category page for girls' outerwear and one for boys' outerwear. I suspect that this redundant categorisation is diluting link juice and rankings for the related search queries. Important points: The clothes themselves are rather gender-neutral; girls' sweaters don't differ that much from the boys' sweaters. Our keyword research indicates that Norwegians' search queries are also pretty gender-neutral - people are generally searching for "children's dresses", "shoes for kids", "snowsuits", etc. So these gender-specific categories are not really reflective of people's search behavior. I acknowledge that implementing a filter for "boys" and "girls" would be the best way to solve this redundant categorisation, but that would simply be too expensive for our client. I'm thinking that some sort of canonicalisation would be the best approach to solve this issue. Are there any other suggestions or comments on this?
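(For illustration, the canonicalisation mentioned above would typically be a link element placed in the head of one gendered category page, pointing at whichever version is chosen as the primary; the URLs here are hypothetical.)
<!-- Hypothetical: on the boys' outerwear category page, if the girls' page is chosen as primary -->
<link rel="canonical" href="https://www.example.com/girls-outerwear/" />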
Technical SEO | | Inevo0 -
"non-WWW" vs "WWW" in Google SERPS and Lost Back Link Connection
A Screaming Frog report indicates that Google is indexing a client's site for both www and non-www URLs. To me this means that Google is seeing the two URLs as different even though the page content is identical. The client has not set up a preferred URL in GWMTs. Google says to do a 301 redirect from the non-preferred domain to the preferred version, but I believe there is a way to do this in .htaccess, and an easier solution than canonical.
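(For illustration, the host-level 301 described above is commonly handled in .htaccess with mod_rewrite; the domain below is a placeholder, and www is assumed to be the preferred version.)
# Hypothetical .htaccess rule: 301 all non-www requests to the www hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]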
Technical SEO | | RosemaryB
https://support.google.com/webmasters/answer/44231?hl=en GWMTs also shows that over the past few months this client has lost more than half of their backlinks. (But there are no penalties, and the client swears they haven't done anything to be blacklisted in this regard.) I'm curious as to whether Google figured out that the entire site was in their index under both "www" and "non-www" and therefore discounted half of the links. Has anyone seen evidence of Google discounting links (both external and internal) due to duplicate content? Thanks for your feedback. Rosemary0 -
How unique does a page need to be to avoid "duplicate content" issues?
We sell products that can be very similar to one another. Product example: Power Drill A and Power Drill A1. With these two hypothetical products, the only real difference between the two pages would be a slight change in the URL and a slight modification in the H1/title tag. Are these two slight modifications significant enough to avoid a "duplicate content" flagging? Please advise, and thanks in advance!
Technical SEO | | WhiteCap0 -
Why is this URL showing as "not crawled" on opensiteexplorer, but still showing up in Google's index?
The URL below is showing as "not crawled" on opensiteexplorer.com, but when you Google the title tag "Joel Roberts, Our Family Doctors - Doctor in Clearwater, FL", it shows up in Google's index. Can you explain why this is happening? Thank you. http://doctor.webmd.com/physician_finder/profile.aspx?sponsor=core&pid=14ef09dd-e216-4369-99d3-460aa3c4f1ce
Technical SEO | | nicole.healthline0 -
Value of Twitter Links
Let's ignore the "social metric" value of Twitter links and mentions and look at it from the pure link juice point of view. Twitter accounts such as http://twitter.com/randfish used to have their own PageRank and were treated as separate URLs. Twitter changed that to http://twitter.com/#!/randfish, consolidating all of their content under a single URL. When I search for "randfish" in Google, however, the result is the first URL version. Some clarification on this matter would be much appreciated.
Technical SEO | | Dan-Petrovic0