How do YOU use site explorer?
-
I normally use Open Site Explorer to identify links that my clients' competitors have, and sometimes this gives me what I call 'some low hanging fruit' to go after (and, of course, links that are more challenging to get).
I don't know why this didn't occur to me sooner. If my client is a chiropractor, why not look at the links for 50 or 100 of the top-ranking chiropractic sites all over the US? This would HAVE to uncover a wealth of blogs to comment on that have good authority, plus industry associations, publications, forums - a whole range of opportunities.
It made me wonder: how many people use Site Explorer the way I have been (looking at your client's top 3-4 competitors), versus identifying links pointing to LOTS of competitors? How do you use it?
Couldn't you almost base an entire link building campaign on OSE? And if not, why would that be a bad idea?
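The many-competitors idea above can be sketched as a simple aggregation over exported backlink reports. This is a rough illustration only, assuming each export is a CSV with a "URL" column for the linking page (actual column names vary by tool and export):

```python
import csv
from collections import Counter
from urllib.parse import urlparse

def common_linking_domains(csv_paths, min_competitors=3):
    """Count how many competitor backlink exports each linking root
    domain appears in. Domains that link to several competitors at
    once are likely industry-relevant link opportunities."""
    counts = Counter()
    for path in csv_paths:
        seen = set()  # count each domain once per competitor export
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                # Assumes a "URL" column holding the linking page.
                domain = urlparse(row["URL"]).netloc.lower().removeprefix("www.")
                seen.add(domain)
        counts.update(seen)
    return [(d, n) for d, n in counts.most_common() if n >= min_competitors]
```

A domain that shows up across many of the 50-100 competitor exports is exactly the kind of 'low hanging fruit' worth checking first.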
Just some random thoughts. THE WEEKEND IS ALMOST HERE - Have a great day everybody!
-
One way I use it is to get ideas for the types of links I might be able to get for my husband's business (selling model RC warships). There are only a couple of competitors for the fiberglass ship hulls in the US, so I look at their backlink profiles. Just this evening I was browsing through and saw that my competitor had a link from a crew website for people who had served on a certain ship, aimed at anyone interested in building a model of that ship.
We don't carry the same hull so we couldn't get a link on that page, but we do have about 35 hulls, so there are opportunities for several links from other sites.
-
In my case, I would take just the top 2 competitors with all of their links, then refine each link manually - deciding whether it's worth considering or should simply be ignored. In the end you get a clean list of link opportunities where you need to build links to compete with your competitors. I won with THIS - yes, you can too!
-
You could build a link campaign entirely around OSE, but I would also look at other options that competitors haven't thought of.
-
We use OSE exactly like you do, but we also use it as an education tool for our clients. Many people have misconceptions about SEO, and if you take the time to explain the basics of what you are actually doing, they will see the value.
-
Hi,
Sure you could. We use it to export link reports to Excel and then filter by mozRank. We'll do that for every site that comes up in Google under terms we want to compete for. Next, we'll add a yes/no column in Excel to qualify the links in terms of whether we can/should go after them. It's a great tool for wrapping your head around what other sites are doing, where you stand relative to them, and how to get into that "link circle", so to speak.
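The Excel step above can also be scripted. A minimal sketch, assuming the export has a numeric metric column - "mozRank" here is an assumption, so swap in whatever column your CSV actually contains:

```python
import csv

def qualify_links(in_path, out_path, metric="mozRank", min_metric=4.0):
    """Pre-fill a yes/no 'qualify' column on an exported link report.

    The metric column name is an assumption (exports vary); the yes/no
    call still gets refined by hand afterwards, row by row.
    """
    with open(in_path, newline="") as f:
        rows = list(csv.DictReader(f))
    for row in rows:
        # Mark links that clear the threshold as worth a closer look.
        row["qualify"] = "yes" if float(row[metric]) >= min_metric else "no"
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
    return rows
```

This only pre-sorts the spreadsheet; the judgment call on whether to actually pursue each link stays manual, as described above.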