Non-Recognition of Links
-
Hi All,
I asked about a client last month and have had to do some other digging to try to find out what's going on with its Google rankings.
According to our link-building spreadsheet, we have up to 50 links (from 50 domains) in the process of being actioned, and a large proportion of these are already live.
There are two questions:-
1. Open Site Explorer only recognises 3 linking domains. I know other domains exist and are pointing at the site (mostly 'followed'), so what could be the reason OSE doesn't recognise them?
2. What can be done to make these external links more easily discoverable by OSE and, presumably, other crawlers?
Other Points:-
1. I initially thought a crawl-blocking issue might be causing the poor rankings, but Bing/Yahoo rankings are slowly dragging themselves upwards.
2. Robots.txt is not blocking any of the site
3. Pro on-site analysis for the target keyword is 'A'
4. The website's stats per OSE are better than some competitors' in the top 20, except on linking root domains, which is why the point above is important.
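On point 2, here's roughly how I sanity-checked the robots.txt claim with Python's standard-library parser; a minimal sketch only, with placeholder rules and example.com URLs rather than the client's real ones:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules -- swap in the real file's contents.
robots_lines = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(robots_lines)

# A normal content page should be fetchable by any crawler...
allowed = rp.can_fetch("rogerbot", "http://www.example.com/some-page.html")
# ...while the disallowed directory should not be.
blocked = rp.can_fetch("rogerbot", "http://www.example.com/private/page.html")

print(allowed, blocked)  # True False
```

Running this against the client's actual robots.txt confirmed nothing relevant is blocked.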
Link building for other clients has worked really well without hiccups and with generally gradual recognition, so any tips from more experienced folks out there would be greatly appreciated.
Many thanks,
Martin
-
Hi Martin,
You might find it useful to take a look at the Linkscape Update Schedule in case timing is a factor.
I believe Rand outlined the recent changes to the indexing rationale in this webinar: Using Open Site Explorer to Uncover New Marketing Opportunities. If you still have questions then, as Brian suggested, it may be a good idea to lodge a ticket or email the Help Team: help [at] seomoz.org.
Hope that helps,
Sha
-
Hey, if the page you got the link on was interesting enough that you wanted a link on it, then what harm is there in letting the world know about that resource via Twitter, Facebook, or whatever other service you choose? And if it's not worth talking about, or you would be embarrassed to speak of it, then how "quality" was that link anyway?
On the OSE catch-22, gotcha... all I can think of is that perhaps the low-quality sites are not always re-crawled with each update, and thus the new links aren't picked up. An SEOmoz staffer with intimate knowledge of the crawl behavior could better answer that one, though.
Brian
-
Hi guys,
Thanks for the feedback so far. I will definitely be checking GWT and maybe even tweeting out the links. I did think that seemed a little... you know, false, but I guess it's just ensuring Google takes note of the actual page? What do people think? I'm unwilling to Facebook them out, because that's even more 'in your face', and I'm unwilling to spam out 50 domains just to get them indexed. Advice welcomed on these points.
@Brian - yes, I suppose they could be coming from lower-quality domains, but equally many have been pulled from competitors' link data in OSE, so catch-22?
@Theo - I will double-check
@Ross - firmly NO to black hat. I don't do this anyway, but something's clearly screwing up the SEO, so going down that route could permanently jeopardise the site, and that's not what the client's paying for.
-
Like Theo said, I would start with Webmaster Tools (Links to your site > All domains). If they are in there, Google knows about them, and if they have any value to pass through those links, they are passing it.
One other quick note: if you know the pages you are getting links from are all index/follow pages, you may want to double-check that they have actually been indexed (Google search for site:www.the-exact-domain.com/and-page-url.html). If you get no results back, then you know those pages are not in the index (not found yet, or otherwise dropped).
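If you have a lot of linking pages to check, a rough first-pass filter for noindex meta tags might look like this. It's just a sketch: it assumes the name attribute comes before content, and it ignores X-Robots-Tag headers and robots.txt, so treat a False as "probably indexable", not a guarantee:

```python
import re

# Match a <meta name="robots"> tag whose content includes "noindex".
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def looks_noindexed(html):
    """Rough check: does this HTML carry a robots meta noindex?"""
    return bool(NOINDEX_RE.search(html))

print(looks_noindexed('<meta name="robots" content="noindex, follow">'))  # True
print(looks_noindexed('<meta name="robots" content="index, follow">'))    # False
```

You could feed it the fetched HTML of each linking page and flag the ones that come back True before worrying about the rest.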
On the OSE thing, if I am remembering this correctly, Rand said something about how they were focusing the crawl and pulling in fewer low-quality sites - could it be that the domains you are getting links from are low quality?
Brian
-
Hi Martin,
Although OSE is an awesome tool, it is still in its infancy and may not have the capacity to crawl the links you are talking about. Another way to check the links is to have a look via Majestic SEO; they have a much bigger index than OSE and tend to show a good deal more links.
I would also have a look at the Google Webmaster Tools and see if the links are present in there.
If you are worried about the links being crawled and indexed by Google, then take the URL and run it through Google itself with the site: command. If it does not turn up, there is a chance it may not be indexed. I believe a site: query that returns no results may prompt Googlebot to visit the URL and crawl it; I can't confirm this is true, but it makes good sense.
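If you're checking a long list of URLs, a tiny helper to build those site: queries saves typos. This is a hypothetical convenience, just string formatting over the parsed URL, using the placeholder domain from earlier in the thread:

```python
from urllib.parse import urlparse

def site_query(url):
    # Turn a full URL into the exact-page site: query described above.
    parsed = urlparse(url)
    return "site:" + parsed.netloc + parsed.path

print(site_query("http://www.the-exact-domain.com/and-page-url.html"))
# site:www.the-exact-domain.com/and-page-url.html
```

Paste each query into a normal Google search rather than scripting the searches themselves, since automated queries are against Google's terms.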
If you want to be doubly sure your links are getting crawled, you can force a crawl by Google by bookmarking the page through a bookmarking service or sharing it on a social network.
***** WARNING: MESSY BLACK HAT TACTICS COMING UP *****
And if you really want to give it a good ole kick up the jaxie, you can load up an automatic bookmarking tool and bookmark the URL with your link on it over a couple of hundred domains. Problems with this method include:
- you need to buy spammy software like Bookmark Demon
- you are in effect creating a link wheel which may devalue your efforts
- it sticks out like a sore thumb
- links on bookmarking sites drop off the link graph or get devalued very quickly
However, the positives of this technique are that your link will be crawled and indexed, and it will have another couple of hundred links pointing at it... for a while.
If you are working with a client, I would recommend just running it through Facebook or tweeting out the link and staying away from forcing any crawls. However, if it is the middle of November and you have a Christmas shop that needs to rank quickly, get that black hat on.
Hope that helps.
-
The fact that OSE doesn't pick up a link doesn't necessarily mean a link isn't 'active' and giving your site value. Even though Linkscape captures a vast amount of URLs, it only crawls a portion of the web, most likely from the bigger pages down. If many of these links to your site are coming from smaller / less powerful domains, they might not (yet) have been picked up by Linkscape.
Try looking at Google Webmaster Central to see if the links are included there. If Google lists them as links, they are very likely to be counted as well.