Non-Recognition of Links
-
Hi All,
I asked about a client last month and have had to do some other digging to try to find out what's going on with its Google rankings.
According to our link-building spreadsheet, we have up to 50 links (from 50 domains) being actioned, and a large proportion of these are already live.
There are two questions:-
1. Open Site Explorer only recognises 3 domains - as I know that other domains exist and are pointing to the site (mostly 'followed'), what could be the reason OSE doesn't recognise them?
2. What can be done to make these external links more easily accessible to OSE and, presumably, other bots?
Other Points:-
1. I initially thought a crawl-blocking issue might be causing the rankings problem, but the Bing/Yahoo rankings are slowly dragging themselves upwards.
2. Robots.txt is not blocking any of the site
3. Pro on-site analysis for the target keyword is 'A'
4. The website's stats per OSE are better than those of some competitors in the top 20, except on the root-domain issue, which is why the above point is important.
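On the robots.txt point above, a quick way to sanity-check that nothing is being blocked is Python's standard-library robots.txt parser. This is just a sketch - the domain and page below are placeholders, not the client's actual URLs, and the rules are parsed inline rather than fetched from the live site:

```python
from urllib import robotparser

# Hypothetical sketch: confirm robots.txt isn't blocking a given page.
# "www.example-client.com" is a placeholder domain. In practice you would
# use rp.set_url(".../robots.txt") and rp.read() against the live file;
# here the rules are parsed inline so the example is self-contained.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",  # an empty Disallow blocks nothing
])

print(rp.can_fetch("*", "http://www.example-client.com/some-page.html"))  # True
```

If `can_fetch` returns False for any page you care about, the robots.txt is blocking it after all.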
Link building for other clients has worked really well without hiccups, with general gradual recognition, so any tips from more experienced folks out there would be greatly appreciated.
Many thanks,
Martin
-
Hi Martin,
You might find it useful to take a look at the Linkscape Update Schedule in case timing is a factor.
I believe Rand outlined the recent changes to the indexing rationale in this webinar: Using Open Site Explorer to Uncover New Marketing Opportunities. If you still have questions then, as Brian suggested, it may be a good idea to lodge a ticket or email the Help Team: help [at] seomoz.org.
Hope that helps,
Sha
-
Hey, if the page you got the link on was interesting enough that you wanted a link on it, then what harm is there in letting the world know about that resource via Twitter, Facebook, or whatever other service you choose? ...and if it's not worth talking about, or you'd be embarrassed to speak of it, then how "quality" was that link anyway?
On the OSE Catch-22, gotcha... all I can think of is that perhaps the low-quality sites are not always re-crawled with each update, so the new links aren't picked up. An SEOmoz staffer with intimate knowledge of the crawl behaviour could better answer that one, though.
Brian
-
Hi guys,
Thanks for the feedback so far. I will definitely be checking GWT and maybe even tweeting out the links. I did think that seemed a little bit... you know, false - but I guess it's just ensuring Google takes note of the actual page? What do people think? I'm unwilling to Facebook them out, because that's even more 'in your face', and I'm unwilling to spam out 50 domains just to get them indexed. Advice welcomed on these points.
@Brian - yes, I suppose they could be coming from lower-quality domains, but equally many have been pulled from competitor link data in OSE, so Catch-22?
@Theo - I will double-check
@Ross - firmly NO to black hat. I don't do this anyway, but equally something is already hurting the SEO, so going down that route could permanently jeopardise the site, and that's not what the client's paying for.
-
Like Theo said, I would start with Webmaster Tools (Links to your site > All domains). If the links are in there, Google knows about them, and if they have any value to pass through, they are passing it.
One other quick note: if you know the pages you are getting links from are all index, follow pages, you may want to double-check that they have actually been indexed (Google search for site:www.the-exact-domain.com/and-page-url.html). If you get no results back, then those pages are not in the index (not found yet, or otherwise dropped).
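If you have a whole spreadsheet of linking pages to check, a tiny script can turn them into ready-to-paste site: queries. This is only a sketch - the URLs in the list are placeholders, and you'd paste each printed query into Google by hand rather than automating the searches:

```python
# Hypothetical sketch: turn a list of linking pages into "site:" queries
# you can paste into Google to check whether each page is in the index.
# The URLs below are placeholders, not real linking pages.
linking_pages = [
    "http://www.the-exact-domain.com/and-page-url.html",
    "https://www.another-linker.com/resources/",
]

def site_query(url):
    # Strip the scheme; Google's site: operator doesn't need it.
    bare = url.split("://", 1)[-1]
    return "site:" + bare

for url in linking_pages:
    print(site_query(url))
```

Any query that comes back with no results points at a linking page Google hasn't indexed (or has dropped).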
On the OSE thing, if I am remembering this correctly, Rand said something about how they were focusing the crawl and pulling in fewer low-quality sites - could it be that the domains you are getting links from are low quality?
Brian
-
Hi Martin,
Although OSE is an awesome tool, it is still in its infancy and may not have the capacity to crawl the links you are talking about. Another way to check the links is via Majestic SEO; they have a much bigger index than OSE and tend to show a good deal more links.
I would also have a look at the Google Webmaster Tools and see if the links are present in there.
If you are worried about the links being crawled and indexed by Google, then take the URL and run it through Google itself with the site: command. If it does not turn up, there is a chance it is not indexed. I also believe that a site: query that returns no results may send Googlebot to the URL to crawl it - I can't confirm this is true, but it just makes good sense.
If you want to be doubly sure your links are getting crawled, you can encourage a crawl by Google by bookmarking the page through a bookmarking service or sharing it on a social network.
WARNING, MESSY BLACK HAT TACTICS COMING UP*****
And if you really want to give it a good ole kick up the jaxie, you can load up an automatic bookmarking tool and bookmark the URL with your link on it across a couple of hundred domains. Problems with this method include:
- need to buy spammy software like bookmark demon
- you are in effect creating a link wheel which may devalue your efforts
- it sticks out like a sore thumb
- links on bookmarking sites drop off the link graph or get devalued very quickly
However, the positives of this technique are that your link will be crawled and indexed, and it will have another couple of hundred links pointing at it... for a while.
If you are working with a client, I would recommend just running it through Facebook or tweeting out the link and staying away from forcing any crawls. However, if it is the middle of November and you have a Christmas shop that needs to rank quickly, get that black hat on.
Hope that helps.
-
The fact that OSE doesn't pick up a link doesn't necessarily mean the link isn't 'active' and passing value to your site. Even though Linkscape captures a vast number of URLs, it only crawls a portion of the web, most likely from the bigger pages down. If many of these links to your site are coming from smaller / less powerful domains, they might not (yet) have been picked up by Linkscape.
Try looking at Google Webmaster Central to see if the links are included there. If Google lists them as links, it is very likely counting them as well.