Open Site Explorer doesn't update correctly
-
We have been using OSE for a little over two months now and our link profile hasn't changed, although we have been getting lots of backlinks from other blogs and web pages.
Google Webmaster Tools displays these links but OSE, even after the updates, doesn't.
Our domain is about one year old and has had indexed content on it since June of this year. Google Webmaster Tools shows 24 links to our domain, but OSE shows only 3. The domain is http://www.wellbo.de
-
Yep, I found Majestic pretty useful too; I am also now using a combination of sources.
-
I posted the same question on the latest blog post about the Linkscape update and got some great advice. Essentially, don't rely on just one source (even though SEOmoz rocks) for reviewing backlinks and authority. Someone referred me to Majestic SEO, and so far it tracks very closely with Yahoo Site Explorer (YSE) and Webmaster Tools (WT). Check it out:
The free account will let you explore basic backlink stats for any URL.
Good Luck - Kyle
-
Can't say I'm happy with OSE; it seems inaccurate to me.
-
OSE can take up to 60 days to find new links. The updates are based on the Linkscape crawl of the web, which takes 2-3 weeks to complete. Once the crawl is complete, it takes another 1-2 weeks to process and publish the data. Depending on when your link is published and the importance of the web page involved (i.e., its PA/DA), the link may not be discovered during the current crawl cycle.
Also keep in mind that Linkscape only crawls roughly the top 25% of web pages on the internet. For the most part, if a link does not appear in OSE after two update cycles, the link has little to no value.
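To put the timing above in rough numbers: a link has to wait for the next crawl to begin, sit through the crawl itself (2-3 weeks), and then wait through processing and publishing (1-2 weeks), which is how you get to roughly 60 days in the worst case. A small sketch of that arithmetic in Python; the wait-for-next-crawl figure and the example publication date are assumptions, not official numbers:

```python
from datetime import date, timedelta

# Worst-case windows, taken from the explanation above (the wait for the
# next crawl to start is an assumption of up to one full crawl cycle).
WAIT_FOR_NEXT_CRAWL_WEEKS = 3
CRAWL_WEEKS = 3
PROCESSING_WEEKS = 2

def worst_case_visible(link_published: date) -> date:
    """Latest date by which a crawlable link should make it into an index update."""
    total = WAIT_FOR_NEXT_CRAWL_WEEKS + CRAWL_WEEKS + PROCESSING_WEEKS
    return link_published + timedelta(weeks=total)

# Hypothetical example: a link that went live in mid-October
print(worst_case_visible(date(2010, 10, 15)))  # roughly two months later
```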
The index was last updated on November 28th and will be updated again on January 4th. The update calendar can be seen here: http://seomoz.zendesk.com/entries/345964-linkscape-update-schedule
The previous update was on November 2nd. If you are aware of a link that was definitely in place in October but does not appear in OSE, please share it and we can offer feedback. Most often the link is on a page with little importance. A few other likely possibilities:
-
the page is noindexed or blocked by robots.txt (a quick way to check both is sketched after this list)
-
the page is buried deep on the site
-
the page is an island, with no links pointing to it
-
the site has indexing / navigation issues
-
the site has low DA/PR and the page is too many clicks from the home page to be crawled.
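If you want to check the first item in that list yourself, here is a minimal sketch using only the Python standard library. It covers just two causes, robots.txt blocking and a meta robots noindex tag, and the URL at the bottom is a placeholder:

```python
import re
import urllib.parse
import urllib.request
import urllib.robotparser

def page_open_to_crawlers(page_url: str, user_agent: str = "*") -> bool:
    """Return False if the linking page is disallowed in robots.txt or
    carries a meta robots noindex tag -- two common reasons a link on it
    never makes it into a link index."""
    parts = urllib.parse.urlsplit(page_url)

    # 1. Is the page disallowed by the site's robots.txt?
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    if not rp.can_fetch(user_agent, page_url):
        return False

    # 2. Does the page itself carry a noindex meta robots tag?
    #    (A crude regex check; it assumes the name attribute comes first.)
    with urllib.request.urlopen(page_url) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    noindex = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
        html,
        re.IGNORECASE,
    )
    return noindex is None

# Placeholder URL -- substitute the page that is supposed to link to you
print(page_open_to_crawlers("http://www.example.com/some-linking-page.html"))
```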
-
Related Questions
-
'duplicate content' on several different pages
Hi, I have a website with 6 pages identified as 'duplicate content' because they are very similar. The pages look alike because they share the same layout and each shows only a few pictures of its product category, but they are not exactly the same. So, is there any way to indicate to Google that the content is not duplicated? I guess they have been marked as duplicate because the code is 90% or more the same across the 6 pages. I've been reviewing the 'canonical' method, but I think it is not appropriate here since the content is not the same. Any advice (other than adding more content)?
Technical SEO | | jcobo0 -
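One way to sanity-check how much of that 90% overlap is template rather than real content is to compare the visible text of two of the flagged pages instead of the raw HTML. A rough sketch (Python standard library only; the two URLs are placeholders):

```python
import difflib
import re
import urllib.request

def visible_text(url: str) -> str:
    """Crudely strip scripts, styles, and tags to approximate the visible text."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", text).strip()

def text_similarity(url_a: str, url_b: str) -> float:
    """Ratio between 0 and 1; values near 1 mean the pages read as near-duplicates."""
    return difflib.SequenceMatcher(None, visible_text(url_a), visible_text(url_b)).ratio()

# Placeholder URLs -- substitute two of the six flagged category pages
print(text_similarity("http://www.example.com/category-1", "http://www.example.com/category-2"))
```

If the visible-text ratio is much lower than the raw-HTML overlap, the duplication is mostly boilerplate rather than content.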
Should I add 'nofollow' to site wide internal links?
I am trying to improve the internal linking structure on my site and ensure that the most important pages have the most internal links pointing to them (which I believe is the best strategy from Google's perspective!). I have a number of internal links in the page footer going to pages such as 'Terms and Conditions', 'Testimonials', 'About Us' etc. These pages, therefore, have a very large number of links going to them compared with the most important pages on my site. Should I add 'nofollow' to these links?
Technical SEO | | Pete40 -
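As background for that decision, it can help to see the actual distribution of internal links. A rough sketch that crawls a small sample of one site and tallies how many internal links point at each URL (Python standard library, naive regex link extraction, placeholder start URL):

```python
import collections
import re
import urllib.parse
import urllib.request

def internal_link_counts(start_url: str, max_pages: int = 50) -> collections.Counter:
    """Breadth-first crawl of one site, counting internal links per target URL."""
    domain = urllib.parse.urlsplit(start_url).netloc
    seen = {start_url}
    queue = [start_url]
    counts = collections.Counter()
    crawled = 0
    while queue and crawled < max_pages:
        page = queue.pop(0)
        crawled += 1
        try:
            html = urllib.request.urlopen(page).read().decode("utf-8", errors="ignore")
        except Exception:
            continue  # skip pages that fail to fetch
        for href in re.findall(r'href=["\'](.*?)["\']', html):
            target = urllib.parse.urljoin(page, href).split("#")[0]
            if urllib.parse.urlsplit(target).netloc != domain:
                continue  # ignore external links
            counts[target] += 1
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return counts

# Placeholder start URL -- prints the ten most internally linked pages
for url, n in internal_link_counts("http://www.example.com/").most_common(10):
    print(n, url)
```

Footer pages such as Terms and Conditions will almost always top that list simply because they are linked from every page.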
My beta site (beta.website.com) has been inadvertently indexed. Its cached pages are taking traffic away from our real website (website.com). Should I just "NO INDEX" the entire beta site and if so, what's the best way to do this? Please advise.
My beta site (beta.website.com) has been inadvertently indexed. Its cached pages are taking traffic away from our real website (website.com). Should I just "NO INDEX" the entire beta site and if so, what's the best way to do this? Are there any other precautions I should be taking? Please advise.
Technical SEO | | BVREID0 -
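Whatever mechanism ends up being used for the beta subdomain (a meta robots noindex tag or an X-Robots-Tag response header are the usual options), it is worth verifying that every beta URL actually sends the signal. A minimal sketch, with placeholder beta URLs:

```python
import re
import urllib.request

def sends_noindex(url: str) -> bool:
    """True if the URL signals noindex via an X-Robots-Tag header
    or a meta robots tag in the returned HTML."""
    with urllib.request.urlopen(url) as resp:
        header = resp.headers.get("X-Robots-Tag", "")
        html = resp.read().decode("utf-8", errors="ignore")
    if "noindex" in header.lower():
        return True
    return bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
        html,
        re.IGNORECASE,
    ))

# Placeholder URLs -- substitute a representative sample of beta pages
for url in ["http://beta.example.com/", "http://beta.example.com/pricing"]:
    print(url, sends_noindex(url))
```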
Redirect old URLs from referring sites?
Hi, I have just come across some URLs from the previous web designer, and the site structure has now changed. There are some links on the web, however, that still point at the old deep URLs. Without having to contact each site, is there a way to automatically redirect links from the old structure, www.mydomain.com/show/english/index.aspx, to just www.mydomain.com? Many thanks
Technical SEO | | ocelot0 -
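Assuming the fix ends up being server-side 301 redirects (the usual approach, configured in the site's rewrite rules or CMS rather than by contacting each linking site), a small check like the sketch below can confirm that each old URL really returns a 301 to the intended target. The mapping is a placeholder built from the example URL in the question:

```python
import http.client
import urllib.parse

# Placeholder mapping of old deep URLs to their new destinations
REDIRECTS = {
    "http://www.mydomain.com/show/english/index.aspx": "http://www.mydomain.com/",
}

def check_redirect(old_url: str, expected: str) -> None:
    """Request the old URL without following redirects and report what it returns."""
    parts = urllib.parse.urlsplit(old_url)
    conn = http.client.HTTPConnection(parts.netloc)
    conn.request("GET", parts.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location", "")
    ok = resp.status == 301 and location.rstrip("/") == expected.rstrip("/")
    print(old_url, "->", resp.status, location, "OK" if ok else "CHECK")
    conn.close()

for old, new in REDIRECTS.items():
    check_redirect(old, new)
```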
Site command
How reliable is the site: command? Is there any other way to check indexed pages?
Technical SEO | | gmk15670 -
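One baseline that doesn't depend on the site: command is to count the URLs submitted in the XML sitemap and compare that figure against the indexed count reported in Webmaster Tools. A minimal sketch (the sitemap URL is a placeholder, and it assumes a plain sitemap rather than a sitemap index file):

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_url: str) -> list:
    """Return the <loc> entries from a standard XML sitemap."""
    with urllib.request.urlopen(sitemap_url) as resp:
        root = ET.fromstring(resp.read())
    return [loc.text for loc in root.iter(f"{SITEMAP_NS}loc")]

# Placeholder sitemap URL
urls = sitemap_urls("http://www.example.com/sitemap.xml")
print(len(urls), "URLs submitted; compare with the indexed count in Webmaster Tools")
```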
I have a WordPress site with 30+ categories and about 2k tags. I'd like to bring that number down for each taxonomy. What is the proper practice to do that?
I want to bring my categories down to about 8 or so, and the tags... they're just a mess, and I'd really like to bring that figure down significantly and set up a standard for usage. My thought was to remove the unneeded tags and categories and set up 301 redirects for the ones I'm removing. Is that even necessary? Are there tools that can assist with this? What are the "gotchas" I should be aware of? Thanks!
Technical SEO | | digisavvy1 -
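If the cleanup does go the 301 route, one low-effort approach is to keep a single mapping of retired tag/category slugs to the slugs that replace them and generate the redirect list from it rather than writing each rule by hand; many WordPress redirect plugins can import a source/target list like this. A sketch with made-up slugs:

```python
import csv
import sys

# Made-up example mapping: retired tag slugs -> the slug that replaces them
MERGE_MAP = {
    "wordpress-tips": "wordpress",
    "wp-howto": "wordpress",
    "seo-news": "seo",
}

# Emit source/target pairs as relative URLs, e.g. for a redirect plugin import
writer = csv.writer(sys.stdout)
for old_slug, new_slug in sorted(MERGE_MAP.items()):
    writer.writerow([f"/tag/{old_slug}/", f"/tag/{new_slug}/"])
```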
Different version of site for "users" who don't accept cookies considered cloaking?
Hi, I've got a client with lots of content that is hidden behind a registration form; if you don't fill it out, you cannot proceed to the content. As a result it is not being indexed. No surprises there. They are only doing this because they feel it is the best way of capturing email addresses, not because they need to "protect" the content. Currently, users arriving on the site are redirected to the form if they have not previously had a "this user is registered" cookie set. If the cookie is set, they aren't redirected and get to see the content. I am considering changing this logic to redirect users to the form only if they accept cookies but don't have the "this user is registered" cookie. The idea is that search engines would then not be redirected and would index the full site, not the dead-end form. From the client's perspective this would mean only very few non-registered visitors would "avoid" the form, yet search engines are arguably not being treated as a special case. So my question is: would this be considered cloaking / put the site at risk in any way? (They would prefer not to go down the First Click Free route, as this will lower their email sign-ups.) Thank you!
Technical SEO | | TimBarlow0 -
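As a sketch of the redirect logic being described (it takes no position on the cloaking question itself), this is roughly what "send visitors to the form only if they accept cookies but are not registered" looks like. Flask is used purely for illustration, and the cookie names and routes are made up:

```python
from flask import Flask, make_response, redirect, request

app = Flask(__name__)

@app.route("/article/<slug>")
def article(slug):
    if request.cookies.get("registered") == "1":
        return f"Full article content for {slug}"  # registered visitor
    if request.cookies.get("accepts_cookies") is None:
        # First request from this client: we don't yet know whether it keeps
        # cookies. Serve the content and drop a test cookie; crawlers, which
        # generally don't return cookies, stay on this branch and see content.
        resp = make_response(f"Full article content for {slug}")
        resp.set_cookie("accepts_cookies", "1")
        return resp
    # The test cookie came back: a cookie-accepting, unregistered visitor,
    # so send them to the registration form.
    return redirect("/register")
```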
How long does it take open site explorer to recognize new links?
I'm building a steady link profile for one of my websites, and the new links still haven't shown up in Open Site Explorer even after two months. How long does it take OSE to recognize new backlinks?
Technical SEO | | C-Style2