How can I discover how many of my pages have been indexed by Google?
-
I am currently putting together a report for my corporation, and this is a metric I cannot seem to find in Open Site Explorer. Could anyone help?
-
Hi Vlad,
This video should help: http://www.youtube.com/watch?v=_x1oSjCTKpw
To export your indexed pages to CSV/Excel for your report, see:
http://www.mathewporter.co.uk/list-a-domains-indexed-pages-in-google-docs/
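If you would rather pull the list yourself than go through Google Docs, here is a rough Python sketch of the same idea: run a site: query and write whatever result URLs come back to a CSV. The domain is a placeholder, and the assumption that result links are wrapped as /url?q=... reflects Google's markup at the time of writing, which changes without notice; automated queries may also be rate-limited or blocked, so treat this as an illustration rather than a dependable tool.

# Rough sketch: pull the first page of a site: query and dump the result URLs to CSV.
# Assumptions (not guaranteed): result links are wrapped as /url?q=..., and Google
# tolerates the request. Google changes its markup and may block automated queries.
import csv
import re
import urllib.parse

import requests

DOMAIN = "example.com"  # placeholder - replace with your own domain

resp = requests.get(
    "https://www.google.com/search",
    params={"q": f"site:{DOMAIN}", "num": 100},
    headers={"User-Agent": "Mozilla/5.0"},  # unknown agents tend to get simpler HTML
    timeout=10,
)
resp.raise_for_status()

# Extract the target of every /url?q=<target>&... link on the results page.
targets = set()
for match in re.finditer(r'href="/url\?q=([^"&]+)', resp.text):
    url = urllib.parse.unquote(match.group(1))
    if DOMAIN in url:
        targets.add(url)

with open("indexed_pages.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["indexed_url"])
    for url in sorted(targets):
        writer.writerow([url])

print(f"Wrote {len(targets)} indexed URLs (first results page only).")

Whichever route you take, cross-check the total against the Index Status chart in Webmaster Tools, since the count a site: query reports is only an estimate.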
Hope this helps.
-
Vladimir,
Go into Google Webmaster Tools (GWMT). From the dashboard, go to Google Index, then Index Status.
You will see your site's indexed page count over time.
Best,
Robert
Related Questions
-
Please critique my SEO packages page on my website
Hello, I think there is a lot of room for improvement on the SEO packages (analysis) page on my website. Visit it by clicking here. I've always done SEO for our own business, so I'm new to this. Any help is appreciated. By the way, I'm going to redo the chart on the page - I don't think profit and sales would ever compare like that. Thanks.
Industry News | BobGW1
-
Manual action penalty by Google
Hello, We have a big, well-known brand - www.titanbet.com. The brand is well established and the site has been live for almost 4 years now, ranking very well on some very strong KWs. We received a message from Google on Aug 29th saying "Google has detected a pattern of artificial or unnatural links pointing to your site" and that "Google has applied a manual spam action to titanbet.com/".

In the 2 weeks since the penalty was received we saw some of our major KWs drop in rankings, BUT all brand-related KWs were still ranked 1st. Over the last weekend the penalty has worsened and we no longer rank for any of the brand KWs (we find the site on the 5th page at best). Moreover, when searching in Google for a sentence from any of the pages on the site, we see other sites ahead of us in the SERPs.

Based on the message we originally received from Google we have started cleaning up some of the bad links to the site. We found a lot of links from bad sites - some of them not indexed and probably penalized as well, some from affiliate websites, and some from automatic indexation websites based in China and Russia. We have started reaching out to some of these sites to try and have them remove our links.

We are also worried about duplication of our site. We have found that many other sites (mostly affiliate websites) have copied and in some cases completely duplicated our content, and Google for some reason has chosen to penalize us for this, although we have no control over these other sites. We have run Copyscape to try and figure out which pages are the most problematic, and we will try to re-write the content on those pages. But what if the other sites copy us again?

Any suggestions on the above would be appreciated as we try to understand why Google has penalized us. Thank you, Titan Bet Team
Industry News | Tit
-
Google Changes Up The Search Results Page
Hi Guys, As you know, Google has made changes to the search results page. I have two points to discuss here: 1. Are we going to see more ads in the left sidebar in the future? 2. I think it will also affect the CTR of the top three ads in the SERPs - do you agree? Waiting for your opinions. Reference: http://www.webpronews.com/google-changes-up-the-search-results-page-2012-11
Industry News | SanketPatel1
-
Google Cached "Text Only" version
Is there a way to test what a page would look like in Google's "Text Only" cached version before the page is indexed by Google? Is there a tool out there to help with this?
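One way to get a rough preview locally is to fetch the page and keep only the visible text, skipping scripts and styles. A minimal Python sketch follows; the URL is a placeholder, and this only approximates the text-only cache, since the real cached copy reflects whatever Googlebot actually fetched.

# Rough local approximation of a "text only" view: keep visible text, drop <script>/<style>.
# This is an approximation only - Google's text-only cache shows what Googlebot fetched.
from html.parser import HTMLParser
from urllib.request import urlopen


class TextOnly(HTMLParser):
    def __init__(self):
        super().__init__()
        self.skip_depth = 0  # > 0 while inside <script> or <style>
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth and data.strip():
            self.chunks.append(data.strip())


url = "http://www.example.com/"  # placeholder - the page you want to test
parser = TextOnly()
parser.feed(urlopen(url, timeout=10).read().decode("utf-8", errors="replace"))
print("\n".join(parser.chunks))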
Industry News | activejunky10
-
Google's Current Wave of Updates (4/24 edition)
Edit: Someone beat me to the punch, here is the thread: http://www.seomoz.org/q/google-webspam-algo-update-24-4-12 - let's just discuss it there. So Google has said they are doing another wave of algorithm updates that could impact anywhere from 3-5% of SERPs. I saw it here: http://searchengineland.com/google-launches-update-targeting-webspam-in-search-results-119295 and http://googlewebmastercentral.blogspot.com/2012/04/another-step-to-reward-high-quality.html Has anyone seen any changes? I've heard from a few friends that their sites are bouncing all over the place, which seems to happen a lot during these updates. We might not actually know the fallout for a few days/weeks/months. I saw a few of my smaller sites take a hit, but most of mine have stayed the same. Anyway, what do you guys think? Sometimes an update like this can be a wake-up call to people who think they are doing white hat stuff but may be pushing the envelope a bit too much. Thoughts?
Industry News | vforvinnie0
-
How many total independent SEO professionals are there in the US?
I'm looking for two pieces of data: 1. How many total independent/freelance SEO professionals are there in the US? That is, SEO pros who work directly with their own clients. 2. How many SEO firms are there in the US? I've been scouring all over looking for this data and can't find it. Any help would be much appreciated.
Industry News | jsteimle0
-
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.

Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and Meta NoFollow tags etc. to keep crawlers away from the JavaScript versions of a page. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.

Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following. The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. To make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why below). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request of an "ugly URL" tells the server that it should not return the regular web page it would give to a browser, but an HTML snapshot instead. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment. The diagram in Google's documentation summarizes the agreement.
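To make the pretty/ugly URL handshake concrete, here is a small Python sketch of the rewrite described above. As documented in Google's AJAX crawling guide, a page opts in with a "#!" hash fragment and the crawler substitutes an _escaped_fragment_ query parameter; the URLs below are purely illustrative and the escaping rules are simplified.

# Sketch of the pretty-URL <-> ugly-URL rewrite from Google's AJAX crawling scheme.
# Pages opt in with a "#!" fragment; the crawler requests the _escaped_fragment_ form
# and expects an HTML snapshot back. Escaping is simplified for illustration.
from urllib.parse import quote, unquote


def pretty_to_ugly(url: str) -> str:
    """Rewrite .../page#!key=value into the URL the crawler actually requests."""
    if "#!" not in url:
        return url  # page has not opted in to the scheme
    base, fragment = url.split("#!", 1)
    separator = "&" if "?" in base else "?"
    return f"{base}{separator}_escaped_fragment_={quote(fragment, safe='=&/')}"


def ugly_to_pretty(url: str) -> str:
    """Reverse the rewrite so search results can display the original #! URL."""
    if "_escaped_fragment_=" not in url:
        return url
    base, fragment = url.split("_escaped_fragment_=", 1)
    return f"{base.rstrip('?&')}#!{unquote(fragment)}"


print(pretty_to_ugly("http://www.example.com/index.html#!key=value"))
# -> http://www.example.com/index.html?_escaped_fragment_=key=value

When your server sees _escaped_fragment_ in the query string, it should return the pre-rendered HTML snapshot instead of the normal page; the search results still show the original pretty URL.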
See more in the Getting Started Guide. Make sure you avoid this:
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX: http://catchfree.com/listen-to-music#&tab=top-free-apps-tab and https://www.pivotaltracker.com/public_projects
This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
This is the best resource I have found regarding Google and JavaScript: http://code.google.com/web/ajaxcrawling/ - step-by-step instructions.
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional Resources: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
http://www.google.com/support/webmasters/bin/answer.py?answer=357690
Industry News | webbroi
-
Google Directory no longer available?
Now we will never know what was in the Google Directory. I just clicked on the link... and everything is dead and points you to DMOZ. What does this mean for us? Is DMOZ going to get more editor juice, so submissions are actually reviewed for once? The Yahoo! directory has also been glitching - new submissions have been disabled for over a week now. Any comments?
Industry News | antidanis0