How do I find out which pages are being indexed on my site and which are not?
-
Hi,
I'm doing my first technical audit on my site. I'm learning how to do an audit as I go and am a bit lost. I know some pages won't be indexed, but how do I:
1. Check the site for all pages, both indexed and not indexed
2. Run a report showing indexed pages only (I'm presuming I can do this via Screaming Frog or Webmaster Tools)
3. Compare the two lists and work out which pages are not being indexed
I'll then need to figure out why. I'll cross that bridge once I get to it.
Thanks, Ben
-
Hi Ben,
I'd echo what Patrick has said, and I'd recommend his first suggestion most of all. Google Webmaster Tools is a good way of checking indexation, and if you have a large site with lots of categories, you can even break the sitemaps down by category so you can see whether certain areas are having problems.
Here is an old, but still relevant post on the topic:
http://www.branded3.com/blogs/using-multiple-sitemaps-to-analyse-indexation-on-large-sites/
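To give a rough idea of the approach, a category-split sitemap index might look something like this (the URLs here are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-products.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-categories.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-blog.xml</loc>
      </sitemap>
    </sitemapindex>

You'd then submit each child sitemap in Webmaster Tools and compare the submitted vs. indexed counts per sitemap to see which areas are struggling.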
In terms of creating the sitemap, Screaming Frog has an option under Advanced Export for generating an XML sitemap file for you, which works very well. You just need to make sure you only include pages that you want indexed.
Cheers.
Paddy
-
Hi Patrick,
Thanks for replying.
Can you recommend any tools for creating the sitemap? I've had a look around, and the few I've found all seem to deliver different results. One has been submitted previously, so I need to go through the process myself so I can understand these basics.
I've read up on robots.txt, so I understand what's happening there from an exclusion perspective, and once I understand how the XML sitemap works I'll be able to do the audit mentioned above.
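For example, as I understand it, the exclusion side is just a few directives, something like this (the paths are made-up examples):

    User-agent: *
    Disallow: /admin/
    Disallow: /internal-search/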
Ben
-
Ben,
You can check a couple of things:
- Have you submitted your XML sitemap to Google? If not, create one and get it submitted so you tell Google which pages you want indexed.
- Submit your domain and pages through Google Webmaster Tools as well (log in > left sidebar > Crawl > Fetch as Google).
- Screaming Frog is awesome software, so yes, if you have it, use it to crawl your pages.
- Try a simple "site:domainname.com" search in Google to see what is being indexed from your domain.
Cross-reference it all and you will then have a better understanding. I do believe your sitemap is crucial in telling Google exactly which pages you want indexed (and leaving out the ones you don't), and Google will generally follow it. You're on the right track, and I hope my input was helpful!
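If it helps with step 3 of your original list, here's a minimal sketch of the comparison in Python. It assumes you've exported each list to a plain-text file, one URL per line; the file names are just placeholders:

    # Compare a full crawl (e.g. a Screaming Frog export) against a list of
    # URLs confirmed as indexed (e.g. collected from a site: search).
    def load_urls(path):
        """Read a file of URLs, one per line, normalising trailing slashes."""
        with open(path) as f:
            return {line.strip().rstrip("/") for line in f if line.strip()}

    crawled = load_urls("crawl_export.txt")    # every page found on the site
    indexed = load_urls("indexed_pages.txt")   # pages seen in Google's index

    not_indexed = crawled - indexed
    print(f"{len(not_indexed)} crawled pages appear to be missing from the index:")
    for url in sorted(not_indexed):
        print(url)

- Patrick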