Site command
-
How reliable is the site: command?
Is there any other way to check how many pages are indexed?
-
Thank you all.
According to GWT, crawled pages per day increased dramatically in recent weeks.
Traffic increased, and the number of pages driving traffic also increased.
So I'm trying to narrow down the real impact.
-
Hi gmkrish, Mr. Fishkin actually had an awesome post on this:
http://www.seomoz.org/blog/indexation-for-seo-real-numbers-in-5-easy-steps
Cheers!
-
If you are looking for a specific page you can use the info command as follows:
info:www.yoursite.com/specific-page.html
That will return all info (if any) that Google holds on the URL in question. Where no information is available, the URL is not indexed.
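To spot-check more than a handful of pages this way, you could pull the URL list from your sitemap and generate the info: queries from it. A minimal sketch, assuming a standard sitemaps.org-format sitemap (the sample URLs are placeholders):

```python
# Sketch: list a sitemap's URLs and build the corresponding info: queries
# for spot-checking indexation. Assumes a sitemaps.org-format sitemap.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return all <loc> URLs found in the sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(NS + "loc")]

def info_query(url):
    """Build the info: query for a URL, with the scheme dropped as above."""
    return "info:" + url.split("://", 1)[-1]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.yoursite.com/</loc></url>
  <url><loc>http://www.yoursite.com/specific-page.html</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    print(info_query(url))
```

The sitemap count also gives you a trustworthy baseline to compare against the totals reported by site: or Webmaster Tools.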
In my experience, whilst the site: command can be a bit misleading, Google Webmaster Tools usually gives you solid results. I can't say I have rigorously tested this, but it seems reliable.
Marcus
-
Hi there, the site: command doesn't give a full representation of a website's indexed pages. It's better to check within Google Webmaster Tools to see how many pages have actually been indexed; that figure isn't always 100% accurate either, but it's more so than the site: command. Hope that helps. Regards, Simon
Related Questions
-
Moving site from html to Wordpress site: Should I port all old pages and redirect?
Any help would be appreciated. I am porting an old legacy .html site, which has about 500,000 visitors/month and over 10,000 pages, to a new custom WordPress site with a responsive design (long overdue, of course) that has been written and only needs a few finishing touches, and which includes many database features to generate new pages that did not previously exist. My questions are:
1. Should I bother to port over older pages that are "thin" and have no incoming links, such that reworking them would take time away from the need to port quickly?
2. I will be restructuring the legacy URLs to be lean and clean, so 301 redirects will be necessary. I know that there will be link juice loss, but how long does it usually take for the redirects to "take hold"?
3. I will be moving to https at the same time to avoid yet another porting issue.
Many thanks for any advice and opinions as I embark on this massive data entry project.
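For the redirect side, a minimal sketch of generating the 301 map up front. The slug-cleaning rules in `new_path` are hypothetical; adapt them to however the new WordPress URLs are actually structured:

```python
# Sketch: build a 301 redirect map from legacy .html URLs to cleaned-up
# WordPress-style paths. The slug rules below are invented examples.
import re

def new_path(old_path):
    """Turn e.g. /Some_Old_Page.html into /some-old-page/ (hypothetical rule)."""
    slug = old_path.rsplit("/", 1)[-1]
    slug = re.sub(r"\.html?$", "", slug)          # drop the .html extension
    slug = re.sub(r"[_\s]+", "-", slug).lower()   # normalise separators
    return "/%s/" % slug

def redirect_map(old_paths):
    """Return (old, new) pairs, ready to dump as rewrite rules or a CSV."""
    return [(p, new_path(p)) for p in old_paths]

for old, new in redirect_map(["/Widget_Guide.html", "/About_Us.html"]):
    print("%s -> %s" % (old, new))
```

Having the full map as data also lets you verify every old URL has exactly one destination before the switch.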
Technical SEO | | gheh20130 -
Site Link Issues
For several search terms I get sitelinks for the page http://www.waikoloavacationrentals.com/kolea-rentals/kolea-condos/. It makes sense that that page would be a sitelink, as it is one of my most-used pages, but the problem is Google gave it the sitelink title "Kolea 10A". I am having zero luck making any sense of why that was chosen. It should be something like "Kolea Condos" or something of that nature. Does anyone have any thoughts on where Google is coming up with this?
Technical SEO | | RobDalton0 -
How to Switch My Site to HTTPS in GWT?
I recently bought an SSL certificate and moved my site over to HTTPS. Now how do I make the change in Google Webmaster Tools?
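As far as I know there is no switch for this inside Webmaster Tools itself; you add the https:// version of the site as a separate property, 301 the http URLs to their https equivalents, and submit a sitemap listing the https URLs under the new property. A minimal sketch of that sitemap-side rewrite, assuming only the scheme changes:

```python
# Sketch: rewrite sitemap URLs from http to https before resubmitting
# the sitemap under the new https property. Assumes the host and paths
# stay identical; example.com is a placeholder.
def to_https(url):
    """Swap an http:// scheme for https://; leave other URLs untouched."""
    if url.startswith("http://"):
        return "https://" + url[len("http://"):]
    return url

urls = ["http://www.example.com/", "http://www.example.com/page.html"]
print([to_https(u) for u in urls])
```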
Technical SEO | | sbrault740 -
Will Links to one Sub-Domain on a Site hurt a different Sub-Domain on the same site by affecting the Quality of the Root Domain?
Hi, I work for a SaaS company which uses two different subdomains on our site: a public one for our main site (which we want to rank in SERPs), and a secure subdomain which is the portal for our customers to access our services (which we don't want to rank). Recently I realized that by using our product, our customers are creating large amounts of low-quality links to our secure subdomain, and I'm concerned that this might affect our public subdomain by bringing down the overall authority of our root domain. Is this a legitimate concern? Has anyone ever worked through a similar situation? Any help is appreciated!
Technical SEO | | ifbyphone0 -
On-site adjustment opinions
Hi folks, I've got a fairly interesting scenario. I'm trying to rank this page (http://www.staysa.co.za/sa/1-2-0-0-1/East-London/accommodation) better for the term "accommodation east london". The client isn't keen on making many changes, and the site was built horribly with ASP, half CMS, half not. I have made the following changes today: I introduced two paragraphs of text below the H1 tag. I changed "East London Bed and Breakfast", "East London Conference Venues" and "East London Cottage / Chalet" to just "Bed and Breakfast", "Conference Venues" and "Cottage / Chalet", as continual key phrase duplication is, in my experience, a bad move. I made a change to the title tag (a huge mission, as it's not CMS-controlled, so I had to teach myself some basic ASP to do it). The meta data is a nightmare to change, at least not without rewriting part of the CMS. I'm wondering, are there any other on-site factors I'm missing? I'm not a fan of site-wide links, so I don't want to put an exact-match anchor text link from the sidebar/footer to the page, unless someone can motivate why I should. Keen to hear everyone's opinions 🙂
Technical SEO | | ChristopherM0 -
Site being indexed by Google before it has launched
We are currently coming towards the end of migrating one of our retail sites over to Magento. To our horror, we found out today that some pages are already being indexed by Google, and we have started receiving orders through the new site. Do you have any suggestions for what may have caused this? Or, similarly, what the best solution would be to de-index ourselves? We most recently excluded anything with a certain parameter in robots.txt; could implementing this incorrectly have caused the issue? Thanks
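One caveat worth flagging: robots.txt only blocks crawling, so a URL Google has already picked up can remain indexed even after you exclude it. A meta robots noindex tag (with the page left crawlable so Google can actually see the tag) is the usual way to get pages dropped. A minimal sketch that checks a page's HTML for that tag; the sample markup is invented:

```python
# Sketch: detect a <meta name="robots" content="noindex"> tag in a page's
# HTML, e.g. when auditing which pre-launch pages are safe from indexing.
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "meta"
                and (a.get("name") or "").lower() == "robots"
                and "noindex" in (a.get("content") or "").lower()):
            self.noindex = True

def has_noindex(html):
    """True if the markup contains a robots meta tag with noindex."""
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex

print(has_noindex('<head><meta name="robots" content="noindex,follow"></head>'))
```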
Technical SEO | | Sayers0 -
Are lots of links from an external site to non-existant pages on my site harmful?
Google Webmaster Tools is reporting a heck of a lot of 404s which are due to an external site linking incorrectly to my site. The site itself has scraped content from elsewhere and has created hundreds of malformed URLs. Since it is unlikely I will have any joy getting these links removed by the creator of the site, I'd like to know how much damage this could be doing, and whether there is anything I can do to minimise the impact? Thanks!
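If the malformed URLs follow a pattern, one option is to fold them back onto the intended pages with 301s (or serve 410s) rather than leave them 404ing. A minimal sketch of that normalisation; the "scraper damage" patterns below are hypothetical examples:

```python
# Sketch: normalise malformed inbound paths from a 404 report so they can
# be 301-redirected to the real pages. The junk patterns are invented
# examples of the kind of damage a sloppy scraper leaves behind.
import re

def clean_inbound(path):
    """Strip trailing %20s, stray quotes/brackets, and doubled slashes."""
    while path.endswith("%20"):       # trailing encoded spaces
        path = path[:-3]
    path = path.rstrip("'\")]")       # stray punctuation a scraper appended
    path = re.sub(r"/{2,}", "/", path)  # collapse doubled slashes
    return path

bad = ["/blog/my-post.html%20", "/blog//my-post.html", "/blog/my-post.html'"]
print(sorted({clean_inbound(p) for p in bad}))
```

Deduplicating the cleaned paths, as above, shows how many real destination pages the redirect rules actually need to cover.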
Technical SEO | | Nobody15569050351140