Can Google see all the pages that an SEOmoz crawl picks up?
-
Hi there
My client's site is showing around 90 pages indexed in Google. The SEOmoz crawl is returning 1934 pages.
Many of the pages in the crawl are duplicates, but there are also pages which are behind the user login.
Is it theoretically correct to say that if an SEOmoz crawl finds all the pages, then Google could potentially find them too, even if it chooses not to index them?
Or would Google not see the pages behind the login? And how is SEOmoz able to see those pages?
Many thanks in anticipation!
Wendy
-
Well, that could be your easy solution. Make sure they're all set to noindex; that will (mostly) ensure Google won't index them, and they'll probably disappear from your Moz crawl report as well. As far as how Moz is finding them behind your login wall in the first place, sorry, I have no idea.
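For what it's worth, the noindex directive itself is just a robots meta tag in the head of each page you want kept out - a minimal sketch, assuming the pages are plain HTML or templated pages you can edit:

    <meta name="robots" content="noindex, nofollow">

Most CMSs and SEO plugins can add this per page or per section, and the same directive can also be sent as an X-Robots-Tag HTTP response header for non-HTML files.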
-
The pages behind the login? No, not yet - they are a new client, so I am just auditing at the moment to identify what we need to do.
Many thanks for your replies!
-
This may be an obvious question, but do you have those pages set to noindex?
-
Hi Marisa
SEOmoz is crawling unnecessary pages (it returns pages that Screaming Frog ignores, for example).
BUT my concern is that if Google can also see them, then even if it chooses to ignore them, my client may be getting slammed for duplicate content, or the pages behind the login may suddenly appear in the index.
We'll get noindex/nofollow added and fix the duplicates, but I am really interested in how SEOmoz sees behind the login.
-
Here's the real question: Do you WANT Google to see all these pages, or is SEOmoz crawling unnecessary pages?
-
Great, many thanks Nakul - they are a new client, so I am waiting on access to WMT (Webmaster Tools) and will go through it with a fine-tooth comb! It just seems really weird with regard to the pages behind the login...
-
Wendy, if SEOmoz can see it, I am sure Google can see it as well. I would log in to your Webmaster Tools console and check the index status. Do you have an XML sitemap submitted for your website? Once you do, you'll have a more accurate read on the number of pages you submitted and how many of them are indexed. The new Index Status report Google introduced last month also lets you see pages Google ignored for various reasons.
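If you don't have an XML sitemap yet, a minimal one looks like this (the URL and date below are placeholders, not your client's actual pages):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/some-page/</loc>
        <lastmod>2012-09-01</lastmod>
      </url>
    </urlset>

On the login question: crawlers don't log in, so if SEOmoz (or Google) can read those pages, the server is most likely returning the full content to anonymous requests - for example, if the login is enforced only with JavaScript, or if the member URLs are publicly linked and return a 200 rather than redirecting to the login page. It's worth spot-checking one of those URLs in a logged-out browser.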
I hope this helps.