Can anyone help me understand why Google is marking a large number of my webpages as "Not Selected" when crawling my site?
-
When looking through my Google Webmaster Tools account, I clicked into the advanced settings under Index Status and was surprised to see that Google has marked around 90% of the pages on my site as "Not Selected" when crawling. Please take a look and offer any suggestions.
-
Thank you. Thank you. Thank you. That makes so much sense. This is also the issue I am having with my community and city pages, which are pointing their canonical tags at my http://luxuryhomehunt.com/homes-for-sale page.
Does that make sense?
-
Thanks for the response. The pop-up runs in JavaScript, and from what I have been told, search engines can still crawl pages as long as the opt-in is handled in JavaScript. Typically a visitor hits one of our landing pages, such as http://luxuryhomehunt.com/homes-for-sale/Longwood/alaqua-lakes.html, where they can find information about the specific community they are searching for; then, if they click on a listing, they are prompted to opt in.
Do you think there may be any correlation with my use (or non-use) of canonical tags? Another thing I was wondering: could it have anything to do with how I handle pages 2, 3, 4, 5, etc. of a city or community that has more than ten listings? For instance, I'm not sure whether page 2 should point its canonical at itself or at page 1 - something like the sketch below.
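A hypothetical sketch of what I mean (the ?page= URLs are purely illustrative of my paginated community pages - I don't know if this is the right annotation, which is why I'm asking):

    <!-- On a hypothetical page 2 of a community: self-referencing canonical plus neighbors -->
    <link rel="canonical" href="http://luxuryhomehunt.com/homes-for-sale/Longwood/alaqua-lakes.html?page=2" />
    <link rel="prev" href="http://luxuryhomehunt.com/homes-for-sale/Longwood/alaqua-lakes.html" />
    <link rel="next" href="http://luxuryhomehunt.com/homes-for-sale/Longwood/alaqua-lakes.html?page=3" />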
I am not sure why your connection would have been refused. I am currently running an XML sitemap generator, and maybe that had something to do with it. Either way, I am super grateful for your help and for you looking at this. I am very new to SEO and am trying to learn as much as possible.
-
Hmm, I just tried to click on a listing from Google, but I was served a popup that required me to enter my contact information before I could access the page http://luxuryhomehunt.com/view-property/40096215. Did you just add this popup? Since there is no way for users to opt out of entering contact information to view a listing, it's possible that the search engines are being blocked as well.
I also tried crawling the site with Screaming Frog SEO Spider and Xenu, but my connection was refused... I'm not sure if my IP was blocked or if the site is blocking crawlers, but my guess is that the search engines may be having some trouble accessing all of the pages on your site.
At the very least, I'd recommend removing that popup, since it's bad for user experience and may be causing problems with the search engines.
EDIT - I did some more digging and looked at the Google cache for one of your listings - http://webcache.googleusercontent.com/search?q=cache:L6LzTqj9gQUJ:luxuryhomehunt.com/view-property/40445850+&cd=6&hl=en&ct=clnk&gl=us&client=firefox-a. On that page, you have the rel="canonical" tag set to http://luxuryhomehunt.com/view-property, which tells the search engines that all of your property listing pages should use that single canonical URL. That explains why most of your pages are "Not Selected" per Google - see the sketch after these links:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2642366
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139066
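To illustrate (a sketch based on what the cached source suggests; the exact markup on your pages may differ):

    <!-- What the cached listing page appears to contain now: every listing claims the same canonical -->
    <link rel="canonical" href="http://luxuryhomehunt.com/view-property" />

    <!-- What each listing page should carry instead: a self-referencing canonical -->
    <link rel="canonical" href="http://luxuryhomehunt.com/view-property/40445850" />

With the current tag, Google treats every listing as a duplicate of one URL and keeps only that one, dropping the rest as "Not Selected".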
Related Questions
-
How can I get Google to consider my News pages
How can I get Google to consider my News pages as news and place them in the Google News section? Is there some syntax I need to include on all my news pages?
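A hedged note for anyone landing here: there is no single tag that places a site in Google News - the site first has to be accepted into Google News. Once accepted, a Google News sitemap helps, and the news_keywords meta tag (current at the time of this thread) lets you suggest topics for each article. A minimal sketch, with placeholder values:

    <!-- In the head of each news article page; the keyword values are hypothetical -->
    <meta name="news_keywords" content="example topic, another keyword" />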
Technical SEO | AlexisWithers
-
Massive Nonsensical 301s on a Large Ecommerce Site
We are in the process of launching a large ecommerce site, which is a rebuild. The old URL structure does not make it possible, in our eyes, to logically map every URL to its corresponding new page. We have done our best to manually redirect all pages that were receiving any organic traffic, and we have also covered all pages that had external links. Our question: we will end up with potentially tens of thousands of 404 errors that will never fix themselves, and the manual work will need to stop at some point. Would it be better to leave these 404s the way they are and just let them fall out of the index, or should we 301 everything we cannot map appropriately to a page like the products root or the home page? I'm also open to hearing any suggestions about how others have solved massive nonsensical 301s. Thanks in advance,
Technical SEO | Bevelwise
-
"Not Selected" in index status rising continously
Hello, After the penguin update my site slowly suffered loss in traffic. and now from daily 15K-18K its droped to 8K. (6K in weekends) I have been trying to find out what the reasons are but i havent got any good luck yet been few months now. I noticed this change in the GWT tho : Not selected in index status significantly risen up. please see attached image. My site is Designzzz i am continously fixing errors and problems shown in the seomoz pro tools. If you guys can take few mins to evaluate what could be the reason for such drop i will be thankful :} cheers 6Xtkp.jpg
Technical SEO | | wickedsunny10 -
Hi, can anyone let me know which is the better server?
Hi, I am trying to find out which is the better dedicated server and would like your opinion. The first one is a Dell PowerEdge:
Intel Xeon E3-1220L, 2.2 GHz dual-core
4 GB DDR3 RAM
2 x 500 GB SATA HDD
Linux/Windows
10,000 GB monthly transfer
Up to 2 IP addresses
LSI RAID card
And the second one is:
Intel Atom 330, 1 MB L2 cache, 1.6 GHz
500 GB storage
4 GB RAM
10 TB bandwidth
If you can, please let me know the difference and which one is better for speed and memory for a large site. Many thanks
Technical SEO | ClaireH-184886
-
Google Not Liking Magento Sites?
Hello, I'm new to the community, and I wonder if anyone can help us shed some light on this SEO issue we are having. We have three Magento websites that are affected. What's happening is that those sites ranked for a specific keyword for a few months, but then all of a sudden the rankings dropped like crazy - from the top 10 to about 150 in about a week's time. Some sites stopped ranking entirely and are no longer visible in the search engine. Is Google not liking Magento for some reason? Any help or suggestions will be appreciated! Thanks
Technical SEO | solution.advisor
-
Nofollow links appear to be still included in SEOmoz crawl and Google
I have added the nofollow attribute to links throughout my site to hide duplicate content from Google, but these pages are still being shown in my SEOmoz crawl. I also fetched an example page as Googlebot within Webmaster Tools, and it showed all the nofollow links. An example is http://www.adventurepeaks.com/news - all news tags have nofollow, but each tag page is appearing in my SEOmoz crawl report as duplicate content. Any suggestions on whether this is a problem or whether I have applied the attribute incorrectly? Many thanks in advance
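For reference, rel="nofollow" on a link only withholds endorsement of that specific link; it does not keep the target page out of Google's index, which is normally done with a meta robots tag on the target page itself. A minimal sketch (the tag-page URL here is hypothetical):

    <!-- On the linking page: nofollow withholds link endorsement, nothing more -->
    <a href="http://www.adventurepeaks.com/news/tag/everest" rel="nofollow">Everest</a>

    <!-- On the tag page itself: this keeps the page out of the index while still letting its links be followed -->
    <meta name="robots" content="noindex, follow" />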
Technical SEO | adventure34
-
Seeking help correcting a large number of 404 errors that caused a 95% traffic halt
Hi, the following GWT screen tells a bit of the story: site: http://bit.ly/mrgdD0 http://www.diigo.com/item/image/1dbpl/wrbp On about Feb 8 I decided to fix a large number of "duplicate title" warnings being reported in GWT under HTML Suggestions. These were for URLs that differed only in parameter case and that had canonical tags, but they were still reported as duplicates in GWT. My traffic had been steady at about 1,000 clicks/day. At midnight on 2/10, Google traffic completely halted, down to 11 clicks/day. I submitted a reconsideration request and was told there was no manual penalty. Also, the sitemap indexes in GWT showed "pending" around the clock starting then. By about the 18th, the duplicate-titles count dropped to about 600 or so... the next day traffic hopped right back to about 800 clicks/day - for a week - then stopped again a week later, on the 26th, down to 10/day. I then noticed that GWT was reporting 20K page-not-found errors - this has now grown to 35K such errors! I realized that bogus internal links were being generated because I had failed to disable the PHP warning messages... so I disabled PHP warnings and fixed what I thought was the source of the errors. However, the not-found count continues to climb, and I don't know where these bad internal links are coming from, because the GWT report lists the link sources as "unavailable". I went through a similar problem last year, and it took Google four months to digest all the bogus pages and recover. If I have to wait that long again, I will lose much $$. Assuming that the large number of internal 404 errors is the reason for the sudden shutoff: a) how can I verify the source of these internal links, given that Google says the source pages are "unavailable"? And most critically, how can I do a reset and have Google re-spider my site, or block the signature of these URLs, in order to get rid of these errors ASAP? Thanks
Technical SEO | mantucket
-
Suggested crawl rate in Google Webmaster Tools?
Hey Moz peeps, I've got a general question: what is the suggested custom crawl rate in Google Webmaster Tools? Or is it better to "Let Google determine my crawl rate (recommended)"? If you guys have any good suggestions on this, and reasons why, that would be very helpful. Thanks, guys!
Technical SEO | david305