My homepage is not getting indexed by Google for some reason
-
My homepage, http://www.truebluelifeinsurance.com, is not indexed by Google, but the rest of my site is.
The homepage is indexed by Bing. I looked in Google Webmaster Tools and there is no indication why.
I believe the issue started when I did a site redesign in August.
Any ideas?
-
Hi,
May I know how you solved your website indexing problem?
I am facing the same problem with my website.
Thank you.
-
I can see it indexed. Looks like the problem is resolved.
-
Any update on this? Are you seeing your site indexed by Google yet? What is GWT saying?
-
If it does not resolve itself (and by the sounds of it, it should have by now), there is always the option of contacting John Mueller at Google. There will be a Hangout Q&A this Friday where you can ask that exact question. You can post your question to him right now at http://www.google.com/moderator/?#15/e=203709&t=203709.7c&f=203709.6c6f70 and he will get to it live.
Sometimes when there are weird issues, John is a great resource, as he has access to the internal tools to check a problem.
He has helped me many times before.
-
There shouldn't be any problem with the robots.txt file blocking the homepage in Google, especially since the homepage is indexed in Bing and both crawlers abide by the same robots.txt.
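If you want to double-check that, here is a minimal sketch (Python, using only the built-in urllib.robotparser) that fetches the live robots.txt and asks whether Googlebot and Bingbot are allowed to crawl the homepage; the URLs are the ones from the question, and this is just an illustrative check, not a definitive diagnosis:

from urllib.robotparser import RobotFileParser

# Sketch only: parse the live robots.txt and ask whether each crawler
# is allowed to fetch the homepage. If the file is not the culprit,
# both should print True.
rp = RobotFileParser("http://www.truebluelifeinsurance.com/robots.txt")
rp.read()

for bot in ("Googlebot", "Bingbot"):
    print(bot, rp.can_fetch(bot, "http://www.truebluelifeinsurance.com/"))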
-
Could it be this content in my robots.txt file?
User-agent: *
Disallow: /_borders/
Disallow: /_derived/
Disallow: /_fpclass/
Disallow: /_overlay/
Disallow: /_private/
Disallow: /_themes/
Disallow: /_vti_bin/
Disallow: /_vti_cnf/
Disallow: /_vti_log/
Disallow: /_vti_map/
Disallow: /_vti_pvt/
Disallow: /_vti_txt/
Disallow: /_private/
Disallow: /_ScriptLibrary/
Disallow: /cgi-binS/
Disallow: /add-site.php
-
I did not change domains with the redesign. The sitemap is a mistake; it is from another site I own.
-
Interesting. That is a mistake. I have deleted the sitemaps on the site.
-
I have already done this today in Webmaster Tools. It should definitely have been indexed by now, as it has not been indexed for months.
-
Did you change domains when you did a site redesign? I noticed your sitemap has URLs from a completely different domain name so that may have something to do with it - http://www.truebluelifeinsurance.com/sitemap.xml. I'd make sure to change the sitemap and then set your preferred domain name in Google Webmaster Tools.
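If it helps, here is a rough sketch (Python, standard library only) of how you could list every URL in that sitemap that points at a different domain; it assumes a standard <urlset> sitemap at the address above and is only meant as a quick sanity check:

import xml.etree.ElementTree as ET
from urllib.parse import urlparse
from urllib.request import urlopen

# Sketch only: flag sitemap entries whose host doesn't match the domain
# the sitemap lives on.
SITEMAP_URL = "http://www.truebluelifeinsurance.com/sitemap.xml"
expected_host = urlparse(SITEMAP_URL).netloc

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse(urlopen(SITEMAP_URL))

for loc in tree.findall(".//sm:loc", ns):
    url = loc.text.strip()
    if urlparse(url).netloc != expected_host:
        print("Foreign URL in sitemap:", url)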
-
Have you given it enough time to get crawled? You can also submit it in Google Webmaster Tools to get it crawled.
https://support.google.com/webmasters/answer/1352276?hl=en
A little odd, I must admit, but I'm sure someone around here will get to the bottom of it.
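While you wait on the crawl, one more thing that might be worth ruling out is an accidental blocker introduced by the redesign. Here is a small sketch (Python, standard library) that fetches the homepage and prints the response code, any X-Robots-Tag header, and whether a noindex directive appears anywhere in the HTML; treat it as a rough check rather than a definitive test:

import urllib.request

# Sketch only: fetch the homepage and surface the usual indexing blockers.
URL = "http://www.truebluelifeinsurance.com/"
req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})

with urllib.request.urlopen(req) as resp:
    html = resp.read().decode("utf-8", errors="replace")
    print("HTTP status:", resp.status)
    print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag"))
    print("'noindex' found in HTML:", "noindex" in html.lower())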