Why is my site not being indexed by Google?
-
In Google Webmaster Tools I updated my sitemap on March 6th. It contains around 22,000 links, but for a long time Google fetched only 5,300 of them.
I waited one month with no improvement in the Google index, so on April 6th we uploaded a new sitemap (1,200 links in total), but only 4 links have been indexed by Google.
Why is Google not indexing my URLs? Does this affect our ranking in the SERPs? How many links is it advisable to submit in a sitemap for a website?
-
Our site content is unique. Today I checked GWT and 750 links are now indexed out of 1,300. I can't understand Google's crawl timing and behavior. We waited one month and it crawled only 5,300 of 22,000 links, and we received 469 "404" errors and 57 "Not found" errors, with the errors increasing day by day. After that we redesigned the site to be more user friendly, changed a few internal links, created a new sitemap, and resubmitted it in GWT last Saturday. The next day it crawled only 4 links; after 5 days (today) it has crawled 750.
The error links were changed in the new sitemap, so when will Google crawl the links completely and clear all the errors?
-
Hate to say it, but it's not uncommon for pages to not be indexed because they contain thin content.
Take a look at the pages that have been indexed; I'm willing to bet that they are a little more fleshed out and rich than those that have not been.
If a given page has only a sentence or two of weak or repetitive content, then it's likely that Google saw it in your sitemap and simply did not think it worthy of indexation.
Also, make sure to test your sitemap and check for any errors that might have cropped up; that's happened to me quite a few times.
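One quick way to sanity-check a sitemap yourself before resubmitting is to parse it and look at the URL list it actually contains; a minimal sketch in Python (the sample sitemap below is made up for illustration):

```python
import xml.etree.ElementTree as ET

# Sitemap files use this XML namespace, so element lookups must include it.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Return the list of <loc> URLs declared in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/page-1</loc></url>
  <url><loc>http://example.com/page-2</loc></url>
</urlset>"""

urls = parse_sitemap(sitemap)
print(len(urls))  # how many URLs you are actually submitting
```

Once you have that list, you can request each URL (e.g. with urllib) and flag anything that returns a 404; those are exactly the errors GWT will report back to you later, so catching them before submission saves a crawl cycle.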
-
1,200 is a small fraction of the 22,000, of which 5,300 were indexed. It's possible that your new 1,200-link sitemap links to almost everything except the pages that are already indexed. As Chris has said, a sitemap is a suggestion; on big sites Google tends to do what it wants to get the data it needs.
-
Rajesh,
It's not the size of your sitemap, because the maximum number of URLs in a sitemap is 50,000. Keep in mind, however, that your sitemap is really just a suggestion tool. Just because your sitemap contains a URL doesn't mean Google will crawl it. The site's architecture and its backlinks impact its crawl priority.
Read through these posts for more info:
http://www.seomoz.org/blog/testing-how-crawl-priority-works
http://www.seomoz.org/blog/diagrams-for-solving-crawl-priority-indexation-issues
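For reference, if you ever do approach that 50,000-URL limit (or just want to split a large site into logical chunks), the sitemap protocol lets you tie several sitemap files together with a sitemap index; a schematic example, with made-up file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
    <lastmod>2012-04-06</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```

Splitting by section like this also makes GWT's indexed-count reporting more useful, since you can see which part of the site Google is ignoring.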
Related Questions
-
My site has been penalized in Google search results without any spam score.
I recently made a site, Gizmocombot.com. The site has no spam record and no lousy backlinks, and it has all unique articles. Can anyone tell us how we can get our site unpenalized in Google Webmaster Tools and Google search results? I attached a screenshot as well, if you need it.
-
Why are Google-indexed pages decreasing?
Hi, my website had around 400 pages indexed, but since February I have noticed a huge decrease in the indexed number, and it is continually decreasing. Can anyone help me find the reason, and where I can get a solution? Will it affect my page rankings?
-
Does Google differentiate between a site with spammy link-building practices and a victim of a negative SEO attack?
I've been tasked with figuring out how to recover our rankings, as we are likely being hurt by an algorithmic penalty. I have no idea whether this was the work of a previously hired SEO or the result of negative SEO. How does Google differentiate between a site with bad/spammy link-building practices and a victim of a negative SEO attack?
-
Dev Site Was Indexed By Google
Two of our dev sites (subdomains) were indexed by Google. They have since been made private, now that we've found the problem. Should we take another step to remove the subdomains through robots.txt, or just let it ride out? From what I understand, to remove a subdomain from Google we would verify the subdomain in GWT, then give the subdomain its own robots.txt and disallow everything. Any advice is welcome; I just wanted to discuss this before making a decision.
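For what it's worth, the blanket robots.txt described above is short; a sketch of what would be served at the root of each dev subdomain (never on the production domain):

```
User-agent: *
Disallow: /
```

One caveat: robots.txt only blocks future crawling; it doesn't by itself remove URLs that are already indexed, which is why verifying the subdomain in GWT and requesting removal there is usually part of the process.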
-
Google site: operator showing only 30 results for whatever website you may like, omitting the rest
site:wikipedia.org, site:seomoz.org, site:nytimes.com, site:WHATEVER YOU PUT HERE 🙂 is currently always showing just 3 SERP pages and the well-known ugly message: "In order to show you the most relevant results, we have omitted some entries very similar to the 30 already displayed. If you like, you can repeat the search with the omitted results included." Any idea what's going on? An ongoing update?
-
Do Commission Junction links affect your site with Google and Penguin?
We received the dreaded letter from Google in reference to "unnatural or artificial links." Our site has affiliate programs through Commission Junction and LinkShare, and between the two programs we have over 8,000 affiliates or advertisers. Our site has been very successful, but our organic search traffic is down as our rankings in the search engines have dropped. My question is: do the affiliate links have an effect on our site with Panda or Penguin?
-
Ensuring Assets (PDFs, PowerPoint Files, Word Docs, etc.) are Indexable on Site
Hi there - I'm working on an educational site where users will be able to search our repository of PDF articles, PowerPoint files, and so on through an on-site search engine. What is the best way to ensure each of these documents/assets is indexable by Google, since they technically don't reside on an HTML page and are just pulled up if the user searches for them? The site itself is just a few pages, but the files, articles, and videos in the repository number in the hundreds. Should I just name and tag them properly and make sure they're all included in an XML sitemap? Anything else suggested? Thanks very much!
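Listing the assets in an XML sitemap works because a sitemap <loc> can point at any crawlable URL, not just an HTML page; a sketch with hypothetical file paths:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/library/intro-article.pdf</loc></url>
  <url><loc>http://example.com/library/training-deck.ppt</loc></url>
</urlset>
```

Google can index PDF and Office formats directly, but since a sitemap is only a hint, plain crawlable HTML links to the files (e.g. a browsable library page) give the crawler a stronger path to them.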
-
Google refuses to index our domain. Any suggestions?
A very similar question was asked previously (http://www.seomoz.org/q/why-google-did-not-index-our-domain). We've done everything in that post (and its comments) and then some. The domain is http://www.miwaterstewardship.org/ and, so far, we have:
- put "User-agent: * Allow: /" in the robots.txt (we recently removed the "Allow" line and included a Sitemap: directive instead)
- built a few hundred links from various pages, including multiple links from .gov domains
- properly set up everything in Webmaster Tools
- submitted sitemaps (multiple times)
- checked the "Fetch as Googlebot" display in Webmaster Tools (everything looks fine)
- submitted a reconsideration request to Google asking why we're not being indexed
Webmaster Tools tells us that it's crawling the site normally and indexing everything correctly. Yahoo! and Bing have both indexed the site with no problems and are returning results. Additionally, many of the pages on the site have PR0, which is unusual for a non-indexed site; typically such sites have no PR at all. If anyone has any ideas about what we could do, I'm all ears. We've been working on this for about a month and cannot figure this thing out. Thanks in advance for your advice.
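For completeness, the allow-everything-plus-Sitemap setup described above would look something like this (the sitemap filename here is an assumption):

```
User-agent: *
Disallow:

Sitemap: http://www.miwaterstewardship.org/sitemap.xml
```

An empty Disallow: line permits all crawling and is more universally supported than "Allow: /", which not every crawler historically understood.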